ComfyUI Node: NNT Define Dense Layer

Authored by inventorado

Defines a dense (fully connected) layer, with configurable activation, weight and bias initialization, normalization, and dropout, and appends it to the incoming layer stack.


Category

NNT Neural Network Toolkit/Layers

Inputs

num_nodes INT
activation_function
  • None
  • ELU
  • GELU
  • GLU
  • Hardshrink
  • Hardsigmoid
  • Hardswish
  • Hardtanh
  • LeakyReLU
  • LogSigmoid
  • MultiheadAttention
  • PReLU
  • ReLU
  • ReLU6
  • RReLU
  • SELU
  • CELU
  • Sigmoid
  • SiLU
  • Softmax
  • Softmax2d
  • Softmin
  • Softplus
  • Softshrink
  • Softsign
  • Tanh
  • Tanhshrink
  • Threshold
use_bias
  • True
  • False
weight_init
  • default
  • normal
  • uniform
  • xavier_normal
  • xavier_uniform
  • kaiming_normal
  • kaiming_uniform
  • orthogonal
  • sparse
  • dirac
  • zeros
  • ones
weight_init_gain FLOAT
weight_init_mode
  • fan_in
  • fan_out
weight_init_nonlinearity
  • relu
  • leaky_relu
  • selu
  • tanh
  • linear
  • sigmoid
bias_init
  • default
  • zeros
  • ones
  • normal
  • uniform
bias_init_value FLOAT
normalization
  • None
  • BatchNorm
  • LayerNorm
  • InstanceNorm
  • GroupNorm
  • LocalResponseNorm
norm_eps FLOAT
norm_momentum FLOAT
norm_affine
  • True
  • False
dropout_rate FLOAT
alpha FLOAT
num_copies INT
LAYER_STACK LIST
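
These parameters map closely onto standard PyTorch building blocks. The following is a minimal sketch, not the node's actual implementation, of how a dense block with these options could be assembled. build_dense_block, its defaults, and the in_features argument (which the node does not expose and presumably infers from the stack) are hypothetical, and only a subset of the listed choices is spelled out:

    import torch.nn as nn

    def build_dense_block(in_features, num_nodes, activation_function="ReLU",
                          use_bias=True, weight_init="default", weight_init_gain=1.0,
                          weight_init_mode="fan_in", weight_init_nonlinearity="relu",
                          bias_init="default", bias_init_value=0.0,
                          normalization="None", norm_eps=1e-5, norm_momentum=0.1,
                          norm_affine=True, dropout_rate=0.0, alpha=1.0):
        """Sketch of a dense block using the options above (not the node's code)."""
        linear = nn.Linear(in_features, num_nodes, bias=use_bias)

        # Weight init: gain feeds the xavier/orthogonal schemes,
        # mode/nonlinearity feed the kaiming schemes.
        if weight_init == "xavier_normal":
            nn.init.xavier_normal_(linear.weight, gain=weight_init_gain)
        elif weight_init == "kaiming_normal":
            nn.init.kaiming_normal_(linear.weight, mode=weight_init_mode,
                                    nonlinearity=weight_init_nonlinearity)
        elif weight_init == "orthogonal":
            nn.init.orthogonal_(linear.weight, gain=weight_init_gain)
        # ... the remaining schemes map onto the other nn.init functions

        if use_bias and bias_init == "zeros":
            nn.init.zeros_(linear.bias)
        elif use_bias and bias_init == "ones":
            nn.init.ones_(linear.bias)
        # bias_init_value presumably parameterizes the normal/uniform choices

        layers = [linear]

        # Normalization before the activation is one common ordering;
        # the node's actual ordering is not documented here.
        if normalization == "BatchNorm":
            layers.append(nn.BatchNorm1d(num_nodes, eps=norm_eps,
                                         momentum=norm_momentum, affine=norm_affine))
        elif normalization == "LayerNorm":
            layers.append(nn.LayerNorm(num_nodes, eps=norm_eps,
                                       elementwise_affine=norm_affine))

        if activation_function == "ELU":
            layers.append(nn.ELU(alpha=alpha))  # alpha plausibly feeds ELU/CELU-style activations
        elif activation_function != "None":
            # Most (not all) of the listed activations are no-argument torch.nn classes.
            layers.append(getattr(nn, activation_function)())

        if dropout_rate > 0:
            layers.append(nn.Dropout(p=dropout_rate))

        return nn.Sequential(*layers)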

Outputs

LIST
INT
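
How the two outputs are consumed is not documented on this page; given the LAYER_STACK input above, the LIST output is plausibly the updated layer stack, with the INT output echoing a count or width. Continuing the hypothetical sketch above, chaining layer nodes and num_copies could look like:

    import torch
    import torch.nn as nn

    # Each "node" consumes the current stack and returns it with layers appended.
    stack = []
    stack.append(build_dense_block(784, 256, activation_function="ReLU",
                                   normalization="BatchNorm", dropout_rate=0.2))
    for _ in range(2):  # num_copies = 2: independent copies, not one shared module
        stack.append(build_dense_block(256, 256))
    stack.append(build_dense_block(256, 10, activation_function="None"))

    model = nn.Sequential(*stack)
    print(model(torch.randn(8, 784)).shape)  # torch.Size([8, 10])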

Extension: ComfyUI Neural Network Toolkit NNT

Neural Network Toolkit (NNT) for ComfyUI is an extensive set of custom nodes for designing, training, and fine-tuning neural networks. It lets you define models, layers, training workflows, transformers, and tensor operations visually, as node graphs.

