Activation functions
Customized activation functions to support various models in 🤗 Diffusers.
mindone.diffusers.models.activations.GELU
Bases: Cell
GELU activation function with tanh approximation support with `approximate="tanh"`.
| PARAMETER | DESCRIPTION |
|---|---|
| `dim_in` | The number of channels in the input. **TYPE:** `int` |
| `dim_out` | The number of channels in the output. **TYPE:** `int` |
| `approximate` | If `"tanh"`, use tanh approximation. **TYPE:** `str` **DEFAULT:** `'none'` |
| `bias` | Whether to use a bias in the linear layer. **TYPE:** `bool` **DEFAULT:** `True` |
Source code in mindone/diffusers/models/activations.py
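A minimal usage sketch; the channel sizes and input shape below are illustrative, not prescribed by the API:

```python
import mindspore as ms
from mindone.diffusers.models.activations import GELU

# Project 320 input channels to 1280, then apply tanh-approximated GELU.
act = GELU(dim_in=320, dim_out=1280, approximate="tanh")
x = ms.ops.randn(2, 77, 320)  # (batch, sequence, dim_in)
y = act(x)
print(y.shape)  # (2, 77, 1280)
```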
mindone.diffusers.models.activations.GEGLU
Bases: Cell
A variant of the gated linear unit activation function.
| PARAMETER | DESCRIPTION |
|---|---|
| `dim_in` | The number of channels in the input. **TYPE:** `int` |
| `dim_out` | The number of channels in the output. **TYPE:** `int` |
| `bias` | Whether to use a bias in the linear layer. **TYPE:** `bool` **DEFAULT:** `True` |
Source code in mindone/diffusers/models/activations.py
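A minimal sketch with illustrative sizes. GEGLU's internal linear layer produces `2 * dim_out` channels and splits them, with one half gating the other through GELU, so the output still has `dim_out` channels:

```python
import mindspore as ms
from mindone.diffusers.models.activations import GEGLU

# Gated linear unit: proj(x) is split in two; hidden * gelu(gate).
act = GEGLU(dim_in=320, dim_out=1280)
x = ms.ops.randn(2, 77, 320)  # (batch, sequence, dim_in)
y = act(x)
print(y.shape)  # (2, 77, 1280)
```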
mindone.diffusers.models.activations.ApproximateGELU
Bases: Cell
The approximate form of the Gaussian Error Linear Unit (GELU). For more details, see section 2 of this paper: https://arxiv.org/abs/1606.08415.
| PARAMETER | DESCRIPTION |
|---|---|
| `dim_in` | The number of channels in the input. **TYPE:** `int` |
| `dim_out` | The number of channels in the output. **TYPE:** `int` |
| `bias` | Whether to use a bias in the linear layer. **TYPE:** `bool` **DEFAULT:** `True` |
Source code in mindone/diffusers/models/activations.py
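A minimal sketch with illustrative sizes. The approximation from the paper computes `x * sigmoid(1.702 * x)` after the linear projection:

```python
import mindspore as ms
from mindone.diffusers.models.activations import ApproximateGELU

# Sigmoid approximation of GELU: x * sigmoid(1.702 * x) after projection.
act = ApproximateGELU(dim_in=320, dim_out=1280)
x = ms.ops.randn(2, 77, 320)  # (batch, sequence, dim_in)
y = act(x)
print(y.shape)  # (2, 77, 1280)
```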
mindone.diffusers.models.activations.SwiGLU
Bases: Cell
A variant of the gated linear unit activation function. It's similar to `GEGLU` but uses SiLU / Swish instead of GeLU.
| PARAMETER | DESCRIPTION |
|---|---|
| `dim_in` | The number of channels in the input. **TYPE:** `int` |
| `dim_out` | The number of channels in the output. **TYPE:** `int` |
| `bias` | Whether to use a bias in the linear layer. **TYPE:** `bool` **DEFAULT:** `True` |
Source code in mindone/diffusers/models/activations.py
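A minimal sketch with illustrative sizes; usage is identical to `GEGLU`, only the gate nonlinearity differs:

```python
import mindspore as ms
from mindone.diffusers.models.activations import SwiGLU

# Like GEGLU, but the gate passes through SiLU / Swish instead of GELU.
act = SwiGLU(dim_in=320, dim_out=1280)
x = ms.ops.randn(2, 77, 320)  # (batch, sequence, dim_in)
y = act(x)
print(y.shape)  # (2, 77, 1280)
```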
mindone.diffusers.models.activations.FP32SiLU
Bases: Cell
SiLU activation function with input upcast to `mindspore.float32`.
Source code in mindone/diffusers/models/activations.py
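A minimal sketch, assuming the layer mirrors the upstream diffusers `FP32SiLU` and casts the result back to the input dtype after computing SiLU in float32:

```python
import mindspore as ms
from mindone.diffusers.models.activations import FP32SiLU

# SiLU computed in float32 for numerical stability on low-precision inputs;
# the output dtype matching the input is an assumption based on upstream diffusers.
act = FP32SiLU()
x = ms.ops.randn(2, 8).astype(ms.float16)
y = act(x)
print(y.dtype)  # Float16
```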
mindone.diffusers.models.activations.LinearActivation
Bases: Cell
Source code in mindone/diffusers/models/activations.py
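No parameters are documented for this class. Assuming it mirrors the upstream diffusers `LinearActivation` (a linear projection followed by an activation selected by name, with `dim_in`, `dim_out`, `bias`, and `activation` arguments), a hypothetical usage sketch:

```python
import mindspore as ms
from mindone.diffusers.models.activations import LinearActivation

# Assumed signature, mirroring upstream diffusers: a projection followed by
# an activation chosen by name ("linear" by default in upstream diffusers).
act = LinearActivation(dim_in=320, dim_out=1280, activation="silu")
x = ms.ops.randn(2, 77, 320)  # (batch, sequence, dim_in)
y = act(x)
print(y.shape)  # (2, 77, 1280)
```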