Sigmoid-weighted Linear Unit (SiLU) activation function, also known as Swish, computed element-wise:
silu(x) = x * sigmoid(x) = x / (1 + exp(-x))
swish() and silu() are both aliases for the same function.
Reference: https://en.wikipedia.org/wiki/Swish_function
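As a minimal sketch, the formula above can be implemented directly with NumPy (the function names here mirror the aliases in this documentation but are defined locally for illustration):

```python
import numpy as np

def silu(x):
    """Sigmoid-weighted Linear Unit: x * sigmoid(x) = x / (1 + exp(-x)), element-wise."""
    return x / (1.0 + np.exp(-x))

# swish is an alias for the same function
swish = silu

x = np.array([-2.0, 0.0, 2.0])
print(silu(x))   # silu(0) == 0; output approaches x for large positive inputs
```

Note that silu(0) = 0 exactly, and the function smoothly interpolates between 0 (for large negative inputs) and the identity (for large positive inputs).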