Gaussian error linear unit (GELU) activation function.
This is computed element-wise. Currently, jax-js does not support the erf() or
gelu() functions exactly as primitives, so an approximation is used:
gelu(x) ~= x * 0.5 * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3))).
Reference: https://ml-explore.github.io/mlx/build/html/python/nn/_autosummary_functions/mlx.nn.gelu_approx.html
This will be improved in the future.
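For illustration, here is a minimal standalone TypeScript sketch of the tanh-based approximation above, applied element-wise to a plain array. It does not use the jax-js API, and the function name geluApprox is only an illustrative choice.

```ts
// Tanh-based GELU approximation, applied element-wise to a plain array:
// gelu(x) ~= x * 0.5 * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3)))
function geluApprox(xs: number[]): number[] {
  const c = Math.sqrt(2 / Math.PI);
  return xs.map((x) => 0.5 * x * (1 + Math.tanh(c * (x + 0.044715 * x ** 3))));
}

// Example: GELU is near 0 for large negative inputs and near x for large positive inputs.
console.log(geluApprox([-3, -1, 0, 1, 3]));
// ≈ [-0.0036, -0.1588, 0, 0.8412, 2.9964]
```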