Activation Function

Sigmoid

\[ \text{Sigmoid}(x)=\frac{1}{1+\exp(-x)} \]
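The definition above can be sketched directly. A minimal plain-Python version (not any library's implementation); the branch on the sign of \(x\) keeps `exp()` from overflowing for large negative inputs:

```python
import math

def sigmoid(x: float) -> float:
    """Sigmoid(x) = 1 / (1 + exp(-x)).

    Call exp() only on a non-positive argument so it never overflows:
    for x >= 0 use 1 / (1 + exp(-x)); otherwise rewrite as
    exp(x) / (1 + exp(x)), which is the same value.
    """
    if x >= 0:
        return 1.0 / (1.0 + math.exp(-x))
    z = math.exp(x)
    return z / (1.0 + z)
```

For example, `sigmoid(0.0)` is exactly `0.5`, and the function saturates toward 1 and 0 for large positive and negative inputs.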

Softplus

\[ \text{Softplus}(x)=\frac{1}{\beta}\log\big(1+\exp(\beta x)\big) \]
Softplus is a smooth approximation to the ReLU function and can be used to constrain the output of a machine to always be positive. For numerical stability, the implementation reverts to the linear function when \(\beta x>\text{threshold}\).

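The formula and the stability trick can be combined in a small sketch. The `beta` and `threshold` parameters below mirror the description above (the threshold default of 20 is an assumption for illustration, not prescribed by the text):

```python
import math

def softplus(x: float, beta: float = 1.0, threshold: float = 20.0) -> float:
    """Softplus(x) = (1 / beta) * log(1 + exp(beta * x)).

    When beta * x exceeds `threshold`, exp(beta * x) would dominate the 1
    inside the log, so the function is effectively linear there; returning
    x directly avoids overflow in exp().
    """
    bx = beta * x
    if bx > threshold:
        return x
    # log1p(y) computes log(1 + y) accurately for small y.
    return math.log1p(math.exp(bx)) / beta
```

At `x = 0` this gives \(\log 2 \approx 0.693\), and for large \(x\) it returns \(x\) itself.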

ReLU

\[ \text{ReLU}(x)=\max(0,x)=\frac{\lvert x\rvert+x}{2} \]
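Both forms of the definition are easy to check in code. A minimal sketch showing that \(\max(0,x)\) and \((\lvert x\rvert+x)/2\) agree:

```python
def relu(x: float) -> float:
    # ReLU(x) = max(0, x): pass positive inputs through, clamp the rest to 0.
    return max(0.0, x)

def relu_abs(x: float) -> float:
    # Equivalent closed form: (|x| + x) / 2.
    # For x >= 0 this is (x + x) / 2 = x; for x < 0 it is (-x + x) / 2 = 0.
    return (abs(x) + x) / 2.0
```

Both return `0.0` for negative inputs and the input itself otherwise.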