πŸ“• Node [[rectified_linear_unit_(relu)]]
πŸ“„ Rectified_Linear_Unit_(Relu).md by @KGBicheno

Rectified Linear Unit (ReLU)

Go back to the [[AI Glossary]]

An activation function with the following rules:

If the input is negative or zero, the output is 0.
If the input is positive, the output equals the input.

Equivalently, ReLU(x) = max(0, x).
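The rules above can be sketched as a small function; the name `relu` and the NumPy variant are illustrative choices, not part of the glossary entry:

```python
import numpy as np

def relu(x: float) -> float:
    # Output 0 for negative or zero input, the input itself otherwise.
    return max(0.0, x)

def relu_array(x: np.ndarray) -> np.ndarray:
    # Vectorized form commonly used in neural networks: max(0, x) elementwise.
    return np.maximum(0.0, x)
```

For example, `relu(-2.0)` returns `0.0` and `relu(3.5)` returns `3.5`.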
