Activation Functions:


 Sigmoid Activation Function:

* Widely used because its output lies between 0 and 1, so it can be interpreted as a probability.

tf.math.sigmoid(z)

g(z) = 1/(1+e^-z)
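A minimal sketch, assuming TensorFlow 2 with eager execution, comparing the built-in tf.math.sigmoid to the formula above (the sample values of z are purely illustrative):

import tensorflow as tf

z = tf.constant([-2.0, 0.0, 2.0])           # illustrative pre-activations
builtin = tf.math.sigmoid(z)                # library implementation
manual = 1.0 / (1.0 + tf.math.exp(-z))      # g(z) = 1/(1+e^-z)

print(builtin.numpy())                      # approx. [0.119, 0.5, 0.881]
print(manual.numpy())                       # matches the built-in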



Hyperbolic Tangent:

 

tf.math.tanh(z)

g(z) = (e^z - e^-z)/(e^z+e^-z)
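A similar sketch (same TensorFlow 2 assumption) checking tf.math.tanh against the formula; tanh squashes inputs into the range (-1, 1):

import tensorflow as tf

z = tf.constant([-1.0, 0.0, 1.0])           # illustrative pre-activations
builtin = tf.math.tanh(z)                   # library implementation
manual = (tf.math.exp(z) - tf.math.exp(-z)) / (tf.math.exp(z) + tf.math.exp(-z))

print(builtin.numpy())                      # approx. [-0.762, 0.0, 0.762]
print(manual.numpy())                       # matches the built-in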




ReLU (Rectified Linear Unit):

* Piecewise linear function with a single point of non-linearity (at z = 0).

tf.nn.relu(z)

g(z) = max(0,z)
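And the same kind of check for ReLU, again assuming TensorFlow 2; tf.math.maximum stands in for max(0, z):

import tensorflow as tf

z = tf.constant([-3.0, -0.5, 0.0, 2.0])     # illustrative pre-activations
builtin = tf.nn.relu(z)                     # library implementation
manual = tf.math.maximum(0.0, z)            # g(z) = max(0, z)

print(builtin.numpy())                      # [0. 0. 0. 2.]
print(manual.numpy())                       # matches the built-in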





What's the importance of an activation function?

* Introduces non-linearity: without it, a stack of layers collapses into a single linear transformation (see the sketch below).
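
A minimal sketch of that point, assuming TensorFlow 2 and randomly initialized weights with no biases (purely for illustration): two layers with no activation are exactly one linear map, while a ReLU in between breaks the collapse:

import tensorflow as tf

tf.random.set_seed(0)
x = tf.random.normal([4, 3])                # a small batch of illustrative inputs
W1 = tf.random.normal([3, 5])
W2 = tf.random.normal([5, 2])

# Without an activation, two layers collapse into one: x @ (W1 @ W2).
two_linear_layers = tf.matmul(tf.matmul(x, W1), W2)
single_layer = tf.matmul(x, tf.matmul(W1, W2))
print(tf.reduce_max(tf.abs(two_linear_layers - single_layer)).numpy())  # ~0 up to float error

# Adding a non-linearity between the layers prevents this collapse.
two_nonlinear_layers = tf.matmul(tf.nn.relu(tf.matmul(x, W1)), W2)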
