Activation Functions in Deep Learning with Python Code

Sadik Dange
Jan 30, 2021


In a neural network, an activation function transforms the summed, weighted input of a node into that node's output for the given input.
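As a minimal sketch of what that means in code (the weights, bias, and input values below are made up purely for illustration), a single node computes its weighted sum and then passes it through an activation function:

import numpy as np

# Hypothetical weights, bias, and inputs for a single node
weights = np.array([0.4, -0.2, 0.1])
bias = 0.5
inputs = np.array([1.0, 2.0, 3.0])

# Summed weighted input of the node
z = np.dot(weights, inputs) + bias

# An activation function (sigmoid here) turns z into the node's output
output = 1 / (1 + np.exp(-z))
print(z, output)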

Activation functions:

  1. Step
  2. Sigmoid
  3. tanh
  4. ReLU
  5. Leaky ReLU

1. Step Activation Function

step function

From the image, we can clearly see that the step function only outputs 0 or 1 for any given input.

If the input value is less than 0, the output is 0; otherwise, the output is 1.

Python Code:

def step(x):
    # Output 0 for negative inputs, 1 otherwise
    return 0 if x < 0 else 1
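For example, calling the function on a few sample values (chosen just for illustration):

print(step(-2))  # 0
print(step(0))   # 1
print(step(3))   # 1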

2. Sigmoid Activation Function

It is one of the most popular and widely used activation functions for the output layer of a neural network.

sigmoid function

From the image, we can clearly see that the sigmoid function only outputs values between 0 and 1, computed using the formula

sigmoid(x) = 1 / (1 + e^(-x))

Python Code:

import numpy as np

def sigmoid(x):
    # Squash the input into the range (0, 1)
    return 1 / (1 + np.exp(-x))
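Since the implementation uses NumPy, it also works element-wise on arrays; the sample inputs below are chosen just for illustration:

x = np.array([-5.0, 0.0, 5.0])
print(sigmoid(x))  # approximately [0.0067 0.5 0.9933]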

3. tanh Activation Function

tanh function

From the image, we can clearly see that the tanh function outputs values between -1 and 1, computed using the formula

tanh(x) = (e^x - e^(-x)) / (e^x + e^(-x))

Python Code:

import numpy as np

def tanh(x):
    # Squash the input into the range (-1, 1)
    return (np.exp(x) - np.exp(-x)) / (np.exp(x) + np.exp(-x))
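NumPy also ships a built-in np.tanh that computes the same function; a quick check with a few sample values (chosen just for illustration):

x = np.array([-2.0, 0.0, 2.0])
print(tanh(x))     # approximately [-0.964  0.     0.964]
print(np.tanh(x))  # NumPy's built-in gives the same values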

4. ReLU Activation Function

It is one of the most widely used activation functions in the hidden layers of a neural network.

ReLU(x) = max(0, x)

The ReLU activation function simply returns the maximum of 0 and the given input value.

Python Code:

def reLU(x):
    # Return the maximum of 0 and the input value
    return max(0, x)
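Note that max(0, x) only works for scalar inputs; comparing a whole NumPy array this way is ambiguous and raises an error. A vectorized sketch could use np.maximum instead:

import numpy as np

def relu_vectorized(x):
    # Element-wise maximum of 0 and each input value
    return np.maximum(0, x)

print(relu_vectorized(np.array([-3.0, 0.0, 2.5])))  # [0.  0.  2.5]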

5. Leaky ReLU Activation Function

Leaky ReLU(x) = 0.01 * x if x < 0, else x

Leaky ReLU is one attempt to fix the “dying ReLU” problem. Instead of the function being zero when x < 0, a leaky ReLU instead has a small slope for negative inputs (of 0.01, or so).

Python Code:

def leaky(x):
    # Scale negative inputs by a small slope (0.01), pass positive inputs through
    return x * 0.01 if x < 0 else x
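A vectorized variant for NumPy arrays (using the same 0.01 slope) can be sketched with np.where:

import numpy as np

def leaky_vectorized(x, slope=0.01):
    # Keep positive values, scale negative values by the small slope
    return np.where(x < 0, x * slope, x)

print(leaky_vectorized(np.array([-4.0, 0.0, 3.0])))  # [-0.04  0.    3.  ]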

The following graph shows the difference between the most commonly used activation functions in artificial neural networks.
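A rough Matplotlib sketch of how such a comparison can be plotted (the input range and styling are my own choices):

import numpy as np
import matplotlib.pyplot as plt

x = np.linspace(-5, 5, 200)

plt.plot(x, np.where(x < 0, 0, 1), label="step")
plt.plot(x, 1 / (1 + np.exp(-x)), label="sigmoid")
plt.plot(x, np.tanh(x), label="tanh")
plt.plot(x, np.maximum(0, x), label="ReLU")
plt.plot(x, np.where(x < 0, 0.01 * x, x), label="leaky ReLU")
plt.legend()
plt.title("Common activation functions")
plt.show()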
