[Machine Learning] Introduction to the Tanh Function

Last Updated on 2021-06-06 by Clay

Tanh

The tanh (hyperbolic tangent) function is defined as:

tanh(x) = sinh(x) / cosh(x) = (e^x - e^(-x)) / (e^x + e^(-x))

Every x value we input is mapped into the interval (-1, 1).
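As a quick sanity check (a minimal sketch using Python's built-in math.tanh, not part of the original post), we can confirm that a few sample inputs all land inside (-1, 1):

```python
import math

# tanh squashes every real input into the open interval (-1, 1)
for v in [-10, -1, 0, 1, 10]:
    print(v, math.tanh(v))
```

Even for large inputs like -10 or 10, the output only approaches -1 or 1 without ever reaching them.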

I wrote a simple script to plot it:

# -*- coding: utf-8 -*-
import matplotlib.pyplot as plt
import math

# Sample x from -10 to 10 in steps of 0.01
value = -10
x = [value + n * 0.01 for n in range(2001)]
y = [math.tanh(v) for v in x]

# Plot, and move the axes so they cross at the origin
plt.plot(x, y)
ax = plt.gca()
ax.spines['right'].set_color('none')
ax.spines['top'].set_color('none')
ax.xaxis.set_ticks_position('bottom')
ax.spines['bottom'].set_position(('data', 0))
ax.yaxis.set_ticks_position('left')
ax.spines['left'].set_position(('data', 0))
plt.show()



Output: a plot of the S-shaped tanh curve between -1 and 1.

If you do not want to use Python's built-in math.tanh, you can compute it from the definition instead:

# -*- coding: utf-8 -*-
import matplotlib.pyplot as plt
import math


def sinh(x):
    # sinh(x) = (e^x - e^(-x)) / 2
    return (math.exp(x) - math.exp(-x)) / 2


def cosh(x):
    # cosh(x) = (e^x + e^(-x)) / 2
    return (math.exp(x) + math.exp(-x)) / 2


def tanh(x):
    return sinh(x) / cosh(x)


# Same sampling and plotting as before, now with the hand-made tanh
value = -10
x = [value + n * 0.01 for n in range(2001)]
y = [tanh(v) for v in x]

plt.plot(x, y)
ax = plt.gca()
ax.spines['right'].set_color('none')
ax.spines['top'].set_color('none')
ax.xaxis.set_ticks_position('bottom')
ax.spines['bottom'].set_position(('data', 0))
ax.yaxis.set_ticks_position('left')
ax.spines['left'].set_position(('data', 0))
plt.show()



Output: the same tanh curve as before.

We can get the same result.
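To double-check this numerically (a small sketch, not from the original post, using the sinh/cosh definition above), we can compare the hand-made tanh against math.tanh over the same range:

```python
import math


def tanh(x):
    # tanh from its definition: sinh(x) / cosh(x)
    sinh = (math.exp(x) - math.exp(-x)) / 2
    cosh = (math.exp(x) + math.exp(-x)) / 2
    return sinh / cosh


# Largest difference from the built-in over [-10, 10]
max_diff = max(abs(tanh(v) - math.tanh(v))
               for v in [-10 + n * 0.01 for n in range(2001)])
print(max_diff)
```

The difference is only floating-point rounding noise, so both versions trace the same curve.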


Application

  • Like the sigmoid function, tanh suffers from the vanishing gradient problem: its derivative approaches zero for large |x|
  • Convergence when training a model is slower than with the ReLU function
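The gradient problem mentioned above can be illustrated with a short sketch (my own example, using the identity tanh'(x) = 1 - tanh(x)^2): the derivative shrinks toward zero as |x| grows, so units that saturate pass almost no gradient back during training.

```python
import math


def tanh_grad(x):
    # Derivative of tanh: 1 - tanh(x)^2
    return 1 - math.tanh(x) ** 2


# The gradient is 1 at the origin and nearly 0 once the input saturates
for v in [0, 2, 5, 10]:
    print(v, tanh_grad(v))
```

By contrast, ReLU keeps a constant gradient of 1 for all positive inputs, which is one reason it often converges faster.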
