[Machine Learning] Introduction to the Softmax Function

Last Updated on 2021-06-03 by Clay

Softmax

The Softmax function maps the elements of a vector into the interval (0, 1), so the output can be interpreted as a probability distribution over the elements (classification classes) of the vector.

Of course, since it is a probability distribution, the elements of the output vector sum to 1.

Let's take a look at the Softmax formula:
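$$\text{Softmax}(x_i) = \frac{e^{x_i}}{\sum_{j=1}^{n} e^{x_j}}$$

where $x_1, \ldots, x_n$ are the elements of the input vector.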

If the formula is hard to read at a glance, you can look directly at the code:

# -*- coding: utf-8 -*-
import numpy as np

inputs = np.array([1, 4, 9, 7, 5])

def softmax(inputs):
    # Exponentiate each element, then normalize by the sum so the
    # outputs lie in (0, 1) and add up to 1
    return np.exp(inputs) / np.sum(np.exp(inputs))

outputs = softmax(inputs)
for n in range(len(outputs)):
    print('{} -> {}'.format(inputs[n], outputs[n]))



Output:

1 -> 0.00028901145493871657
4 -> 0.005804950249395781
9 -> 0.8615310049461178
7 -> 0.11659554257150641
5 -> 0.015779490778041354

And the sum of the outputs is 1:

print(sum(outputs))



Output:

1.0
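One practical note: for large inputs, np.exp can overflow. A common remedy (not shown in the code above) is to subtract the maximum value before exponentiating; the result is mathematically identical, since the shared constant cancels out in the normalization. A minimal sketch:

import numpy as np

def stable_softmax(inputs):
    # Shifting by the max keeps np.exp from overflowing;
    # the subtracted constant cancels out in the normalization
    shifted = inputs - np.max(inputs)
    return np.exp(shifted) / np.sum(np.exp(shifted))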

Application

  • You can call this function directly from frameworks such as Keras and PyTorch (see the sketch below)
  • The Softmax function is often used as the final layer in multi-class classification
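
As a minimal sketch of the framework route (assuming PyTorch is installed), this is one way to apply Softmax to a tensor:

import torch
import torch.nn.functional as F

x = torch.tensor([1.0, 4.0, 9.0, 7.0, 5.0])

# Apply Softmax along dimension 0 (the only dimension here)
probs = F.softmax(x, dim=0)
print(probs)        # same probabilities as the NumPy example above
print(probs.sum())  # sums to 1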
