[Machine Learning] Introduction of Tanh function
Last Updated on 2021-06-06 by Clay
Tanh
The Tanh function is defined by the formula: tanh(x) = (e^x - e^(-x)) / (e^x + e^(-x)), which maps any real input to the range (-1, 1).
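As a quick sketch (not code from the original post), a minimal NumPy version of Tanh might look like this:

```python
import numpy as np

def tanh(x):
    # Tanh: (e^x - e^-x) / (e^x + e^-x), output lies in (-1, 1)
    return (np.exp(x) - np.exp(-x)) / (np.exp(x) + np.exp(-x))

print(tanh(np.array([-2.0, 0.0, 2.0])))  # approx [-0.964, 0.0, 0.964]
```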
[Machine Learning] Introduction of Softmax function
Last Updated on 2021-06-03 by Clay
The Softmax function maps a vector to values in the range (0, 1) that sum to 1, so the output represents a probability distribution over the elements (classification classes) of the vector.
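As an illustration (not from the original post), a small NumPy sketch of Softmax could be written as follows; the max-subtraction is only for numerical stability and does not change the result:

```python
import numpy as np

def softmax(x):
    # Subtract the max for numerical stability, then normalize the exponentials
    exps = np.exp(x - np.max(x))
    return exps / np.sum(exps)

scores = np.array([1.0, 2.0, 3.0])
print(softmax(scores))        # approx [0.090, 0.245, 0.665]
print(softmax(scores).sum())  # 1.0
```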
[Machine Learning] Introduction of ReLU
Last Updated on 2021-06-03 by Clay
The Rectified Linear Unit (ReLU) is a well-known activation function used in neural network layers; it is believed to have some degree of biological plausibility, although I don't know the details. =)
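For reference (a minimal sketch, not the original post's code), ReLU simply keeps positive values and zeroes out negative ones:

```python
import numpy as np

def relu(x):
    # ReLU: element-wise max(0, x)
    return np.maximum(0, x)

print(relu(np.array([-3.0, -0.5, 0.0, 2.0])))  # [0. 0. 0. 2.]
```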
[Machine Learning] Sigmoid function introduction
Last Updated on 2021-06-03 by Clay
The Sigmoid() function is a mapping function: it maps any variable (written as x in the following content) to the range [0, 1], and it is often used as an activation function in neural network layers in machine learning.
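As a quick sketch (not from the original post), Sigmoid can be written in NumPy like this:

```python
import numpy as np

def sigmoid(x):
    # Sigmoid: 1 / (1 + e^-x), output squashed between 0 and 1
    return 1.0 / (1.0 + np.exp(-x))

print(sigmoid(np.array([-4.0, 0.0, 4.0])))  # approx [0.018, 0.5, 0.982]
```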