If we have both the model's classification outputs and the correct answers, we can compute Binary Cross Entropy (BCE), a widely used loss function.
Binary Cross Entropy is most often used in binary classification tasks, but it can also be used in multi-label classification.
The formula is:

$$\text{BCE} = -\frac{1}{n}\sum_{i=1}^{n}\Big[y_i \log(p_i) + (1 - y_i)\log(1 - p_i)\Big]$$

where $n$ is the number of samples, $y_i$ is the correct label, and $p_i$ is the predicted probability.
The following assumes that we have a set of raw model outputs (logits) and correct labels for such a multi-label classification task:
```python
output = [-1, -2, -3, 1, 2, 3]
target = [0, 1, 0, 0, 0, 1]
```
And we need to use the sigmoid function to map the raw outputs to probabilities:
```python
import math

def sigmoid(x):
    return 1 / (1 + math.exp(-x))

output = [sigmoid(x) for x in output]
print(output)
```
Output:
```
[0.2689414213699951, 0.11920292202211755, 0.04742587317756678, 0.7310585786300049, 0.8807970779778823, 0.9525741268224334]
```
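The sigmoid step matters because BCE takes `log(p)` and `log(1 - p)`, so every prediction must lie strictly inside (0, 1). A small sketch checking this, plus the symmetry `sigmoid(-x) = 1 - sigmoid(x)` that is visible in the printed list (the first and fourth values sum to 1):

```python
import math

def sigmoid(x):
    return 1 / (1 + math.exp(-x))

logits = [-1, -2, -3, 1, 2, 3]
probs = [sigmoid(x) for x in logits]

# Sigmoid squashes any real logit into the open interval (0, 1),
# which keeps log(p) and log(1 - p) well-defined.
assert all(0 < p < 1 for p in probs)

# Symmetry: sigmoid(-x) == 1 - sigmoid(x).
print(probs[0] + probs[3])  # sigmoid(-1) + sigmoid(1), approximately 1.0
```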
Then we implement the function according to the BCE formula above:
```python
def BCE(output, target):
    n = len(output)
    total_value = 0
    for i in range(n):
        total_value += (target[i] * math.log(output[i])
                        + (1 - target[i]) * math.log(1 - output[i]))
    total_value *= -1 / n
    print(total_value)
```
Then we use this function to calculate the loss:
```python
BCE(output, target)
```
Output:
```
0.9962590167116456
```
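One caveat of this hand-written version: `math.log` is undefined at 0, so a prediction of exactly 0.0 or 1.0 (which a saturated sigmoid can produce after floating-point rounding) raises a `ValueError`. A minimal sketch of a guarded variant, where the clamp value `eps` is an arbitrary choice, not part of the original code:

```python
import math

def bce_safe(output, target, eps=1e-7):
    """BCE with predictions clamped away from 0 and 1."""
    total = 0.0
    for p, t in zip(output, target):
        # Keep p strictly inside (0, 1) so math.log stays defined.
        p = min(max(p, eps), 1 - eps)
        total += t * math.log(p) + (1 - t) * math.log(1 - p)
    return -total / len(output)

# A perfect prediction now yields a tiny loss instead of crashing:
print(bce_safe([1.0, 0.0], [1, 0]))
```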
Check our function
Finally, we use PyTorch's built-in nn.BCELoss() to verify our answer.
```python
import torch
import torch.nn as nn

sigmoid = nn.Sigmoid()
BCELoss = nn.BCELoss()

output = torch.tensor([-1., -2., -3., 1., 2., 3.])
output = sigmoid(output)  # nn.BCELoss expects probabilities, not raw logits
target = torch.tensor([0., 1., 0., 0., 0., 1.])
print(BCELoss(output, target))
```
Output:
```
tensor(0.9963)
```
As you can see, PyTorch's BCELoss produces the same value as our implementation; the printed tensor is simply rounded to four decimal places.
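PyTorch also provides nn.BCEWithLogitsLoss, which fuses the sigmoid and the BCE computation into one numerically stable operation, so you can pass the raw logits directly. A minimal sketch reusing the numbers from this example:

```python
import torch
import torch.nn as nn

# Raw logits and targets from the example above; no manual sigmoid needed.
logits = torch.tensor([-1., -2., -3., 1., 2., 3.])
target = torch.tensor([0., 1., 0., 0., 0., 1.])

loss = nn.BCEWithLogitsLoss()(logits, target)
print(loss)  # matches the nn.BCELoss result above
```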