Cross Entropy Loss
Interactive visualization of cross entropy loss for classification tasks
Binary Classification
Binary Cross Entropy Loss:
$$ L = -[y \log(p) + (1-y)\log(1-p)] $$
where y is the true label (0 or 1) and p is the predicted probability of the positive class (class 1)
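A minimal sketch of this formula in Python (the function name and the `eps` clamp are illustrative assumptions, not part of any particular library):

```python
import math

def binary_cross_entropy(y, p, eps=1e-12):
    """Binary cross entropy for a single example.

    y: true label (0 or 1)
    p: predicted probability of class 1
    eps clamps p away from 0 and 1 so log() never receives 0.
    """
    p = min(max(p, eps), 1 - eps)
    return -(y * math.log(p) + (1 - y) * math.log(1 - p))

# The loss grows without bound as the prediction moves away
# from the true label:
for p in (0.9, 0.5, 0.1):
    print(f"y=1, p={p}: loss={binary_cross_entropy(1, p):.4f}")
# y=1, p=0.9: loss=0.1054
# y=1, p=0.5: loss=0.6931
# y=1, p=0.1: loss=2.3026
```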
[Interactive demo: choose the true label (Class 0 or Class 1) and drag the predicted probability, which starts at 50%/50%; the loss-curve plot shows how L varies with p.]
Multi-class Classification
Multi-class Cross Entropy Loss:
$$ L = -\sum_{i=1}^{C} y_i \log(p_i) $$
where C is the number of classes, y_i is 1 for the true class and 0 otherwise, and p_i is the predicted probability assigned to class i
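Because y_i is 0 for every class except the true one, the sum collapses to -log(p_true). A minimal Python sketch built on that observation (the function name and example probability vectors are assumptions for illustration):

```python
import math

def cross_entropy(y_true, probs, eps=1e-12):
    """Multi-class cross entropy for a single example.

    y_true: index of the true class
    probs: predicted probability for each class (should sum to 1)
    Only the probability of the true class contributes to the loss.
    """
    p = min(max(probs[y_true], eps), 1 - eps)
    return -math.log(p)

# Three classes; the true class is index 0 (e.g. Cat).
uniform = [1 / 3, 1 / 3, 1 / 3]
confident = [0.90, 0.05, 0.05]
print(f"uniform:   {cross_entropy(0, uniform):.4f}")    # ~1.0986 = log(3)
print(f"confident: {cross_entropy(0, confident):.4f}")  # ~0.1054
```

A uniform prediction over C classes always gives a loss of log(C), which is a useful sanity check for an untrained classifier.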
[Interactive demo: the true class is Cat; adjust the predicted probability for each of the three classes (initially ~33% each) to see how the loss changes.]