Cross Entropy Loss

Interactive visualization of cross entropy loss for classification tasks

Binary Classification

Binary Cross Entropy Loss:

$$ L = -[y \log(p) + (1-y)\log(1-p)] $$

where $y$ is the true label (0 or 1) and $p$ is the predicted probability that the example belongs to class 1.

[Interactive demo, controls not reproduced here: select the true label and adjust the predicted probabilities. Initial state: true label Class 1 (Positive), prediction Class 0: 50%, Class 1: 50%.]
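As a concrete check of the formula, here is a minimal Python sketch; the function name and the `eps` clipping are illustrative choices, not part of the original demo.

```python
import math

def binary_cross_entropy(y: int, p: float, eps: float = 1e-12) -> float:
    """Binary cross entropy for a single example.

    y: true label (0 or 1); p: predicted probability of class 1.
    eps clips p away from 0 and 1 to avoid log(0).
    """
    p = min(max(p, eps), 1.0 - eps)
    return -(y * math.log(p) + (1 - y) * math.log(1 - p))

# The demo's initial state: true label Class 1 with a 50/50
# prediction gives L = -log(0.5), about 0.693.
print(binary_cross_entropy(1, 0.5))
```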

[Loss Curve: the loss $L$ plotted as a function of the predicted probability $p$, for each true label.]
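A sketch of how such a loss curve can be reproduced offline; the use of NumPy and Matplotlib is an assumption, since the original page renders the plot interactively.

```python
import numpy as np
import matplotlib.pyplot as plt

# Predicted probability of class 1, kept away from 0 and 1
# so the logarithms stay finite.
p = np.linspace(0.001, 0.999, 500)

# The loss as a function of p for each possible true label:
# y = 1 penalizes small p, y = 0 penalizes large p.
plt.plot(p, -np.log(p), label="y = 1: L = -log(p)")
plt.plot(p, -np.log(1 - p), label="y = 0: L = -log(1 - p)")
plt.xlabel("predicted probability p")
plt.ylabel("loss L")
plt.title("Binary cross entropy loss curve")
plt.legend()
plt.show()
```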