Normalized Cross Entropy
| Created |  |
| --- | --- |
| Tags | Metrics |
- Always non-negative.
- Equals 0 only if the predictions match the labels perfectly.
- Unbounded; can grow arbitrarily large.
- Intuitive scale: NCE < 1 means the model has learned something; NCE > 1 means the model is worse than always predicting the background CTR (the average label).
- The lower the value, the better the model’s predictions.
- The reason for the normalization is that the closer the background CTR is to 0 or 1, the easier it is to achieve a low raw log loss.
- Dividing the average log loss by the entropy of the background CTR makes NCE insensitive to the background CTR.
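The properties above follow from the definition implied here: average log loss divided by the entropy of the background CTR. A minimal sketch (the function name and `eps` clipping are my own choices, not from the source):

```python
import numpy as np

def normalized_cross_entropy(y_true, y_pred, eps=1e-12):
    """NCE: average log loss of the predictions, divided by the
    entropy of the background CTR (the mean label)."""
    y_true = np.asarray(y_true, dtype=float)
    # Clip probabilities away from 0 and 1 so the logs stay finite.
    y_pred = np.clip(np.asarray(y_pred, dtype=float), eps, 1 - eps)

    # Average log loss of the model's predictions.
    log_loss = -np.mean(
        y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred)
    )

    # Entropy of the background CTR: the log loss obtained by
    # always predicting the mean label.
    p = np.clip(y_true.mean(), eps, 1 - eps)
    background_entropy = -(p * np.log(p) + (1 - p) * np.log(1 - p))

    return log_loss / background_entropy
```

Predicting the background CTR for every example yields NCE = 1, which is why values below 1 indicate the model has learned something beyond the average.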