Logarithmic loss (or cross-entropy)
Logarithmic loss (or cross-entropy) measures the performance of a classification model whose predictions are probability values between 0 and 1. The loss increases as the predicted probability diverges from the actual label, so the goal of training is to minimize it. It is also widely used in Kaggle competitions to score submissions.
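As a minimal sketch of the idea, the binary log loss can be computed directly with NumPy (the function name and the clipping epsilon below are illustrative choices, not from the original text):

```python
import numpy as np

def log_loss(y_true, y_pred, eps=1e-15):
    """Binary cross-entropy averaged over samples."""
    # Clip predictions away from 0 and 1 to avoid log(0)
    y_pred = np.clip(np.asarray(y_pred, dtype=float), eps, 1 - eps)
    y_true = np.asarray(y_true, dtype=float)
    # -mean( y*log(p) + (1-y)*log(1-p) )
    return float(-np.mean(y_true * np.log(y_pred)
                          + (1 - y_true) * np.log(1 - y_pred)))

# Confident, correct predictions yield a small loss;
# confident, wrong predictions are penalized heavily.
print(log_loss([1, 0, 1], [0.9, 0.1, 0.8]))   # small loss
print(log_loss([1, 0, 1], [0.1, 0.9, 0.2]))   # large loss
```

Note how the clipping step matters in practice: a prediction of exactly 0 or 1 would otherwise produce an infinite loss for a single misclassified sample.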