
Sorry, but the formulas you gave for the log loss and the cross-entropy loss are wrong. Log loss is the same thing as binary cross-entropy loss. The formula you gave for the cross-entropy loss actually applies to binary classification and should be called binary cross-entropy, while the formula you gave for the log loss is simply incorrect. To compute the cross-entropy for a multi-class classification problem, you need to sum one term, y log(y_hat), per class.

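For reference, here is a sketch of the standard definitions as I understand them (the notation below is mine: N examples, C classes, y the true labels, and y_hat the predicted probabilities):

```latex
% Binary cross-entropy (log loss) over N examples:
% y_i is the true label (0 or 1), \hat{y}_i the predicted probability of the positive class.
\[
\mathcal{L}_{\mathrm{BCE}}
  = -\frac{1}{N} \sum_{i=1}^{N}
    \left[ y_i \log \hat{y}_i + (1 - y_i) \log\!\left(1 - \hat{y}_i\right) \right]
\]

% Categorical (multi-class) cross-entropy over N examples and C classes:
% y_{i,c} is 1 if example i belongs to class c (one-hot encoding),
% \hat{y}_{i,c} the predicted probability of class c for example i.
\[
\mathcal{L}_{\mathrm{CE}}
  = -\frac{1}{N} \sum_{i=1}^{N} \sum_{c=1}^{C} y_{i,c} \log \hat{y}_{i,c}
\]
```

With C = 2 and the two predicted probabilities summing to 1, the second formula reduces to the first, which is why log loss and binary cross-entropy are the same quantity.
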
Adrien Biarnes