Beyond Binary Classification — Breaking down Multiple Logistic Regression to its basics | by Josep Ferrer | Feb, 2024

In the world of data and computer programs, the concept of Machine Learning might sound like a tough nut to crack, full of tricky math and complex ideas.

That is why I want to slow down and go over the basics that make it all work, in a new issue of my MLBasics series.

Today’s agenda is giving our good old Logistic Regression a swanky upgrade.

Why?

By default, Logistic Regression is limited to two-class problems. However, we often face multiple-class problems.

So let’s dive into the fascinating world of leveling up Logistic Regression to be able to sort things into more than two baskets 👇🏻

In the ML field, Logistic Regression stands as an optimal choice for binary classification problems.

It is the trusted path towards clear yes-or-no decisions.

Image by the author. Logistic Regression.
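Before upgrading it, here is what the binary case looks like in practice. The snippet below is a minimal sketch with scikit-learn; the synthetic two-class dataset and its parameters are assumptions made purely for illustration:

```python
# Minimal sketch: binary Logistic Regression on toy data
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Synthetic two-class dataset: 500 samples, 4 informative features
X, y = make_classification(n_samples=500, n_features=4, n_informative=4,
                           n_redundant=0, n_classes=2, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2,
                                                    random_state=42)

# Fit the binary classifier and check its accuracy on held-out data
model = LogisticRegression()
model.fit(X_train, y_train)
print(f"Test accuracy: {model.score(X_test, y_test):.2f}")

# The sigmoid gives one probability per class, and the two sum to 1
print(model.predict_proba(X_test[:1]).round(2))
```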

However, there is a big catch with Logistic Regression: it works like a coin toss. Heads or tails, A or B.

But what if you have multiple classes?

Image by Author. Multiple classes to classify.

Logistic Regression alone cannot handle multi-class classification. To do so, the model needs to be adapted, and there are two main options:

  • The first, simple approach is to train multiple Simple Logistic Regression models, one to identify each of the classes we care about. It is a straightforward strategy, often called One-vs-Rest.
  • A second approach is to generalize the model itself so that it natively handles multiple classes, known as multinomial (or Softmax) regression. Both options are sketched in code right after this list.

So let’s break down both approaches:
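To make the two options concrete, here is a minimal sketch of both, again with scikit-learn; the synthetic three-class dataset is an assumption for the example, not data from the article:

```python
# Minimal sketch: two ways to take Logistic Regression beyond two classes
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OneVsRestClassifier

# Toy data: 3 classes instead of 2
X, y = make_classification(n_samples=500, n_features=6, n_informative=4,
                           n_classes=3, random_state=42)

# Option 1: One-vs-Rest -- one binary Logistic Regression per class,
# each answering "is it this class, or any of the others?"
ovr = OneVsRestClassifier(LogisticRegression()).fit(X, y)

# Option 2: a single multinomial (Softmax) model; scikit-learn's
# default solver fits this automatically when y has 3+ classes
softmax = LogisticRegression().fit(X, y)

print("OvR predictions:    ", ovr.predict(X[:5]))
print("Softmax predictions:", softmax.predict(X[:5]))
```

In this sketch, `OneVsRestClassifier` implements the first option by training one binary model per class behind the scenes, while a plain `LogisticRegression` already fits the multinomial model once the target has more than two classes.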
