# Logistic Regression

A short introduction to the logistic regression classification algorithm.

Logistic regression is a discriminative classification model.

## Types of classification models

1. Generative classification models: this approach models the class-conditional densities $p(\textbf{x}|C_k)$ and the class priors $p(C_k)$. Knowing these two quantities, we can obtain the posterior class probabilities using Bayes' theorem:

$$p(C_k|\textbf{x}) = \frac{p(\textbf{x}|C_k)p(C_k)}{p(\textbf{x})}\label{eq}$$

A generative model has more parameters and takes longer to train on high-dimensional data. On the other hand, the marginal density $p(\textbf{x})$ can be used to detect new data points that have low probability under the model; for such points the classification given by $\ref{eq}$ won't be reliable.

2. Discriminative models: these models find the posterior probabilities $p(C_k|\textbf{x})$ directly, and decisions are based only on them. The probabilities are useful, for example, when a point has nearly equal probabilities for each class: in that case we can refrain from making a decision, since it would not be accurate. This is impossible with the discriminant functions described in the next item. A discriminative model is also easier to train than a generative one.

3. Discriminant function approach: this approach only has a discriminant function $f(\textbf{x})$ that maps each input directly to a particular class $C_k$. It has no probabilistic interpretation and is usually faster to train: you gain speed but lose the probabilistic interpretation, which can be useful in some cases.
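To make the generative approach concrete, here is a minimal numeric sketch. The 1-D Gaussian class-conditional densities, the means, the variance, and the priors are all made-up values for illustration; the point is only how Bayes' theorem turns $p(\textbf{x}|C_k)$ and $p(C_k)$ into posteriors.

```python
from math import exp, pi, sqrt

def gaussian_pdf(x, mean, var):
    """Density of a 1-D Gaussian with the given mean and variance."""
    return exp(-(x - mean) ** 2 / (2 * var)) / sqrt(2 * pi * var)

# Assumed class priors p(C_k) and class-conditional densities p(x|C_k)
# (two classes, 1-D Gaussians with unit variance) -- illustrative values only.
priors = [0.6, 0.4]
means = [0.0, 2.0]
var = 1.0

x = 1.0
likelihoods = [gaussian_pdf(x, m, var) for m in means]

# Bayes' theorem: p(C_k|x) = p(x|C_k) p(C_k) / p(x),
# where the marginal p(x) is the sum over classes of p(x|C_k) p(C_k).
evidence = sum(l * p for l, p in zip(likelihoods, priors))
posteriors = [l * p / evidence for l, p in zip(likelihoods, priors)]

# The posteriors sum to 1; here x is equidistant from both means,
# so they simply reproduce the priors.
print(posteriors)
```

A low value of `evidence` (the marginal $p(\textbf{x})$) is exactly the signal mentioned above that a point lies far from the training data.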

## Deriving the logistic regression equation

First, let’s consider the case of two classes. The posterior probability of class $C_1$ can be written as:

$$p(C_1|\textbf{x}) = \frac{p(\textbf{x}|C_1)p(C_1)}{p(\textbf{x}|C_1)p(C_1) + p(\textbf{x}|C_2)p(C_2)}$$

Then we will rewrite the same equation in a different form:

$$p(C_1|\textbf{x}) = \sigma(a) = \frac{1}{1 + e^{-a}}\label{logistic}$$

Where:

$$a = \ln\frac{p(\textbf{x}|C_1)p(C_1)}{p(\textbf{x}|C_2)p(C_2)}$$

This is the same equation, as you can check by substituting $a$ back in.
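As a quick numeric sanity check of that substitution, here is a sketch with made-up values standing in for the two quantities $p(\textbf{x}|C_k)p(C_k)$: the posterior computed directly from Bayes' theorem matches $\sigma(a)$ with $a$ defined as the log-ratio above.

```python
from math import exp, log

def sigmoid(a):
    """Logistic function: 1 / (1 + e^{-a})."""
    return 1.0 / (1.0 + exp(-a))

# Assumed (illustrative) values of p(x|C_1)p(C_1) and p(x|C_2)p(C_2).
s1, s2 = 0.3, 0.1

# Posterior of C_1 directly from Bayes' theorem:
direct = s1 / (s1 + s2)

# The same posterior via sigma(a) with a = ln(s1 / s2):
a = log(s1 / s2)
via_sigmoid = sigmoid(a)

print(direct, via_sigmoid)  # both equal 0.75 here
```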

And for the case of $K\gt2$ classes:

$$p(C_k|\textbf{x}) = \frac{p(\textbf{x}|C_k)p(C_k)}{\sum_j p(\textbf{x}|C_j)p(C_j)}$$

We will rewrite it as:

$$p(C_k|\textbf{x}) = \frac{e^{a_k}}{\sum_j e^{a_j}}\label{softmax}$$

Where:

$$a_k = \ln p(\textbf{x}|C_k)p(C_k)$$

Function $\ref{logistic}$ is called the logistic (or sigmoid) function, and function $\ref{softmax}$ is called the softmax function.
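The multiclass identity can be checked numerically the same way. In this sketch the three values standing in for $p(\textbf{x}|C_k)p(C_k)$ are made up; since $e^{a_k} = e^{\ln p(\textbf{x}|C_k)p(C_k)} = p(\textbf{x}|C_k)p(C_k)$, the softmax of the $a_k$ reproduces the Bayes posteriors exactly.

```python
from math import exp, log

def softmax(scores):
    """Softmax: exponentiate each score and normalize to sum to 1."""
    exps = [exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Assumed (illustrative) values of p(x|C_k) p(C_k) for K = 3 classes.
joint = [0.2, 0.5, 0.3]

# Posteriors directly from Bayes' theorem:
direct = [j / sum(joint) for j in joint]

# The same posteriors via softmax of a_k = ln p(x|C_k) p(C_k):
a = [log(j) for j in joint]
via_softmax = softmax(a)

print(direct, via_softmax)  # the two lists agree
```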
