Logistic regression is a popular classifier for problems where the class label is binary and the feature variables are numeric. In linear regression, we were predicting y based on the values of x₁, …, xₚ. The model was: y = β₀ + β₁x₁ + ⋯ + βₚxₚ. The coefficients β₀, β₁, …, βₚ were computed by minimizing the residual sum of squares. Now suppose that y is a binary class label instead of a numeric variable, and the xᵢ's are numeric feature variables. We might be tempted to fit a least-squares linear regression model, but there are several problems. For example, our predictions for y may be below zero or above one, but in the new problem, y is a binary class label whose value can only be 0 or 1. So, it would be difficult to interpret the results.
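To make the least-squares step concrete, here is a minimal sketch (on assumed synthetic data) of fitting y = β₀ + β₁x by minimizing the residual sum of squares, using NumPy's `lstsq`:

```python
import numpy as np

# Synthetic data (assumed for illustration): y = 2 + 3x plus small noise.
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 2.0 + 3.0 * x + rng.normal(scale=0.1, size=x.size)

# Design matrix with an intercept column; lstsq returns the coefficients
# that minimize the residual sum of squares ||y - X @ coef||^2.
X = np.column_stack([np.ones_like(x), x])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
b0, b1 = coef
print(b0, b1)  # recovers values close to 2.0 and 3.0
```

The same machinery applied to a 0/1 label would happily produce fitted values outside [0, 1], which is exactly the interpretation problem described above.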
Instead, we can use logistic regression. Here, the model is: z = 1 / (1 + e^−(β₀ + β₁x₁ + ⋯ + βₚxₚ)). We can interpret z as the probability that y = 1. Note that z ranges between 0 and 1.
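A small sketch of this model (with illustrative, hand-picked coefficients, not fitted ones) shows that the logistic function always produces outputs strictly between 0 and 1:

```python
import numpy as np

# Logistic model z = 1 / (1 + exp(-(b0 + b1*x))).
# The coefficients b0, b1 here are arbitrary values chosen for illustration.
def predict_proba(x, b0=-1.0, b1=2.0):
    return 1.0 / (1.0 + np.exp(-(b0 + b1 * x)))

x = np.array([-10.0, 0.0, 0.5, 10.0])
z = predict_proba(x)
print(z)  # every value lies strictly between 0 and 1
```

Even for extreme inputs, the output only approaches 0 or 1 asymptotically, which is what makes z interpretable as a probability.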
See the following Wikipedia article for more info: Logistic Regression.
Maximum Likelihood Estimation of Logistic Regression