Although all of these methods aim to model the association between a dependent variable and one or more independent variables, different methods are preferable in different situations depending on the type and distribution of the data.
Some of the methods of regression are:
1. Linear Regression:
This is the most widely known and used regression method. A relationship is established between the dependent variable and one or more independent variables using a best-fit straight line called the regression line. Simple linear regression has only one independent variable, while multiple linear regression has more than one. The dependent variable in a linear regression model is always continuous, while the independent variables can be either discrete or continuous.
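As a minimal sketch, simple linear regression can be fitted with the closed-form least-squares formulas (slope = covariance of x and y divided by variance of x). The data below is made up purely for illustration:

```python
def fit_simple_linear(xs, ys):
    """Fit y = slope*x + intercept by ordinary least squares."""
    n = len(xs)
    x_mean = sum(xs) / n
    y_mean = sum(ys) / n
    # sum of cross-deviations and squared deviations of x
    sxy = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys))
    sxx = sum((x - x_mean) ** 2 for x in xs)
    slope = sxy / sxx
    intercept = y_mean - slope * x_mean
    return slope, intercept

# toy data: roughly y = 2x with a little noise
xs = [1, 2, 3, 4, 5]
ys = [2.1, 3.9, 6.2, 7.8, 10.1]
slope, intercept = fit_simple_linear(xs, ys)
```

On this data the fitted slope comes out close to 2 and the intercept close to 0, matching the pattern the points were generated from.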
2. Logistic Regression:
The dependent variable is binary, whereas the independent variables can be binary or continuous. A large sample size is required to perform the regression because maximum likelihood estimates are unreliable for small samples. If the values of the dependent variable are ordinal, the model is called ordinal logistic regression; when the dependent variable is multi-class, it is called multinomial logistic regression.
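In practice the maximum likelihood fit is found numerically; a minimal sketch is gradient descent on the log-likelihood for a single feature (the data and learning rate here are invented for illustration):

```python
import math

def fit_logistic(xs, ys, lr=0.5, epochs=5000):
    """Fit P(y=1|x) = sigmoid(w*x + b) by gradient descent."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        gw = gb = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(w * x + b)))  # predicted probability
            gw += (p - y) * x   # gradient of negative log-likelihood w.r.t. w
            gb += (p - y)       # ... w.r.t. b
        w -= lr * gw / n
        b -= lr * gb / n
    return w, b

def predict_proba(x, w, b):
    return 1.0 / (1.0 + math.exp(-(w * x + b)))

# toy binary outcome: class switches from 0 to 1 around x = 2.5
xs = [0, 1, 2, 3, 4, 5]
ys = [0, 0, 0, 1, 1, 1]
w, b = fit_logistic(xs, ys)
```

After fitting, `predict_proba` gives probabilities above 0.5 for points on the positive side of the learned boundary and below 0.5 on the other side.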
3. Polynomial Regression:
A regression model in which the independent variable (or variables) appears with a power greater than one is called a polynomial regression model. The best-fit line is a curve that fits the data points.
4. Ridge Regression:
When the data exhibits multicollinearity, i.e., the independent variables are highly correlated, regression is performed using the ridge regression model. Ridge regression adds a degree of bias to the estimates in order to reduce their standard errors, so that the difference between the observed and true values can be minimised.
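Ridge regression has the closed form w = (XᵀX + λI)⁻¹Xᵀy. As a minimal sketch under made-up data, the two-feature case can be solved with the 2×2 matrix inverse; the two features below are nearly identical (highly correlated), which is exactly the situation where the penalty λ stabilises the solution:

```python
def fit_ridge_2feature(X, y, lam):
    """Closed-form ridge for two features (no intercept):
    w = (X^T X + lam*I)^{-1} X^T y, via the explicit 2x2 inverse."""
    s11 = sum(r[0] * r[0] for r in X) + lam
    s12 = sum(r[0] * r[1] for r in X)
    s22 = sum(r[1] * r[1] for r in X) + lam
    t1 = sum(r[0] * yy for r, yy in zip(X, y))
    t2 = sum(r[1] * yy for r, yy in zip(X, y))
    det = s11 * s22 - s12 * s12
    w1 = (s22 * t1 - s12 * t2) / det
    w2 = (s11 * t2 - s12 * t1) / det
    return w1, w2

# toy data with two almost-collinear features and y ~ 2 * x1
X = [[1, 1.01], [2, 1.99], [3, 3.02], [4, 3.98]]
y = [2, 4, 6, 8]
w1, w2 = fit_ridge_2feature(X, y, lam=1.0)
```

With the penalty, the weight is shared roughly equally between the two correlated features (each near 1), and predictions remain close to the target: at the point [1, 1] the fitted value is close to 2.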
Let's check what you have learned:
Written by Sarthak Goel (pre-final year, B.Com (Hons), Hindu College; 4 actuarial papers passed from the IFoA and IAI)