The Taxonomy Of Regression Algorithms That Many Don't Bother To Remember
8 standard regression algorithms summarised in a single frame.
Regression algorithms allow us to model the relationship between a dependent variable and one or more independent variables.
After estimating the parameters of a regression model, we can gain insight into how changes in one variable affect another.
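For instance, in simple linear regression the fitted slope tells you how much the dependent variable changes per unit change in the independent variable. Here is a minimal NumPy sketch of that idea (the toy data and its true slope of 2 are made up for illustration):

```python
import numpy as np

# Toy data: y = 2*x + 1 plus a little noise
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 2 * x + 1 + rng.normal(scale=0.1, size=x.size)

# Fit y = b0 + b1*x by ordinary least squares
X = np.column_stack([np.ones_like(x), x])  # design matrix with intercept column
b0, b1 = np.linalg.lstsq(X, y, rcond=None)[0]

# b1 estimates how much y changes per unit change in x
print(f"intercept ~ {b0:.2f}, slope ~ {b1:.2f}")
```

The recovered slope lands close to the true value of 2, which is exactly the "insight into how changes in one variable affect another" that a fitted regression model provides.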
Since these algorithms are widely used in data science, knowing their various forms is crucial to conveying precisely which one you are using.
Here are eight of the most standard regression algorithms described in a single line:
Simple Linear Regression: One independent (x) and one dependent (y) variable.
Polynomial Linear Regression: Polynomial features and one dependent (y) variable.
Multiple Linear Regression: Arbitrary features and one dependent (y) variable.
Lasso Regression: Linear Regression with L1 Regularization.
Ridge Regression: Linear Regression with L2 Regularization.
Elastic Net: Linear Regression with BOTH L1 and L2 Regularization.
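The six linear variants above can be sketched in a few lines of scikit-learn. This is a minimal illustration, not a tuned setup: the toy data, the regularization strengths (`alpha`, `l1_ratio`), and the polynomial degree are all assumptions chosen for the example.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Lasso, Ridge, ElasticNet
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

# Toy data with two features: y = 1.5*x1 - 2.0*x2 + noise
rng = np.random.default_rng(42)
X = rng.uniform(-3, 3, size=(200, 2))
y = 1.5 * X[:, 0] - 2.0 * X[:, 1] + rng.normal(scale=0.1, size=200)

models = {
    "multiple linear": LinearRegression(),
    "lasso (L1)": Lasso(alpha=0.01),
    "ridge (L2)": Ridge(alpha=1.0),
    "elastic net (L1+L2)": ElasticNet(alpha=0.01, l1_ratio=0.5),
    # Polynomial regression = polynomial features fed into a linear model
    "polynomial": make_pipeline(PolynomialFeatures(degree=2), LinearRegression()),
}

for name, model in models.items():
    model.fit(X, y)
    print(f"{name:20s} R^2 = {model.score(X, y):.3f}")
```

Note how polynomial regression is still *linear* regression: the model stays linear in its parameters, only the features are transformed.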
Categorical Probability Prediction
Logistic Regression: Predict binary outcome probability.
Multinomial Logistic Regression (or Softmax Regression): Predict multiple categorical probabilities.
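Both logistic variants are also available in scikit-learn through one class. A minimal sketch, with made-up toy data (the class boundaries here are illustrative assumptions):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Binary case: class depends on the sign of a linear score
Xb = rng.normal(size=(300, 2))
yb = (Xb[:, 0] + Xb[:, 1] > 0).astype(int)
binary = LogisticRegression().fit(Xb, yb)
print(binary.predict_proba(Xb[:1]))  # [P(class 0), P(class 1)]

# Multinomial (softmax) case: three classes cut along one feature
Xm = rng.normal(size=(300, 2))
ym = np.digitize(Xm[:, 0], bins=[-0.5, 0.5])  # classes 0, 1, 2
multi = LogisticRegression().fit(Xm, ym)
print(multi.predict_proba(Xm[:1]))  # one probability per class, summing to 1
```

Despite the name, logistic regression is a classification algorithm; it earns its place in this taxonomy because it *regresses* the probability of an outcome.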
Over to you: What other regression algorithms would you include here?
👉 If you love reading this newsletter, feel free to share it with friends!
Find the code for my tips here: GitHub.