Interpreting Regression Coefficients

Steph Locke explains what beta values on parameters in a regression actually signify:

When we read the list of coefficients, here is how we interpret them:

  • The intercept is the starting point: if you knew no other information, it would be the best guess for the outcome.

  • Each coefficient multiplies the corresponding column, refining the prediction: it tells us how much a one-unit change in that column shifts the prediction.

  • When you use a categorical variable, in R the intercept absorbs the reference level of that column, so it represents the baseline prediction for that level. Every other level then gets a modifier applied on top of the base prediction.
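The bullet points above can be sketched numerically. This is a minimal illustration in Python (the post itself works in R); the data, column names, and coefficient values are all made up for the example, with the categorical variable dummy-coded by hand:

```python
import numpy as np

# Hypothetical data generated as: score = 8 + 2*hours, plus +3 when group == "b"
hours = np.array([1, 2, 3, 4, 5, 6], dtype=float)
group_b = np.array([0, 0, 0, 1, 1, 1], dtype=float)  # dummy for level "b"
score = 8 + 2 * hours + 3 * group_b

# Design matrix: intercept column, numeric column, categorical dummy column
X = np.column_stack([np.ones_like(hours), hours, group_b])
beta, *_ = np.linalg.lstsq(X, score, rcond=None)

intercept, b_hours, b_group = beta
# intercept (8): the best guess when hours = 0 and group is the baseline "a"
# b_hours   (2): how much a one-unit increase in hours shifts the prediction
# b_group   (3): the modifier added when group == "b", relative to the baseline
```

Because the toy data is noiseless, the recovered coefficients match the generating values exactly, which makes the three interpretations easy to check by hand.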

Linear regression is easy, but the real value here is Steph’s explanation of logistic regression coefficients.
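For logistic regression, the coefficients live on the log-odds scale rather than the outcome scale, so interpreting them takes one extra step: exponentiating a coefficient gives an odds ratio. A small sketch (the coefficient values here are invented for illustration, not taken from the post):

```python
import math

# Hypothetical fitted logistic-regression coefficients (log-odds scale)
intercept = -1.5   # log-odds of the outcome at the baseline
b_hours = 0.4      # log-odds shift per one-unit increase in hours

# Exponentiating turns a log-odds shift into an odds ratio:
odds_ratio = math.exp(b_hours)  # each extra hour multiplies the odds by ~1.49

# Predicted probability at hours = 2 via the inverse logit:
log_odds = intercept + b_hours * 2
prob = 1 / (1 + math.exp(-log_odds))
```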
