Gopi Subramanian discusses one of my favorite regression concepts, heteroskedasticity:
With a variance score of 0.43, linear regression did not do a good job overall. When the x values are close to 0, linear regression gives a good estimate of y, but near the end of the x range the predicted y is far from the actual values and becomes completely meaningless.
Here is where quantile regression comes to the rescue. I have used the Python package statsmodels 0.8.0 for quantile regression.
Let us begin by finding the regression coefficients for the conditional median, the 0.5 quantile.
The article doesn’t render the code very well, but Gopi does have the example code on GitHub, so you can follow along that way.
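In the meantime, here is a minimal sketch of fitting the conditional median with statsmodels. The synthetic heteroskedastic dataset and the column names are my own illustration, not Gopi's data:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic heteroskedastic data (illustrative only, not Gopi's dataset):
# the noise spreads out as x grows, which is what trips up plain OLS
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 200)
y = 2 * x + rng.normal(scale=0.5 + 0.5 * x)
df = pd.DataFrame({"x": x, "y": y})

# Fit the conditional median: quantile regression with q=0.5
res = smf.quantreg("y ~ x", df).fit(q=0.5)
print(res.params)
```

Changing `q` (say, to 0.1 or 0.9) fits other conditional quantiles, which is how quantile regression traces out the widening spread that a single OLS line misses.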