Lecture 2, The truth about linear regression: Using Taylor's theorem to justify linear regression locally. Collinearity. Consistency of ordinary least squares estimates under weak conditions (non-Gaussian noise, non-independent noise, non-constant variance, dependent predictors). Linear regression coefficients will change with the distribution of the input variables: examples. Why \( R^2 \) is usually a distraction. Linear regression coefficients will change with the distribution of unobserved variables (omitted variable effects). Errors in variables. Transformations of inputs and of outputs. Utility of probabilistic assumptions; the importance of looking at the residuals. What it really means when coefficients are significantly non-zero. What "controlled for in a linear regression" really means.
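A minimal R sketch (my own illustration, not taken from the notes) of the central point above: when the true regression function is nonlinear, the least-squares slope is only a local, distribution-weighted linear approximation, so it changes when the distribution of the inputs changes, even though the underlying relationship does not. The function f and the two sampling ranges are illustrative choices.

# True regression function is nonlinear: E[Y|X=x] = x^2
f <- function(x) { x^2 }
n <- 1e4
x.low  <- runif(n, 0, 1)           # inputs concentrated near 0.5
x.high <- runif(n, 2, 3)           # inputs concentrated near 2.5
y.low  <- f(x.low)  + rnorm(n)     # same noise process in both samples
y.high <- f(x.high) + rnorm(n)
coefficients(lm(y.low ~ x.low))    # slope close to 1, i.e. f'(x) near x = 0.5
coefficients(lm(y.high ~ x.high))  # slope close to 5, i.e. f'(x) near x = 2.5

Neither fitted slope is "wrong"; each is the best linear approximation to \( x^2 \) over the region where that sample's inputs actually fall, which is the Taylor's-theorem justification for linear regression as a local approximation.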
Reading: Notes, chapter 2 (R); Notes, appendix B
Optional reading: Faraway, rest of chapter 1
Posted at January 26, 2013 21:35 | permanent link