Advanced Linear Models for Data Science 1: Least Squares

By: Coursera

  • Background
    • We cover some basic matrix algebra results that we will need throughout the class. This includes some basic vector derivatives. In addition, we cover some basic uses of matrices to create summary statistics from data. This includes calculating and subtracting means from observations (centering) as well as calculating the variance.
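As an illustrative sketch (not from the course materials), the centering and variance operations described above can be written as matrix operations in NumPy: the centering matrix C = I − (1/n)11ᵀ subtracts the mean from each observation, and the sample variance is the quadratic form xᵀCx / (n − 1).

```python
import numpy as np

n = 5
x = np.arange(1.0, n + 1)  # toy data: [1, 2, 3, 4, 5]

# Centering matrix C = I - (1/n) 1 1^T; C x subtracts the mean from each entry
C = np.eye(n) - np.ones((n, n)) / n
centered = C @ x

# Sample variance as the quadratic form x^T C x / (n - 1)
var = x @ C @ x / (n - 1)

assert np.allclose(centered, x - x.mean())
assert np.isclose(var, x.var(ddof=1))
```

Note that C is idempotent (C @ C equals C), which is why the quadratic form needs only one copy of C.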
  • One and two parameter regression
    • In this module, we cover the basics of regression through the origin and linear regression. Regression through the origin is an interesting case, as one can build up all of multivariate regression with it.
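A minimal sketch of regression through the origin (the data here are simulated, not from the course): minimizing ‖y − βx‖² over β gives the closed form β̂ = ⟨x, y⟩ / ⟨x, x⟩.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=100)
y = 2.0 * x + rng.normal(scale=0.1, size=100)  # true slope 2, no intercept

# Regression through the origin: beta minimizes ||y - beta * x||^2,
# which yields beta = <x, y> / <x, x>
beta = x @ y / (x @ x)
```

The estimate should land close to the true slope of 2 since the noise is small.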
  • Linear regression
    • In this lecture, we focus on linear regression, the most standard technique for investigating unconfounded linear relationships.
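As a quick sketch with simulated data (an assumption for illustration, not course code), the simple linear regression coefficients follow the classic formulas β̂₁ = Cov(x, y) / Var(x) and β̂₀ = ȳ − β̂₁x̄:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=200)
y = 1.5 + 3.0 * x + rng.normal(scale=0.5, size=200)  # true intercept 1.5, slope 3

# Slope from the covariance/variance ratio, intercept from the means
beta1 = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)
beta0 = y.mean() - beta1 * x.mean()
```

Equivalently, the slope is the correlation times sd(y)/sd(x), which connects regression to the summary statistics from the Background module.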
  • General least squares
    • We now move on to general least squares where an arbitrary full rank design matrix is fit to a vector outcome.
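A sketch of the general case, assuming a simulated full-rank design matrix X: the least-squares solution satisfies the normal equations XᵀXβ̂ = Xᵀy.

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 100, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])  # full-rank design
beta = np.array([1.0, -2.0, 0.5])                               # true coefficients
y = X @ beta + rng.normal(scale=0.1, size=n)

# Solve the normal equations X^T X beta_hat = X^T y
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)

# In practice np.linalg.lstsq is the numerically preferable route
beta_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)
assert np.allclose(beta_hat, beta_lstsq)
```

Forming XᵀX explicitly squares the condition number of the problem, which is why `lstsq` (QR/SVD-based) is preferred for ill-conditioned designs.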
  • Least squares examples
    • Here we give some canonical examples of linear models to relate them to techniques that you may already be using.
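One canonical example of this kind (a sketch with simulated data, not taken from the course): the two-group mean comparison behind the t-test is a linear model with an intercept and a group indicator, where the intercept recovers the reference-group mean and the slope recovers the difference in means.

```python
import numpy as np

rng = np.random.default_rng(3)
a = rng.normal(loc=0.0, size=50)  # group A
b = rng.normal(loc=1.0, size=50)  # group B

y = np.concatenate([a, b])
indicator = np.r_[np.zeros(50), np.ones(50)]       # 1 marks membership in group B
X = np.column_stack([np.ones(100), indicator])

beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)

# Intercept = mean of group A; slope = difference in group means
assert np.isclose(beta_hat[0], a.mean())
assert np.isclose(beta_hat[1], b.mean() - a.mean())
```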
  • Bases and residuals
    • Here we present a very useful kind of linear model: decomposing a signal into a basis expansion, with the residuals capturing whatever the basis leaves unexplained.
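A sketch of a basis decomposition, assuming a small Fourier basis as the design matrix (the signal and basis here are illustrative choices): least squares projects the signal onto the span of the basis columns, and the residual is orthogonal to every column.

```python
import numpy as np

t = np.linspace(0, 1, 200, endpoint=False)
y = np.sin(2 * np.pi * t) + 0.3 * np.cos(2 * np.pi * 3 * t)  # a signal in the basis span

# Design matrix whose columns are a few sine and cosine basis functions
X = np.column_stack([np.sin(2 * np.pi * k * t) for k in (1, 2, 3)] +
                    [np.cos(2 * np.pi * k * t) for k in (1, 2, 3)])

coef, *_ = np.linalg.lstsq(X, y, rcond=None)
fitted = X @ coef
resid = y - fitted

# Residuals from least squares are orthogonal to every basis column
assert np.allclose(X.T @ resid, 0.0)
```

Because this particular signal lies in the span of the basis, the residual is essentially zero; for a signal outside the span, the fitted values are its projection onto the basis and the residual carries the remainder.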