We have reviewed linear regression in algebraic notation and introduced the matrix notation and operations needed for more complicated models. This chapter presents the general linear model for any number of independent variables, develops the normal equations, and derives their solution. The matrix formulation for the variances of linear functions is then used to obtain measures of precision for the estimates.
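As a brief preview of that development, the key results can be written in matrix form as follows. This is only a sketch in the common notation, anticipating the definitions given later in the chapter: here $\boldsymbol{Y}$ denotes the vector of observations, $\boldsymbol{X}$ the matrix of independent variables, $\boldsymbol{\beta}$ the vector of parameters, $\boldsymbol{\epsilon}$ the vector of random errors with variance $\sigma^2$, and $\hat{\boldsymbol{\beta}}$ the least squares estimator.

\[
\boldsymbol{Y} = \boldsymbol{X}\boldsymbol{\beta} + \boldsymbol{\epsilon},
\qquad
\boldsymbol{X}'\boldsymbol{X}\hat{\boldsymbol{\beta}} = \boldsymbol{X}'\boldsymbol{Y},
\qquad
\hat{\boldsymbol{\beta}} = (\boldsymbol{X}'\boldsymbol{X})^{-1}\boldsymbol{X}'\boldsymbol{Y},
\qquad
\operatorname{Var}(\hat{\boldsymbol{\beta}}) = (\boldsymbol{X}'\boldsymbol{X})^{-1}\sigma^2 .
\]

The second expression gives the normal equations, the third their solution when $\boldsymbol{X}'\boldsymbol{X}$ is nonsingular, and the last the variance–covariance matrix from which the measures of precision are derived.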