
Regression models describe the relationship between a *dependent
variable*, *y*, and *independent
variable* or variables, *X*. The dependent
variable is also called the *response variable*.
Independent variables are also called *explanatory* or *predictor
variables*. Continuous predictor variables
might be called *covariates*, whereas categorical
predictor variables might also be referred to as *factors*.
The matrix, *X*, of observations on predictor variables
is usually called the *design matrix*.

A multiple linear regression model is

$$y_i = \beta_0 + \beta_1 X_{i1} + \beta_2 X_{i2} + \cdots + \beta_p X_{ip} + \varepsilon_i, \quad i = 1, \ldots, n,$$

where

- *y*_{i} is the *i*th response.
- *β*_{k} is the *k*th coefficient, where *β*_{0} is the constant term in the model. Sometimes, design matrices might include information about the constant term. However, `fitlm` or `stepwiselm` by default includes a constant term in the model, so you must not enter a column of 1s into your design matrix *X*.
- *X*_{ij} is the *i*th observation on the *j*th predictor variable, *j* = 1, ..., *p*.
- *ε*_{i} is the *i*th noise term, that is, random error.
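The model above can be simulated numerically. Since `fitlm` and `stepwiselm` are MATLAB functions, the following is a minimal NumPy sketch (all numeric values are made up for illustration); it builds the column of 1s for the constant term explicitly, which is the step `fitlm` performs for you:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes and coefficients (not from the original text).
n, p = 100, 2
X = rng.normal(size=(n, p))           # n observations on p predictors
beta = np.array([1.5, -2.0, 0.5])     # [beta_0, beta_1, beta_2]; beta_0 is the constant term
eps = rng.normal(scale=0.1, size=n)   # noise terms epsilon_i

# y_i = beta_0 + beta_1*X_i1 + beta_2*X_i2 + eps_i.
# The explicit column of ones carries the constant term; a tool such as
# MATLAB's fitlm adds this column itself, so there you would pass X alone.
X1 = np.column_stack([np.ones(n), X])
y = X1 @ beta + eps
```

Dropping the column of ones here would force the fitted line through the origin, which is why double-counting the constant (a column of 1s in *X* plus the tool's own intercept) must be avoided.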

In general, a linear regression model can be a model of the form

$$y_i = \beta_0 + \sum_{k=1}^{K} \beta_k f_k(X_{i1}, X_{i2}, \ldots, X_{ip}) + \varepsilon_i, \quad i = 1, \ldots, n,$$

where each *f*_{k}(.) is
a scalar-valued function of the independent variables, *X*_{ij}s.
The functions *f*_{k} can have any form, including nonlinear functions or
polynomials, as long as the coefficients *β*_{k} enter the model linearly.
Some examples of linear models (linear in the coefficients, even though nonlinear in the predictors) are:

$$y_i = \beta_0 + \beta_1 X_{i1} + \beta_2 X_{i2} + \beta_3 X_{i1} X_{i2} + \varepsilon_i$$

$$y_i = \beta_0 + \beta_1 X_{i1} + \beta_2 X_{i1}^2 + \varepsilon_i$$

The following, however, are not linear models, since they are
not linear in the unknown coefficients, *β*_{k}:

$$y_i = \beta_0 + \beta_1 X_{i1}^{\beta_2} + \varepsilon_i$$

$$y_i = \beta_0 e^{\beta_1 X_{i1}} + \varepsilon_i$$
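The point that "linear" means linear in the coefficients can be made concrete: a cubic polynomial is a linear model because each term is a coefficient times a known function of the predictor, so ordinary least squares recovers the coefficients directly. A NumPy sketch with made-up coefficients:

```python
import numpy as np

rng = np.random.default_rng(1)

# y = 2 - x + 0.5*x^2 + 3*x^3 + noise is *linear* in the coefficients,
# even though it is nonlinear in x.
x = rng.uniform(-1.0, 1.0, size=200)
y = 2.0 - 1.0 * x + 0.5 * x**2 + 3.0 * x**3 + rng.normal(scale=0.01, size=200)

# Design matrix whose columns are the basis functions f_k(x) = x^k.
F = np.column_stack([x**k for k in range(4)])

# Linear least squares recovers approximately [2.0, -1.0, 0.5, 3.0].
b, *_ = np.linalg.lstsq(F, y, rcond=None)
```

A model such as y = β₀·exp(β₁x), by contrast, cannot be written as a design matrix times a coefficient vector, so no such one-shot linear solve exists for it.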

The usual assumptions for linear regression models are:

- The noise terms, *ε*_{i}, are uncorrelated.
- The noise terms, *ε*_{i}, have independent and identical normal distributions with mean zero and constant variance, σ^{2}. Thus,

  $$E(y_i) = \beta_0 + \sum_{k=1}^{K} \beta_k f_k(X_{i1}, X_{i2}, \ldots, X_{ip})$$

  and

  $$V(y_i) = V(\varepsilon_i) = \sigma^2.$$

  So the variance of *y*_{i} is the same for all levels of *X*_{ij}.
- The responses *y*_{i} are uncorrelated.
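These assumptions can be checked by simulation: holding the design fixed and redrawing only the noise, the sample mean of each response converges to the model mean and the sample variance is the same σ² at every level of the predictor. A small sketch with made-up values:

```python
import numpy as np

rng = np.random.default_rng(2)

# For a fixed design, the only randomness in y is the noise, so
# E(y_i) is the model mean and Var(y_i) = sigma^2 for every i.
n, reps, sigma = 5, 200_000, 0.5
X1 = np.column_stack([np.ones(n), np.arange(n, dtype=float)])  # constant + one predictor
beta = np.array([1.0, 2.0])
mean_y = X1 @ beta

# reps independent draws of the noise vector.
Y = mean_y + rng.normal(scale=sigma, size=(reps, n))

emp_mean = Y.mean(axis=0)   # close to X1 @ beta
emp_var = Y.var(axis=0)     # close to sigma^2 at every predictor level
```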

The fitted linear function is

$$\hat{y}_i = b_0 + \sum_{k=1}^{K} b_k f_k(X_{i1}, X_{i2}, \ldots, X_{ip}), \quad i = 1, \ldots, n,$$

where *ŷ*_{i} is the estimated response and the *b*_{k}s
are the fitted coefficients. The coefficients are estimated so as
to minimize the mean squared difference between the prediction vector *ŷ* and
the true response vector *y*. This method of estimating the coefficients is
called the method of least squares.
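Minimizing the mean squared difference has a closed-form solution; one standard route is to solve the normal equations (X′X)b = X′y. The NumPy sketch below (illustrative data, not from the text) compares that to the library's least-squares solver:

```python
import numpy as np

rng = np.random.default_rng(3)

# Illustrative data: constant term plus one predictor.
n = 50
X1 = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X1 @ np.array([0.7, 1.3]) + rng.normal(scale=0.2, size=n)

# Least squares: choose b to minimize ||y - X1 @ b||^2.
# One way is to solve the normal equations (X'X) b = X'y.
b = np.linalg.solve(X1.T @ X1, X1.T @ y)

# The dedicated solver gives the same minimizer (and is
# numerically preferable for ill-conditioned designs).
b_lstsq, *_ = np.linalg.lstsq(X1, y, rcond=None)
```

Production tools typically use a QR or SVD factorization rather than forming X′X explicitly, which squares the condition number of the problem.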

In a linear regression model of the form *y* = *β*_{1}*X*_{1} + *β*_{2}*X*_{2} +
... + *β*_{p}*X*_{p},
the coefficient *β*_{j} expresses
the impact of a one-unit change in the predictor variable, *X*_{j},
on the mean of the response, E(*y*), provided that all the other
variables are held constant.
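This interpretation follows directly from linearity, and a two-line check makes it concrete (coefficients and predictor values below are made up for the example):

```python
import numpy as np

# With y = beta_1*X_1 + beta_2*X_2 and no noise, y is the mean response.
beta = np.array([3.0, -1.5])
x = np.array([2.0, 4.0])

# Increase X_2 by one unit while holding X_1 fixed.
x_bumped = x + np.array([0.0, 1.0])

# The change in the mean response equals beta_2 = -1.5.
delta = beta @ x_bumped - beta @ x
```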


## See Also

`fitlm` | `LinearModel` | `stepwiselm`
