Level 27

Linear regression II


Ordinary least squares / OLS
Standard method for estimating the parameter beta in a linear regression model.
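
A minimal sketch of an OLS fit in Python, assuming a small simulated dataset and NumPy (the variable names and numbers are illustrative only, not part of the original card):

import numpy as np

rng = np.random.default_rng(0)
n = 50
x = rng.uniform(0, 10, size=n)
X = np.column_stack([np.ones(n), x])          # design matrix with an intercept column
y = 2.0 + 0.5 * x + rng.normal(0, 1, size=n)  # true beta = (2.0, 0.5) plus noise

# OLS chooses beta to minimize the sum of squared residuals ||y - X beta||^2
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta_hat)  # should be close to [2.0, 0.5]
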
Generalized least squares / GLS
Method for estimating the parameters beta in a linear regression model when observations are correlated or heteroscedastic.
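
A rough GLS sketch, assuming the error covariance matrix Omega is known (in practice it usually has to be estimated, as in feasible GLS):

import numpy as np

def gls(X, y, Omega):
    # GLS estimate: beta_hat = (X' Omega^{-1} X)^{-1} X' Omega^{-1} y,
    # which reduces to OLS when Omega is a multiple of the identity matrix.
    Omega_inv = np.linalg.inv(Omega)
    A = X.T @ Omega_inv @ X
    b = X.T @ Omega_inv @ y
    return np.linalg.solve(A, b)
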
Errors-in-variables model
A regression model that accounts for measurement errors in the independent variables x.
Statistical error
The difference between an observed value and the unobservable true expected value.
Residual
The difference between an observed value and an estimated expected value.
Studentized residual
The quotient of a residual by an estimate of its standard deviation.
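
For the internally studentized residual, one common explicit form (with \hat{\varepsilon}_i the i-th residual, \hat{\sigma} the estimated error standard deviation, and h_{ii} the i-th diagonal entry of the hat matrix H = X(X'X)^{-1}X') is:

\[ t_i = \frac{\hat{\varepsilon}_i}{\hat{\sigma}\sqrt{1 - h_{ii}}} \]
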
Anscombe's quartet
A set of four datasets that look completely different when graphed yet share nearly identical simple statistical properties.
Weak exogeneity
Term for the standard assumption that the predictor variables are error-free fixed values rather than random variables.
Overfitting
Term for fitting a regression model that is too complex for the given data, so that it fits the noise rather than the underlying relationship.
BLUE / Best linear unbiased estimator
An unbiased linear estimator with the lowest variance among all unbiased linear estimators.
Homoscedasticity
The property of a sequence of random variables in which all variables have the same finite variance.
Gauss–Markov theorem
The OLS estimator is BLUE in a linear regression model when the errors have zero expectation, are uncorrelated and homoscedastic.
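
Written out as formulas for the model y = X\beta + \varepsilon, the conditions on the errors stated in the card are:

\[ \mathbb{E}[\varepsilon_i] = 0, \qquad \operatorname{Var}(\varepsilon_i) = \sigma^2 < \infty, \qquad \operatorname{Cov}(\varepsilon_i, \varepsilon_j) = 0 \ \text{for } i \neq j. \]
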
A solution exists
First property of a well-posed problem according to Hadamard.
The solution is unique
Second property of a well-posed problem according to Hadamard.
The solution is stable (under changes of the initial condition)
Third property of a well-posed problem according to Hadamard.
Regularization
A method introducing additional information in order to solve an ill-posed problem or to prevent overfitting.
Ridge regression / Tikhonov regularization
A method to regularize a least squares problem by imposing an upper limit on the L2-norm of beta.
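
A minimal ridge regression sketch, using the penalized form that is equivalent to the upper limit on the L2-norm of beta (the penalty strength lam is a user-chosen assumption here):

import numpy as np

def ridge(X, y, lam):
    # Tikhonov-regularized normal equations:
    # beta_hat = (X'X + lam * I)^{-1} X'y
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)
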
Lasso method
A method to regularize a least squares problem by imposing an upper limit on the L1-norm of beta.
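
A lasso sketch using scikit-learn, assuming it is installed and that the regularization strength alpha is chosen by the user (e.g. via cross-validation); the data and names are illustrative only:

import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 10))
beta_true = np.array([3.0, -2.0] + [0.0] * 8)     # sparse true coefficients
y = X @ beta_true + rng.normal(0, 0.5, size=100)

model = Lasso(alpha=0.1).fit(X, y)
print(model.coef_)  # the L1 penalty drives many coefficients exactly to zero
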
Lack of multicollinearity
The standard assumption that the design matrix has full column rank.
Multiple linear regression
A linear regression with more than one scalar predictor variable x and one scalar response variable y.
General linear model
A generalization of multiple linear regression for a multivariate response variable y.
Generalized linear model / GLM
A generalization of ordinary linear regression for response variables that are not normally distributed.
Weighted least squares
A special case of GLS applied when the covariance matrix of the errors is diagonal, i.e. the errors are uncorrelated but may have unequal variances.
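
A weighted least squares sketch, assuming the per-observation error variances are known up to a constant so that each observation gets weight w_i = 1 / var_i:

import numpy as np

def wls(X, y, w):
    # WLS is GLS with a diagonal covariance matrix: weight each observation
    # by the reciprocal of its error variance.
    W = np.diag(w)
    A = X.T @ W @ X
    b = X.T @ W @ y
    return np.linalg.solve(A, b)
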
Hierarchical linear model / multilevel regression
Linear regression for the case in which the predictor variables can be associated with different hierarchical levels.
RSS/TSS
Formula for the coefficient of determination R^2 when TSS = ESS + RSS holds, in the convention where RSS denotes the regression (explained) sum of squares and ESS the error sum of squares.
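
Note that many texts use the opposite convention, with RSS the residual and ESS the explained sum of squares; written as a formula that covers both readings:

\[ R^2 = \frac{\text{explained sum of squares}}{\mathrm{TSS}} = 1 - \frac{\text{residual sum of squares}}{\mathrm{TSS}}, \qquad \mathrm{TSS} = \sum_i (y_i - \bar{y})^2. \]
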
Correlation coefficient
The coefficient of determination R^2 is the square of which quantity in simple linear regression?