Translation and Dictionary
Words near each other
・ Multichannel Multipoint Distribution Service
・ Multichannel News
・ Multichannel retailing
・ Multichannel television in Canada
・ Multichannel television sound
・ Multichannel video programming distributor
・ MultiChoice
・ Multicilia
・ Multicinema
・ Multicistronic message
・ Multiclass classification
・ Multiclavula
・ Multiclavula vernalis
・ Multiclet
・ Multicloud
Multicollinearity
・ Multicolor
・ Multicolor Active Galactic Nuclei Monitoring
・ Multicoloured Angels
・ Multicoloured Mosque
・ Multicoloured sea fan
・ Multicoloured tanager
・ Multicolumn countercurrent solvent gradient purification
・ MULTICOM
・ Multicommunicating
・ Multicomplex number
・ Multiconsult
・ Multicooker
・ Multicopper oxidase
・ Multicopy single-stranded DNA


Multicollinearity : Wikipedia English edition
Multicollinearity
In statistics, multicollinearity (also collinearity) is a phenomenon in which two or more predictor variables in a multiple regression model are highly correlated, meaning that one can be linearly predicted from the others with a substantial degree of accuracy. In this situation the coefficient estimates of the multiple regression may change erratically in response to small changes in the model or the data. Multicollinearity does not reduce the predictive power or reliability of the model as a whole, at least within the sample data set; it only affects calculations regarding individual predictors. That is, a multiple regression model with correlated predictors can indicate how well the entire bundle of predictors predicts the outcome variable, but it may not give valid results about any individual predictor, or about which predictors are redundant with respect to others.
In the case of perfect multicollinearity the predictor matrix is singular and therefore cannot be inverted. Under these circumstances, for a general linear model y = X \beta + \varepsilon, the ordinary least-squares estimator \hat{\beta}_{OLS} = (X^{T}X)^{-1}X^{T}y does not exist.
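As a minimal numerical sketch of this (the data and variable names below are invented for illustration), the following Python/NumPy snippet builds a design matrix whose second regressor is an exact linear function of the first and shows that the OLS estimator cannot be meaningfully computed:
<syntaxhighlight lang="python">
import numpy as np

# Illustrative data: x2 is an exact linear function of x1,
# so the model exhibits perfect multicollinearity.
rng = np.random.default_rng(0)
n = 50
x1 = rng.normal(size=n)
x2 = 3.0 + 2.0 * x1                        # exact linear dependence on x1
y = 1.0 + 0.5 * x1 + rng.normal(size=n)

X = np.column_stack([np.ones(n), x1, x2])  # design matrix with intercept

# (X'X)^{-1} does not exist: the inversion either raises LinAlgError or,
# because of floating-point rounding, silently returns meaningless entries.
try:
    beta_hat = np.linalg.inv(X.T @ X) @ X.T @ y
    print("computed anyway (numerically worthless):", beta_hat)
except np.linalg.LinAlgError:
    print("X'X is singular: the OLS estimator cannot be computed")
</syntaxhighlight>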
Note that in statements of the assumptions underlying regression analyses such as ordinary least squares, the phrase "no multicollinearity" is sometimes used to mean the absence of perfect multicollinearity, which is an exact (non-stochastic) linear relation among the regressors.
==Definition==

Collinearity is a linear association between ''two'' explanatory variables. Two variables are perfectly collinear if there is an exact linear relationship between them. For example, X_{1} and X_{2} are perfectly collinear if there exist parameters \lambda_0 and \lambda_1 such that, for all observations ''i'', we have
: X_{2i} = \lambda_0 + \lambda_1 X_{1i}.
Multicollinearity refers to a situation in which two or more explanatory variables in a multiple regression model are highly linearly related. We have perfect multicollinearity if, as in the equation above, the correlation between two independent variables is equal to 1 or -1. In practice, we rarely face perfect multicollinearity in a data set. More commonly, the issue of multicollinearity arises when there is an approximate linear relationship among two or more independent variables.
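As a quick numerical check (the numbers are invented for illustration), an exact linear relation between two variables forces their sample correlation to exactly 1 or -1:
<syntaxhighlight lang="python">
import numpy as np

x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
x2_pos = 4.0 + 2.0 * x1    # lambda_1 > 0  ->  correlation +1
x2_neg = 4.0 - 2.0 * x1    # lambda_1 < 0  ->  correlation -1

print(np.corrcoef(x1, x2_pos)[0, 1])   # 1.0 (up to rounding)
print(np.corrcoef(x1, x2_neg)[0, 1])   # -1.0 (up to rounding)
</syntaxhighlight>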
Mathematically, a set of variables is perfectly multicollinear if there exist one or more exact linear relationships among some of the variables. For example, we may have
:
\lambda_0 + \lambda_1 X_{1i} + \lambda_2 X_{2i} + \cdots + \lambda_k X_{ki} = 0

holding for all observations ''i'', where \lambda_j are constants and X_{ji} is the ''i''th observation on the ''j''th explanatory variable. We can explore one issue caused by multicollinearity by examining the process of attempting to obtain estimates for the parameters of the multiple regression equation
: Y_i = \beta_0 + \beta_1 X_{1i} + \cdots + \beta_k X_{ki} + \varepsilon_i.
The ordinary least squares estimates involve inverting the matrix
: X^{T} X,
where
: X = \begin{bmatrix}
1 & X_{11} & \cdots & X_{k1} \\
\vdots & \vdots & & \vdots \\
1 & X_{1N} & \cdots & X_{kN}
\end{bmatrix}
is the ''N'' × (''k''+1) design matrix, ''N'' being the number of observations.
If there is an exact linear relationship (perfect multicollinearity) among the independent variables, the rank of X (and therefore of X^{T}X) is less than ''k''+1, and the matrix X^{T}X will not be invertible.
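A quick numerical check of this rank argument, using illustrative numbers with ''k'' = 2 regressors and ''N'' = 6 observations:
<syntaxhighlight lang="python">
import numpy as np

x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
x2 = 5.0 - x1                            # exact relation: X2 = 5 - X1
X = np.column_stack([np.ones(6), x1, x2])

print(np.linalg.matrix_rank(X))          # 2, less than k + 1 = 3
print(np.linalg.matrix_rank(X.T @ X))    # also 2: X'X is not invertible
</syntaxhighlight>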
Perfect multicollinearity is fairly common when working with raw datasets, which frequently contain redundant information. Once redundancies are identified and removed, however, nearly multicollinear variables often remain due to correlations inherent in the system being studied. In such a case, instead of the above equation holding, we have that equation in modified form with an error term v_i:
:
\lambda_0 + \lambda_1 X_{1i} + \lambda_2 X_{2i} + \cdots + \lambda_k X_{ki} + v_i = 0.

In this case, there is no exact linear relationship among the variables, but the X_j variables are nearly perfectly multicollinear if the variance of v_i is small for some set of values of the \lambda's. Then the matrix X^{T}X has an inverse but is ill-conditioned: a given computer algorithm may or may not be able to compute an approximate inverse, and if it does, the resulting computed inverse may be highly sensitive to slight variations in the data (because of magnified effects of rounding error) and so may be very inaccurate.
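The following sketch (again with invented data) illustrates this sensitivity: a small error term v_i leaves X^{T}X technically invertible, but its condition number is enormous and the computed coefficients swing wildly under tiny perturbations of the data:
<syntaxhighlight lang="python">
import numpy as np

# Illustrative data: the exact linear relation between x1 and x2 is broken
# only by a small error term v_i, so X'X is invertible but ill-conditioned.
rng = np.random.default_rng(1)
n = 100
x1 = rng.normal(size=n)
v = rng.normal(scale=1e-6, size=n)                 # the small v_i
x2 = 3.0 + 2.0 * x1 + v
y = 1.0 + 0.5 * x1 + 0.5 * x2 + rng.normal(scale=0.01, size=n)

X = np.column_stack([np.ones(n), x1, x2])
XtX = X.T @ X
print(np.linalg.cond(XtX))                         # enormous condition number

# The inverse exists, but the computed coefficients are extremely sensitive:
# refitting after a tiny perturbation of y gives very different estimates,
# even though both fits describe the data almost equally well.
beta_a = np.linalg.solve(XtX, X.T @ y)
beta_b = np.linalg.solve(XtX, X.T @ (y + rng.normal(scale=0.01, size=n)))
print(beta_a)   # individual slopes land far from the true (0.5, 0.5)
print(beta_b)   # and swing wildly between the two fits
</syntaxhighlight>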

Excerpt source: the free encyclopedia Wikipedia.
Read the full text of the "Multicollinearity" article on Wikipedia.


