In statistics, M-estimators are a broad class of estimators that are obtained as the minima of sums of functions of the data. Least-squares estimators are M-estimators. The definition of M-estimators was motivated by robust statistics, which contributed new types of M-estimators. The statistical procedure of evaluating an M-estimator on a data set is called M-estimation. More generally, an M-estimator may be defined to be a zero of an estimating function.〔V. P. Godambe, editor. ''Estimating Functions'', volume 7 of Oxford Statistical Science Series. The Clarendon Press, Oxford University Press, New York, 1991.〕〔Christopher C. Heyde. ''Quasi-Likelihood and Its Application: A General Approach to Optimal Parameter Estimation''. Springer Series in Statistics. Springer-Verlag, New York, 1997.〕〔D. L. McLeish and Christopher G. Small. ''The Theory and Applications of Statistical Inference Functions'', volume 44 of Lecture Notes in Statistics. Springer-Verlag, New York, 1988.〕〔Parimal Mukhopadhyay. ''An Introduction to Estimating Functions''. Alpha Science International, Ltd, 2004.〕〔Christopher G. Small and Jinfang Wang. ''Numerical Methods for Nonlinear Estimating Equations'', volume 29 of Oxford Statistical Science Series. The Clarendon Press, Oxford University Press, New York, 2003.〕〔Sara A. van de Geer. ''Empirical Processes in M-Estimation: Applications of Empirical Process Theory'', volume 6 of Cambridge Series in Statistical and Probabilistic Mathematics. Cambridge University Press, Cambridge, 2000.〕 This estimating function is often the derivative of another statistical function: for example, a maximum-likelihood estimate is often defined to be a zero of the derivative of the likelihood function with respect to the parameter, so a maximum-likelihood estimator is often a critical point of the score function. In many applications, such M-estimators can be thought of as estimating characteristics of the population.

==Historical motivation==

The method of least squares is a prototypical M-estimator, since the estimator is defined as a minimum of the sum of squares of the residuals.

Another popular M-estimator is maximum-likelihood estimation. For a family of probability density functions ''f'' parameterized by ''θ'', a maximum-likelihood estimator of ''θ'' is computed for each set of data by maximizing the likelihood function over the parameter space Θ. When the observations ''x''<sub>1</sub>, …, ''x''<sub>''n''</sub> are independent and identically distributed, an ML-estimate satisfies

:<math>\hat{\theta} = \arg\max_{\theta} \left( \prod_{i=1}^{n} f(x_i, \theta) \right)</math>

or, equivalently,

:<math>\hat{\theta} = \arg\min_{\theta} \left( -\sum_{i=1}^{n} \log f(x_i, \theta) \right).</math>

Maximum-likelihood estimators are often inefficient and biased for finite samples. For many regular problems, maximum-likelihood estimation performs well for "large samples", being an approximation of a posterior mode. If the problem is "regular", then any bias of the MLE (or posterior mode) decreases to zero as the sample size increases to infinity. The performance of maximum-likelihood (and posterior-mode) estimators degrades when the parametric family is mis-specified.
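The minimization above can be carried out numerically. The following is a minimal sketch, assuming NumPy and SciPy are available, that computes the ML-estimate of the mean and standard deviation of a normal model by minimizing the negative log-likelihood <math>-\sum_{i=1}^{n} \log f(x_i, \theta)</math> directly; the simulated data and starting values are purely illustrative.

<syntaxhighlight lang="python">
import numpy as np
from scipy.optimize import minimize

# Illustrative data (assumed setup): 500 draws from a normal distribution.
rng = np.random.default_rng(0)
x = rng.normal(loc=3.0, scale=2.0, size=500)

def neg_log_likelihood(params, data):
    """-sum_i log f(x_i; mu, sigma) for the normal density."""
    mu, log_sigma = params            # optimize log(sigma) so that sigma > 0
    sigma = np.exp(log_sigma)
    return np.sum(0.5 * np.log(2 * np.pi) + log_sigma
                  + 0.5 * ((data - mu) / sigma) ** 2)

# The M-estimate is the minimizer of this sum of functions of the data.
result = minimize(neg_log_likelihood, x0=[0.0, 0.0], args=(x,))
mu_hat, sigma_hat = result.x[0], np.exp(result.x[1])
print(mu_hat, sigma_hat)   # close to the sample mean and the MLE of sigma
</syntaxhighlight>

For the normal model this minimizer coincides with the sample mean, which is also the least-squares estimate of location; the robust M-estimators mentioned above replace the squared term with a slower-growing function of the residuals.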