Energy distance

Energy distance is a statistical distance between probability distributions. If X and Y are independent random vectors in \mathbb{R}^d with cumulative distribution functions F and G respectively, then the energy distance between the distributions F and G is defined as the square root of
: D^2(F, G) = 2\mathbb{E}\|X - Y\| - \mathbb{E}\|X - X'\| - \mathbb{E}\|Y - Y'\| \geq 0,
where X, X' are independent and identically distributed (iid) with distribution F, Y, Y' are iid with distribution G, \mathbb{E} denotes expected value, and \| \cdot \| denotes the Euclidean norm. Energy distance satisfies all axioms of a metric, and it therefore characterizes equality of distributions: D(F, G) = 0 if and only if F = G.
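As an illustrative sketch (not from the source; the function name is my own), the squared energy distance can be estimated from two samples by replacing each expectation with the mean over all pairs of points:

```python
import numpy as np

def energy_distance_sq(x, y):
    """Plug-in estimate of D^2(F, G) = 2E||X-Y|| - E||X-X'|| - E||Y-Y'||
    from samples x ~ F and y ~ G, given as arrays of shape (n, d) and (m, d)."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    if x.ndim == 1:
        x = x[:, None]
    if y.ndim == 1:
        y = y[:, None]
    # Mean pairwise Euclidean distances between and within the two samples.
    exy = np.linalg.norm(x[:, None, :] - y[None, :, :], axis=-1).mean()
    exx = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1).mean()
    eyy = np.linalg.norm(y[:, None, :] - y[None, :, :], axis=-1).mean()
    return 2 * exy - exx - eyy
```

For identical samples the estimate is exactly zero, matching D(F, G) = 0 when F = G; for samples from different distributions it is positive.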
Energy distance for statistical applications was introduced in 1985 by Gábor J. Székely, who proved that for real-valued random variables this distance is exactly twice Harald Cramér's distance:〔Cramér, H. (1928) On the composition of elementary errors, Skandinavisk Aktuarietidskrift, 11, 141–180.〕
: \int_{-\infty}^\infty (F(x) - G(x))^2 \, dx .
For a simple proof of this equivalence, see Székely and Rizzo (2005). In higher dimensions, however, the two distances differ, because the energy distance is rotation invariant while Cramér's distance is not. (Note that Cramér's distance is not the same as the distribution-free Cramér–von Mises criterion.)
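The one-dimensional identity D^2(F, G) = 2 \int (F - G)^2 dx holds exactly for empirical distributions as well, which makes it easy to check numerically. The following sketch (my own code; function names are illustrative) computes both sides from the same samples:

```python
import numpy as np

def energy_distance_sq_1d(x, y):
    """2E|X-Y| - E|X-X'| - E|Y-Y'| for the empirical distributions of x and y."""
    exy = np.abs(x[:, None] - y[None, :]).mean()
    exx = np.abs(x[:, None] - x[None, :]).mean()
    eyy = np.abs(y[:, None] - y[None, :]).mean()
    return 2 * exy - exx - eyy

def cramer_sq_1d(x, y):
    """Integral of (F_n(t) - G_n(t))^2 dt for the empirical CDFs of x and y.
    Both CDFs are step functions, so the integrand is piecewise constant
    between consecutive points of the pooled sorted sample."""
    pts = np.sort(np.concatenate([x, y]))
    F = np.searchsorted(np.sort(x), pts, side="right") / len(x)
    G = np.searchsorted(np.sort(y), pts, side="right") / len(y)
    widths = np.diff(pts)
    return np.sum((F[:-1] - G[:-1]) ** 2 * widths)
```

For two point masses at 0 and 1, for instance, the energy distance squared is 2|0 - 1| = 2 and Cramér's distance is 1, consistent with the factor of two.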
==Generalization to metric spaces==

One can generalize the notion of energy distance to probability distributions on metric spaces. Let (M, d) be a metric space with its Borel sigma algebra \mathcal{B}(M). Let \mathcal{P}(M) denote the collection of all probability measures on the measurable space (M, \mathcal{B}(M)). If μ and ν are probability measures in \mathcal{P}(M), then the energy distance D of μ and ν can be defined as the square root of
: D^2(\mu, \nu) = 2 \mathbb{E}[d(X, Y)] - \mathbb{E}[d(X, X')] - \mathbb{E}[d(Y, Y')],
where X, X' are iid with distribution μ, Y, Y' are iid with distribution ν, and X is independent of Y.
This quantity is not necessarily non-negative, however. D^2 is non-negative for all μ and ν if and only if d is a negative definite kernel; this condition is expressed by saying that (M, d) has negative type. Negative type is not sufficient for D to be a metric: D is a metric if and only if d is a strongly negative definite kernel, which is expressed by saying that (M, d) has strong negative type.〔Klebanov, L. B. (2005) N-distances and their Applications, Karolinum Press, Charles University, Prague.〕 In that case, the energy distance is zero if and only if X and Y are identically distributed. An example of a metric space of negative type but not of strong negative type is the plane with the taxicab metric. All Euclidean spaces, and even separable Hilbert spaces, have strong negative type.
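The generalized definition needs nothing beyond pairwise evaluations of the metric d, so it can be sketched for an arbitrary user-supplied metric. The code below (my own illustration; names are not from the source) estimates D^2 from samples, using the taxicab metric mentioned above as the example:

```python
import numpy as np

def generalized_energy_sq(x, y, metric):
    """Estimate 2E[d(X,Y)] - E[d(X,X')] - E[d(Y,Y')] from samples x, y
    (iterables of points) for a user-supplied metric on the sample space."""
    def mean_pairwise(a, b):
        return float(np.mean([metric(p, q) for p in a for q in b]))
    return 2 * mean_pairwise(x, y) - mean_pairwise(x, x) - mean_pairwise(y, y)

def taxicab(p, q):
    # L1 ("taxicab") metric: the plane with this metric has negative type,
    # so D^2 is non-negative, but not strong negative type, so D is not a metric.
    return float(np.abs(np.asarray(p) - np.asarray(q)).sum())
```

Because the plane with the taxicab metric has negative type, the estimate stays non-negative, and it is exactly zero when the two samples coincide.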
In the literature on kernel methods for machine learning, these generalized notions of energy distance are studied under the name of maximum mean discrepancy.

Excerpt source: the free encyclopedia Wikipedia; see the full article "Energy distance" on Wikipedia.