In probability theory, an ''ƒ''-divergence is a function ''D''<sub>''f''</sub>(''P'' || ''Q'') that measures the difference between two probability distributions ''P'' and ''Q''. It helps the intuition to think of the divergence as an average, weighted by the function ''f'', of the odds ratio given by ''P'' and ''Q''. These divergences were introduced and studied independently by Csiszár, by Morimoto, and by Ali and Silvey, and are sometimes known as Csiszár ''ƒ''-divergences, Csiszár–Morimoto divergences or Ali–Silvey distances.

== Definition ==

Let ''P'' and ''Q'' be two probability distributions over a space Ω such that ''P'' is absolutely continuous with respect to ''Q''. Then, for a convex function ''f'' such that ''f''(1) = 0, the ''f''-divergence of ''Q'' from ''P'' is defined as

: <math>D_f(P \parallel Q) \equiv \int_{\Omega} f\!\left(\frac{dP}{dQ}\right)\, dQ.</math>

If ''P'' and ''Q'' are both absolutely continuous with respect to a reference distribution ''μ'' on Ω, then their probability densities ''p'' and ''q'' satisfy ''dP'' = ''p dμ'' and ''dQ'' = ''q dμ''. In this case the ''f''-divergence can be written as

: <math>D_f(P \parallel Q) = \int_{\Omega} f\!\left(\frac{p(x)}{q(x)}\right) q(x)\, d\mu(x).</math>

The ''f''-divergences can be expressed using Taylor series and rewritten using a weighted sum of chi-type distances.
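For distributions on a finite set, the integral above reduces to a weighted sum, which the following minimal sketch illustrates (an illustrative example, not taken from the article; the function names are placeholders). Choosing ''f''(''t'') = ''t'' log ''t'' recovers the Kullback–Leibler divergence.

<syntaxhighlight lang="python">
import numpy as np

def f_divergence(p, q, f):
    """Discrete f-divergence: D_f(P || Q) = sum_x q(x) * f(p(x) / q(x)).

    Assumes q(x) > 0 wherever p(x) > 0 (absolute continuity) and that
    f is convex with f(1) = 0.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return float(np.sum(q * f(p / q)))

# Generator f(t) = t * log(t), which yields the Kullback-Leibler divergence.
def kl_generator(t):
    t = np.asarray(t, dtype=float)
    # Guard the logarithm so that the convention 0 * log(0) = 0 is used.
    return np.where(t > 0, t * np.log(np.where(t > 0, t, 1.0)), 0.0)

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(f_divergence(p, q, kl_generator))  # about 0.0253
</syntaxhighlight>

Other convex generators ''f'' with ''f''(1) = 0 can be passed to the same routine to obtain the corresponding divergences.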