In information theory, the Rényi entropy generalizes the Hartley entropy, the Shannon entropy, the collision entropy, and the min-entropy. Entropies quantify the diversity, uncertainty, or randomness of a system. The Rényi entropy is named after Alfréd Rényi. It is important in ecology and statistics as an index of diversity, and in quantum information as a measure of entanglement. In the Heisenberg XY spin chain model, the Rényi entropy as a function of ''α'' can be calculated explicitly because it is an automorphic function with respect to a particular subgroup of the modular group. In theoretical computer science, the min-entropy is used in the context of randomness extractors.

== Definition ==

The Rényi entropy of order <math>\alpha</math>, where <math>\alpha \geq 0</math> and <math>\alpha \neq 1</math>, is defined as

: <math>H_\alpha(X) = \frac{1}{1-\alpha} \log \left( \sum_{i=1}^{n} p_i^\alpha \right).</math>

Here, <math>X</math> is a discrete random variable with possible outcomes <math>1, 2, \dots, n</math> and corresponding probabilities <math>p_i = \Pr(X = i)</math> for <math>i = 1, \dots, n</math>, and the logarithm is base 2. If the probabilities are <math>p_i = 1/n</math> for all <math>i</math>, then all the Rényi entropies of the distribution are equal: <math>H_\alpha(X) = \log n</math>. In general, for all discrete random variables <math>X</math>, <math>H_\alpha(X)</math> is a non-increasing function of <math>\alpha</math>.

Applications often exploit the following relation between the Rényi entropy and the ''p''-norm of the vector of probabilities:

: <math>H_\alpha(X) = \frac{\alpha}{1-\alpha} \log \left( \| P \|_\alpha \right).</math>

Here, the discrete probability distribution <math>P = (p_1, \dots, p_n)</math> is interpreted as a vector in <math>\mathbb{R}^n</math> with <math>p_i \geq 0</math> and <math>\textstyle\sum_{i=1}^{n} p_i = 1</math>. The Rényi entropy for any <math>\alpha \geq 0</math> is Schur concave.
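The definition above can be sketched numerically. The following is a minimal illustration (the function name `renyi_entropy` is ours, not from any standard library); it handles the <math>\alpha = 1</math> limit by falling back to the Shannon entropy, which is the limiting value of <math>H_\alpha</math> as <math>\alpha \to 1</math>:

```python
import math

def renyi_entropy(probs, alpha):
    """Rényi entropy of order alpha, with base-2 logarithm."""
    if alpha < 0:
        raise ValueError("alpha must be non-negative")
    if alpha == 1:
        # Limiting case: Shannon entropy (terms with p = 0 contribute nothing).
        return -sum(p * math.log2(p) for p in probs if p > 0)
    return math.log2(sum(p ** alpha for p in probs)) / (1 - alpha)

# For the uniform distribution over n outcomes, every order gives log2(n):
uniform = [0.25] * 4
print(renyi_entropy(uniform, 0.5))   # 2.0
print(renyi_entropy(uniform, 2.0))   # 2.0
```

For a non-uniform distribution the values differ by order, and one can check numerically that <math>H_\alpha</math> is non-increasing in <math>\alpha</math>, as stated above.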