In probability theory and statistics, the '''Bernoulli distribution''', named after Swiss scientist Jacob Bernoulli,〔James Victor Uspensky: ''Introduction to Mathematical Probability'', McGraw-Hill, New York 1937, page 45〕 is the probability distribution of a random variable which takes the value 1 with success probability <math>p</math> and the value 0 with failure probability <math>q = 1 - p</math>. It can be used to represent a coin toss, where 1 and 0 would represent "heads" and "tails" (or vice versa), respectively. In particular, unfair coins would have <math>p \neq 1/2</math>. The Bernoulli distribution is a special case of the two-point distribution, for which the two possible outcomes need not be 0 and 1.

==Properties==

If <math>X</math> is a random variable with this distribution, we have:
: <math>\Pr(X=1) = p = 1 - \Pr(X=0) = 1 - q.</math>
The probability mass function <math>f</math> of this distribution, over possible outcomes ''k'', is
: <math>f(k;p) = \begin{cases} p & \text{if } k = 1, \\ 1 - p & \text{if } k = 0. \end{cases}</math>
This can also be expressed as
: <math>f(k;p) = p^k (1-p)^{1-k} \quad \text{for } k \in \{0, 1\}.</math>
The Bernoulli distribution is a special case of the binomial distribution with <math>n = 1</math>.〔McCullagh and Nelder (1989), Section 4.2.2.〕

The kurtosis goes to infinity for high and low values of <math>p</math>, but for <math>p = 1/2</math> the two-point distributions, including the Bernoulli distribution, have a lower excess kurtosis than any other probability distribution, namely −2.

The Bernoulli distributions for <math>0 \le p \le 1</math> form an exponential family.

The maximum likelihood estimator of <math>p</math> based on a random sample is the sample mean.
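The quantities above can be illustrated with a short sketch (not part of the article; the function names are our own). It computes the probability mass function in its closed form <math>p^k (1-p)^{1-k}</math>, the mean and variance, the excess kurtosis <math>(1 - 6pq)/(pq)</math> (which equals −2 at <math>p = 1/2</math>), and the maximum likelihood estimate of <math>p</math> as the sample mean:

```python
import random

def bernoulli_pmf(k, p):
    """Probability mass function f(k; p) = p^k * (1 - p)^(1 - k), k in {0, 1}."""
    if k not in (0, 1):
        raise ValueError("k must be 0 or 1")
    return p ** k * (1 - p) ** (1 - k)

def bernoulli_mean_var(p):
    """Mean p and variance p * (1 - p) of a Bernoulli(p) variable."""
    return p, p * (1 - p)

def excess_kurtosis(p):
    """Excess kurtosis (1 - 6*p*q) / (p*q); diverges as p -> 0 or 1, is -2 at p = 1/2."""
    q = 1 - p
    return (1 - 6 * p * q) / (p * q)

def mle_p(sample):
    """Maximum likelihood estimate of p from 0/1 outcomes: the sample mean."""
    return sum(sample) / len(sample)

if __name__ == "__main__":
    random.seed(0)
    p = 0.3
    sample = [1 if random.random() < p else 0 for _ in range(10_000)]
    print(bernoulli_pmf(1, p))      # equals p
    print(bernoulli_mean_var(p))    # (p, p*(1-p))
    print(excess_kurtosis(0.5))     # -2.0, the minimum over all distributions
    print(mle_p(sample))            # close to 0.3 for a large sample
```

The estimate returned by `mle_p` approaches <math>p</math> as the sample grows, consistent with the sample mean being the maximum likelihood estimator.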