Randomness is the lack of pattern or predictability in events.〔The ''Oxford English Dictionary'' defines "random" as "Having no definite aim or purpose; not sent or guided in a particular direction; made, done, occurring, etc., without method or conscious choice; haphazard."〕 A random sequence of events, symbols or steps has no order and does not follow an intelligible pattern or combination. Individual random events are by definition unpredictable, but in many cases the frequency of different outcomes over a large number of events (or "trials") is predictable. For example, when throwing two dice, the outcome of any particular roll is unpredictable, but a sum of 7 will occur twice as often as a sum of 4. In this view, randomness is a measure of uncertainty of an outcome rather than of haphazardness, and applies to the concepts of chance, probability, and information entropy.

The fields of mathematics, probability, and statistics use formal definitions of randomness. In statistics, a random variable is an assignment of a numerical value to each possible outcome of an event space. This association facilitates the identification and calculation of the probabilities of events. Random variables can appear in random sequences. A random process is a sequence of random variables whose outcomes do not follow a deterministic pattern but instead evolve according to probability distributions. These and other constructs are extremely useful in probability theory and the various applications of randomness.

Randomness is most often used in statistics to signify well-defined statistical properties. Monte Carlo methods, which rely on random input (such as from random number generators or pseudorandom number generators), are important techniques in science, for instance in computational science.〔(Third Workshop on Monte Carlo Methods), Jun Liu, Professor of Statistics, Harvard University〕 By analogy, quasi-Monte Carlo methods use quasirandom number generators.
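The dice claim above can be checked with a small Monte Carlo simulation. The following is a minimal Python sketch (the function name, seed, and trial count of 100,000 are illustrative choices, not taken from the text): since 6 of the 36 equally likely outcomes sum to 7 and only 3 sum to 4, the simulated ratio of counts should hover near 2.

```python
import random

def dice_sum_frequencies(trials=100_000, seed=0):
    """Roll two fair dice `trials` times and count how often each sum occurs."""
    rng = random.Random(seed)  # seeded for reproducibility
    counts = {s: 0 for s in range(2, 13)}
    for _ in range(trials):
        counts[rng.randint(1, 6) + rng.randint(1, 6)] += 1
    return counts

counts = dice_sum_frequencies()
# A sum of 7 (6 of 36 outcomes) should appear roughly twice as often
# as a sum of 4 (3 of 36 outcomes), so this ratio should be close to 2.
print(counts[7] / counts[4])
```

This illustrates the point made above: each individual roll is unpredictable, yet the long-run frequencies converge to predictable proportions.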
Random selection is a method of selecting items (often called units) from a population where the probability of choosing a specific item is the proportion of those items in the population. For example, with a bowl containing just 10 red marbles and 90 blue marbles, a random selection mechanism would choose a red marble with probability 1/10. Note that a random selection mechanism that selected 10 marbles from this bowl would not necessarily produce 1 red and 9 blue. When a population consists of distinguishable items, a random selection mechanism requires equal probabilities for any item to be chosen. That is, if the selection process gives each member of a population, say of research subjects, the same probability of being chosen, then the selection process is random.

== History ==

Ancient cultures used various methods of divination to attempt to circumvent randomness and fate.〔''Handbook to life in ancient Rome'' by Lesley Adkins 1998 ISBN 0-19-512332-8 page 279〕〔''Religions of the ancient world'' by Sarah Iles Johnston 2004 ISBN 0-674-01517-7 page 370〕 The Chinese were perhaps the earliest people to formalize odds and chance 3,000 years ago. The Greek philosophers discussed randomness at length, but only in non-quantitative forms. It was only in the 16th century that Italian mathematicians began to formalize the odds associated with various games of chance. The invention of calculus had a positive impact on the formal study of randomness. In the 1888 edition of his book ''The Logic of Chance'', John Venn wrote a chapter on ''The conception of randomness'' that included his view of the randomness of the digits of the number pi, using them to construct a random walk in two dimensions.〔''Annotated readings in the history of statistics'' by Herbert Aron David, 2001 ISBN 0-387-98844-0 page 115. Note that the 1866 edition of Venn's book (on Google Books) does not include this chapter.〕

The early part of the 20th century saw a rapid growth in the formal analysis of randomness, as various approaches to the mathematical foundations of probability were introduced. In the mid-to-late 20th century, ideas of algorithmic information theory introduced new dimensions to the field via the concept of algorithmic randomness. Although randomness had for centuries often been viewed as an obstacle and a nuisance, in the 20th century computer scientists began to realize that the ''deliberate'' introduction of randomness into computations can be an effective tool for designing better algorithms. In some cases, such randomized algorithms outperform the best deterministic methods.
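A classic illustration of deliberately introducing randomness into a computation is quicksort with a randomly chosen pivot. The sketch below is an illustrative example, not a method described in the text: choosing the pivot uniformly at random gives an expected O(n log n) running time on every input, whereas any fixed pivot rule admits some input ordering that forces quadratic time.

```python
import random

def randomized_quicksort(items, rng=None):
    """Sort a list using quicksort with a uniformly random pivot.

    The random pivot choice means no single "adversarial" input can
    reliably trigger the quadratic worst case.
    """
    if rng is None:
        rng = random.Random()
    if len(items) <= 1:
        return list(items)
    pivot = items[rng.randrange(len(items))]  # pivot chosen at random
    less = [x for x in items if x < pivot]
    equal = [x for x in items if x == pivot]
    greater = [x for x in items if x > pivot]
    return randomized_quicksort(less, rng) + equal + randomized_quicksort(greater, rng)

print(randomized_quicksort([3, 1, 4, 1, 5, 9, 2, 6]))
# → [1, 1, 2, 3, 4, 5, 6, 9]
```

The output is the same regardless of which pivots the random number generator happens to pick; randomness here affects only the running time, not the correctness of the result.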