Method of conditional probabilities
In mathematics and computer science, the probabilistic method is used to prove the existence of mathematical objects with desired combinatorial properties. The proofs are probabilistic — they work by showing that a random object, chosen from some probability distribution, has the desired properties with positive probability. Consequently, they are nonconstructive — they don't explicitly describe an efficient method for computing the desired objects.
The method of conditional probabilities converts such a proof, in a "very precise sense", into an efficient deterministic algorithm, one that is guaranteed to compute an object with the desired properties. That is, the method derandomizes the proof. The basic idea is to replace each random choice in a random experiment by a deterministic choice, so as to keep the conditional probability of failure, given the choices so far, below 1.
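In symbols (the notation here is added only for concreteness), if the experiment makes random choices <math>x_1, x_2, \dots, x_n</math> in sequence, then each <math>x_{t+1}</math> is fixed deterministically to a value satisfying
: <math>\Pr[\text{failure} \mid x_1,\dots,x_t,x_{t+1}] \;\le\; \Pr[\text{failure} \mid x_1,\dots,x_t] \;<\; 1.</math>
Such a value always exists, because the conditional probability of failure given the first <math>t</math> choices is an average of the conditional probabilities of failure over the possible values of <math>x_{t+1}</math>.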
The method is particularly relevant in the context of randomized rounding (which uses the probabilistic method to design approximation algorithms).
When applying the method of conditional probabilities, the technical term pessimistic estimator refers to a quantity used in place of the true conditional probability (or conditional expectation) underlying the proof.
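One common formalization (spelled out here for concreteness; it is not stated in this excerpt) asks that a pessimistic estimator <math>U_t(x_1,\dots,x_t)</math> be efficiently computable, upper-bound the true conditional probability of failure, start below 1, and admit some choice of the next step that does not increase it:
: <math>U_t \;\ge\; \Pr[\text{failure} \mid x_1,\dots,x_t], \qquad U_0 < 1, \qquad \min_{x_{t+1}} U_{t+1} \;\le\; U_t.</math>
Keeping <math>U_t</math> from increasing then keeps the conditional probability of failure below 1 throughout the walk, just as in the argument of the overview below.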
== Overview ==
Raghavan gives this description:
: ''We first show the existence of a provably good approximate solution using the probabilistic method... (then) show that the probabilistic existence proof can be converted, in a very precise sense, into a deterministic approximation algorithm.''
(Raghavan is discussing the method in the context of randomized rounding, but it works with the probabilistic method in general.)
To apply the method to a probabilistic proof, the randomly chosen object in the proof must be choosable by a random experiment that consists of a sequence of "small" random choices.
Here is a trivial example to illustrate the principle.
: Lemma: ''It is possible to flip three coins so that the number of tails is at least 2.''
: ''Probabilistic proof.'' If the three coins are flipped randomly, the expected number of tails is 1.5. Thus, there must be some outcome (way of flipping the coins) so that the number of tails is at least 1.5. Since the number of tails is an integer, in such an outcome there are at least 2 tails. ''QED''
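The calculation behind this proof can be checked directly by enumerating all eight equally likely outcomes; the following minimal sketch (Python, with tails encoded as 1 and heads as 0, added here for illustration) confirms that the average number of tails is 1.5 and that at least one outcome has two or more tails.
<syntaxhighlight lang="python">
from itertools import product

# All 2**3 = 8 equally likely outcomes of flipping three fair coins.
# Encode tails as 1 and heads as 0.
outcomes = list(product([0, 1], repeat=3))

# Expected number of tails: the average over all outcomes.
expected_tails = sum(sum(o) for o in outcomes) / len(outcomes)
print(expected_tails)                       # 1.5

# Since the average is 1.5, some outcome must reach at least 2 tails.
print(any(sum(o) >= 2 for o in outcomes))   # True
</syntaxhighlight>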
In this example the random experiment consists of flipping three fair coins. The experiment is illustrated by the rooted tree in the diagram to the right. There are eight outcomes, each corresponding to a leaf in the tree. A trial of the random experiment corresponds to taking a random walk from the root (the top node in the tree, where no coins have been flipped) to a leaf. The successful outcomes are those in which at least two coins came up tails. The interior nodes in the tree correspond to partially determined outcomes, where only 0, 1, or 2 of the coins have been flipped so far.
To apply the method of conditional probabilities, one focuses on the ''conditional probability of failure, given the choices so far'' as the experiment proceeds step by step.
In the diagram, each node is labeled with this conditional probability. (For example, if only the first coin has been flipped, and it comes up tails, that corresponds to the second child of the root. Conditioned on that partial state, the probability of failure is 0.25.)
The method of conditional probabilities replaces the random root-to-leaf walk in the random experiment by a deterministic root-to-leaf walk, where each step is chosen to inductively maintain the following invariant:
:: ''the conditional probability of failure, given the current state, is less than 1.''
In this way, it is guaranteed to arrive at a leaf with label 0, that is, a successful outcome.
The invariant holds initially (at the root), because the original proof showed that the (unconditioned) probability of failure is less than 1. The conditional probability at any interior node is the average of the conditional probabilities of its children. The latter property is important because it implies that ''any interior node whose conditional probability is less than 1 has at least one child whose conditional probability is less than 1.'' Thus, from any interior node, one can always choose some child to walk to so as to maintain the invariant. Since the invariant holds at the end, when the walk arrives at a leaf and all choices have been determined, the outcome reached in this way must be a successful one.
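For the three-coin example this walk can be carried out explicitly. The sketch below (illustrative Python; the function names are ad hoc) computes the exact conditional probability of failure, that is, of ending with fewer than two tails, by enumerating the coins not yet flipped, and at each step fixes the next coin to a value that minimizes it.
<syntaxhighlight lang="python">
from itertools import product

NUM_COINS = 3
NEED_TAILS = 2   # success: at least this many tails (tails = 1, heads = 0)

def conditional_failure_probability(fixed):
    """Exact probability of ending with fewer than NEED_TAILS tails,
    given the flips fixed so far, by enumerating the remaining flips."""
    remaining = NUM_COINS - len(fixed)
    failures = sum(1 for rest in product([0, 1], repeat=remaining)
                   if sum(fixed) + sum(rest) < NEED_TAILS)
    return failures / 2 ** remaining

def derandomized_flips():
    """Deterministic root-to-leaf walk: fix each flip to a value that keeps
    the conditional probability of failure below 1 (here, a minimizing value)."""
    fixed = []
    for _ in range(NUM_COINS):
        fixed.append(min((0, 1),
                         key=lambda v: conditional_failure_probability(fixed + [v])))
    return fixed

flips = derandomized_flips()
print(flips, sum(flips) >= NEED_TAILS)   # [1, 1, 0] True
</syntaxhighlight>
In larger applications the exact conditional probability is typically too expensive to compute in this way, which is where a pessimistic estimator or a conditional expectation is substituted for it.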

Excerpt source: Wikipedia, the free encyclopedia.
Read the full article on "Method of conditional probabilities" at Wikipedia.


