Bayes’s theorem

Bayes’s theorem, any of several relationships between prior and posterior probabilities or odds, especially (1)–(3) below. All of these depend upon the basic relationship (0) between contemporaneous conditional and unconditional probabilities. Non-Bayesians think these useful only in narrow ranges of cases, generally because of skepticism about the accessibility or significance of priors. According to (1), the posterior probability is the prior probability times the ‘relevance quotient’ (Carnap’s term). According to (2), the posterior odds are the prior odds times the ‘likelihood ratio’ (R. A. Fisher’s term). Relationship (3) comes from (1) by expanding P(data) via the law of total probability.
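The entry does not reproduce the formulas themselves; a standard reconstruction, writing H for the hypothesis and D for the data (notation assumed here, not the entry’s own), is:

\[
\begin{aligned}
(0)\quad & P(H \mid D) = \frac{P(H \wedge D)}{P(D)}, \quad \text{provided } P(D) > 0,\\
(1)\quad & P(H \mid D) = P(H)\cdot\frac{P(D \mid H)}{P(D)},\\
(2)\quad & \frac{P(H \mid D)}{P(\lnot H \mid D)} = \frac{P(H)}{P(\lnot H)}\cdot\frac{P(D \mid H)}{P(D \mid \lnot H)},\\
(3)\quad & P(H \mid D) = \frac{P(H)\,P(D \mid H)}{P(H)\,P(D \mid H) + P(\lnot H)\,P(D \mid \lnot H)}.
\end{aligned}
\]

Here P(D|H)/P(D) is the relevance quotient of (1), P(D|H)/P(D|¬H) the likelihood ratio of (2), and the denominator of (3) is P(data) expanded by the law of total probability.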
Bayes’s rule (4) for updating probabilities has you set your new unconditional probabilities equal to your old conditional ones when fresh certainty about the data leaves your probabilities conditional on the data unchanged. The corresponding rule (5) has you do the same for odds. In decision theory the term is used differently, for the rule ‘Choose so as to maximize expectation of utility.’ See also DECISION THEORY, PROBABILITY. R.J.
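In the same assumed notation, the updating rules may be sketched as

\[
(4)\quad P_{\text{new}}(H) = P_{\text{old}}(H \mid D), \qquad
(5)\quad \frac{P_{\text{new}}(H)}{P_{\text{new}}(\lnot H)} = \frac{P_{\text{old}}(H \mid D)}{P_{\text{old}}(\lnot H \mid D)}.
\]

For illustration (numbers chosen here, not from the entry): with prior P(H) = 0.01, P(D|H) = 0.9, and P(D|¬H) = 0.05, relationship (3) gives P(H|D) = 0.009/(0.009 + 0.0495) ≈ 0.154; on becoming certain of D, rule (4) sets the new unconditional probability of H to about 0.154.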
