prior probability



pri·or prob·a·bil·i·ty

the best rational assessment of the probability of an outcome on the basis of established knowledge before the present experiment is performed. For instance, the prior probability that the daughter of a carrier of hemophilia is herself a carrier is 1/2. But if the daughter already has an affected son, the posterior probability that she is a carrier is unity, whereas if she has an unaffected son, the posterior probability that she is a carrier is 1/3. See: Bayes theorem.
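
To make the arithmetic explicit, here is a minimal sketch of that Bayes-theorem update in Python (the function and variable names are hypothetical, and it assumes simple Mendelian inheritance with no new mutations):

    # Bayes update for the hemophilia-carrier example (illustrative sketch).
    # A carrier mother transmits hemophilia to each son with probability 1/2;
    # a non-carrier mother has only unaffected sons.
    def posterior_carrier(prior, n_unaffected_sons):
        like_carrier = 0.5 ** n_unaffected_sons       # P(data | carrier)
        like_noncarrier = 1.0                         # P(data | not a carrier)
        joint_carrier = prior * like_carrier
        return joint_carrier / (joint_carrier + (1 - prior) * like_noncarrier)

    print(posterior_carrier(0.5, 0))   # 0.5, the prior itself
    print(posterior_carrier(0.5, 1))   # 0.333..., the 1/3 quoted above

An affected son, by contrast, is impossible under the non-carrier hypothesis, so the posterior rises to unity regardless of the prior.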

prevalence

Epidemiology
(1) The number of people with a specific condition or attribute at a specified time divided by the total number of people in the population.
(2) The number or proportion of cases, events or conditions in a given population.
 
Statistics
A term defined in the context of a four-cell diagnostic matrix (2 × 2 table) as the number of people with a disease, X, relative to the population.
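
For illustration only (the counts below are invented, not from the dictionary), prevalence in such a 2 × 2 table is the diseased-column total divided by the grand total:

    # Prevalence from a 2 x 2 diagnostic table (hypothetical counts).
    #                  disease present   disease absent
    # test positive          90                 40
    # test negative          10                860
    true_pos, false_pos = 90, 40
    false_neg, true_neg = 10, 860
    total = true_pos + false_pos + false_neg + true_neg
    prevalence = (true_pos + false_neg) / total    # diseased / population
    print(prevalence)                              # 0.1, i.e. 10%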

Veterinary medicine
(1) A clinical estimate of the probability that an animal has a given disease, based on current knowledge (e.g., from the history or physical examination) before diagnostic testing.
(2) As defined in a population, the probability at a specific point in time that an animal randomly selected from a group will have a particular condition, which is equivalent to the proportion of individuals in the group that have the disease. Group prevalence is calculated by dividing the number of individuals in a group that have a disease by the total number of individuals in the group at risk of the disease. Prevalence is a good measure of the amount of chronic, low-mortality disease in a population, but not of the amount of short-duration or high-fatality disease. Prevalence is often established by cross-sectional surveys.
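
The contrast between chronic and short-lived disease can be sketched with the common steady-state approximation prevalence ≈ incidence × mean duration; the numbers below are purely illustrative:

    # Two diseases with the same incidence but different durations
    # (hypothetical rates) yield very different point prevalences.
    incidence = 0.02                  # new cases per animal-year
    chronic_duration = 5.0            # years
    acute_duration = 0.05             # years (a few weeks)
    print(incidence * chronic_duration)   # 0.10  -> 10% point prevalence
    print(incidence * acute_duration)     # 0.001 -> 0.1% point prevalence

This is why a cross-sectional survey captures chronic, low-mortality disease well but largely misses disease that resolves or kills quickly.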

prior probability

Decision making
The likelihood that something may occur or be associated with an event, based on its prevalence in a particular situation. See Medical mistake, Representative heuristic.
References in periodicals archive
Given the global set C of query contents, an integer k > 0 and the prior probability of query contents as pr(·), the expected entropy of mechanism M(R_C^k, Pr_C^k) is calculated as
As in the source attribution example above, the set of sources with positive prior probability forms the relevant population for the DM.
starting point or prior probability for the likelihood that a specific
Determining this quantity requires assigning appropriate functional forms for the two input quantities for Bayesian inference: (1) the prior probability p(θ | I) and (2) the likelihood function p(d | θ, I).
where the prior probability density functions in the RGB channels are independent of the randomness of the noise distribution.
where M_ij is the cost of deciding H_i when the ground truth is H_j, and P(H_j) is the prior probability of the ground truth H_j.
The prior probability of a service node determines the probability that an error occurs in that service node, based on historical data.
A Bayesian analysis of whether your partner is cheating on you requires a hypothesis (cheating), an alternative hypothesis or reason why the underwear would be there, and a prior probability you would have assigned to the cheating hypothesis before finding the underwear.
where P(ω_1) is the prior probability of ω_1, P_n,i(υ_n^i(t)) is the prior probability of υ_n^i(t), and the values of both are hard to determine.
Figure 4 (Bayesian hypothesis statements): State A: μ = μ_A with prior probability P_A; State B: μ = μ_B with prior probability P_B. The cost of accepting state A when state B is true, cost(A|B), is compared to the cost of accepting state B when state A is true, cost(B|A).
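
A minimal sketch of that comparison, weighting each error cost by the prior probability of the state under which the error occurs (all numbers are hypothetical, and the data term is omitted for brevity):

    # Prior-weighted cost comparison between the two states (illustrative only).
    prior_A, prior_B = 0.7, 0.3
    cost_A_given_B = 10.0             # cost of accepting A when B is true
    cost_B_given_A = 2.0              # cost of accepting B when A is true
    expected_cost_accept_A = prior_B * cost_A_given_B    # 3.0
    expected_cost_accept_B = prior_A * cost_B_given_A    # 1.4
    print("accept A" if expected_cost_accept_A < expected_cost_accept_B
          else "accept B")            # "accept B" with these numbers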