Bayesian inference

From WikiMD.org


Bayesian inference (pronunciation: /beɪziən ˈɪnfərəns/) is a method of statistical inference in which Bayes' theorem is used to update the probability of a hypothesis as more evidence or information becomes available. Bayesian inference is an important technique in statistics, especially in mathematical statistics.
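The update rule at the heart of this method is Bayes' theorem, which expresses the posterior probability of a hypothesis H given evidence E as:

```latex
P(H \mid E) = \frac{P(E \mid H)\, P(H)}{P(E)}
```

Here P(H) is the prior probability of the hypothesis, P(E | H) is the likelihood of the evidence under that hypothesis, and P(E) is the marginal probability of the evidence, which normalizes the result.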

Etymology

The term "Bayesian" refers to Thomas Bayes (1701–1761), who proved a special case of what is now called Bayes' theorem. A detailed account of Bayes' life and work is given in the biography of Thomas Bayes.

Related Terms

  • Bayes' theorem: A theorem describing how the conditional probability of each of a set of possible causes of an observed outcome can be computed from the probability of each cause and the conditional probability of the outcome given each cause.
  • Prior probability: The probability of an event before new data is collected.
  • Posterior probability: The revised probability of an event occurring after taking into consideration new information.
  • Likelihood function: A function of the parameters of a statistical model, given specific observed data.
  • Statistical model: A mathematical representation of the process assumed to have generated a set of observed data.
  • Hypothesis testing: A statistical method that uses sample data to evaluate a hypothesis about a population parameter.
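The terms above fit together in a single computation: multiply each prior by the corresponding likelihood, then normalize to obtain the posterior. A minimal sketch in Python, using an illustrative two-coin example (the coin probabilities are assumptions for the sake of the example, not from the article):

```python
def bayes_update(priors, likelihoods):
    """Return posterior probabilities via Bayes' theorem.

    priors      -- dict mapping hypothesis -> prior probability
    likelihoods -- dict mapping hypothesis -> P(data | hypothesis)
    """
    # Unnormalized posterior: prior times likelihood for each hypothesis
    unnorm = {h: priors[h] * likelihoods[h] for h in priors}
    # P(data), the normalizing constant (marginal probability of the evidence)
    evidence = sum(unnorm.values())
    return {h: p / evidence for h, p in unnorm.items()}

# Hypotheses: the coin is fair (P(heads) = 0.5) or biased (P(heads) = 0.8).
priors = {"fair": 0.5, "biased": 0.5}

# Observed data: three heads in a row.
# Likelihood of that sequence under each hypothesis.
likelihoods = {"fair": 0.5 ** 3, "biased": 0.8 ** 3}

posterior = bayes_update(priors, likelihoods)
print(posterior)  # the biased hypothesis now carries most of the probability
```

Observing more heads would shift the posterior further toward the biased coin; the posterior from one update can serve as the prior for the next batch of data.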


