Likelihood function
[Gallery: likelihood function after observing HH; likelihood function after observing HHT]
The likelihood function is a fundamental concept in statistical inference, particularly in maximum likelihood estimation. It is a function of the parameters of a statistical model, given specific observed data. The likelihood of a set of parameter values, given outcomes x, is equal to the probability of those observed outcomes given those parameter values.
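Writing θ for the parameters and x for the observed outcomes (notation added here for concreteness), this relationship is commonly expressed, in the discrete case, as

<math>\mathcal{L}(\theta \mid x) = P(X = x \mid \theta);</math>

for continuous models the probability on the right-hand side is replaced by a probability density.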
Definition
In the context of a statistical model, a likelihood function (often simply the likelihood) measures the goodness of fit of a statistical model to a sample of data for given values of the unknown parameters. It is formed from the joint probability distribution of the sample, but viewed and used as a function of the parameters only, thus treating the random variables as fixed at the observed values.
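As a concrete illustration (a sketch added here, with the function and variable names chosen for the example): for a coin with unknown heads probability p, observing the sequence HH gives the likelihood <math>\mathcal{L}(p \mid \text{HH}) = p^2</math>, while observing HHT gives <math>\mathcal{L}(p \mid \text{HHT}) = p^2(1 - p)</math>. The short Python sketch below evaluates these likelihoods over a grid of candidate parameter values, with the data held fixed:

<syntaxhighlight lang="python">
import numpy as np

# Illustrative sketch (not from the article): likelihood of i.i.d. coin tosses.
def bernoulli_likelihood(p_heads, observations):
    """Likelihood of a sequence of independent tosses, as a function of p_heads."""
    n_heads = sum(1 for outcome in observations if outcome == "H")
    n_tails = len(observations) - n_heads
    return p_heads ** n_heads * (1 - p_heads) ** n_tails

grid = np.linspace(0.0, 1.0, 101)                   # candidate values of p_heads
likelihood_hh = bernoulli_likelihood(grid, "HH")    # proportional to p**2
likelihood_hht = bernoulli_likelihood(grid, "HHT")  # proportional to p**2 * (1 - p)

print(grid[np.argmax(likelihood_hh)])   # 1.0  -- the likelihood keeps rising towards p = 1
print(grid[np.argmax(likelihood_hht)])  # 0.67 -- close to the exact maximiser 2/3
</syntaxhighlight>

These are the shapes suggested by the gallery captions above: after HH the likelihood is maximised at the boundary p = 1, while after HHT it peaks at p = 2/3, the maximum likelihood estimate.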
Properties
The likelihood function has several important properties that are useful in parameter estimation, notably the invariance of maximum likelihood estimates under one-to-one reparameterizations of the model and the factorization criterion that links the likelihood to sufficient statistics.
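In standard notation (formulations sketched here for reference, not drawn from the article text): if ψ = g(θ) is a one-to-one reparameterization, the likelihood of the reparameterized model is obtained by substitution,

<math>\mathcal{L}^{*}(\psi \mid x) = \mathcal{L}(g^{-1}(\psi) \mid x),</math>

so maximum likelihood estimates transform as <math>\hat{\psi} = g(\hat{\theta})</math>. The factorization property is the Fisher–Neyman criterion: a statistic T(X) is sufficient for θ exactly when the likelihood factors as

<math>\mathcal{L}(\theta \mid x) = h(x)\, g_{\theta}(T(x))</math>

for some non-negative functions h and g_θ.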
Applications
The likelihood function is used in a variety of statistical techniques, including maximum likelihood estimation, the likelihood-ratio test, and Bayesian inference.
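As a sketch of how maximum likelihood estimation works in practice (an illustration added here; the model, the simulated data, and all names are assumptions of the example), the following Python code fits the mean and standard deviation of a normal model by minimizing the negative log-likelihood with SciPy:

<syntaxhighlight lang="python">
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

# Illustrative sketch (not from the article): MLE for a normal model on simulated data.
rng = np.random.default_rng(0)
data = rng.normal(loc=5.0, scale=2.0, size=500)

def negative_log_likelihood(params, x):
    """Negative log-likelihood of an i.i.d. normal sample, parameterized by (mu, log sigma)."""
    mu, log_sigma = params
    sigma = np.exp(log_sigma)        # optimize log(sigma) so that sigma stays positive
    return -np.sum(norm.logpdf(x, loc=mu, scale=sigma))

result = minimize(negative_log_likelihood, x0=[0.0, 0.0], args=(data,))
mu_hat = result.x[0]
sigma_hat = np.exp(result.x[1])
print(mu_hat, sigma_hat)             # estimates close to the true values 5.0 and 2.0
</syntaxhighlight>

The other techniques listed above build on the same object: the likelihood-ratio test compares the maximized likelihoods of two nested models, and Bayesian inference combines the likelihood with a prior distribution via Bayes' theorem.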
See also
- Probability density function
- Probability mass function
- Sufficiency (statistics)
- Neyman–Pearson lemma



