The marginal likelihood function, or integrated likelihood, is a likelihood function in which some parameter variables have been marginalized out (integrated over). In the context of Bayesian statistics, it may also be called the evidence or model evidence.
Concept
Suppose we are given a set of independent identically distributed data points X = (x_1, ..., x_n), where x_i ~ p(x_i | θ) according to some probability distribution parameterized by θ, and θ is itself a random variable described by a distribution, θ ~ p(θ | α). The marginal likelihood asks, in general, what the probability p(X | α) is, where θ has been marginalized out (integrated over):

p(X | α) = ∫ p(X | θ) p(θ | α) dθ
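As a concrete illustration (a sketch, not part of the original text): for Bernoulli data with a Beta prior on θ, the integral above has a closed form, which can be compared against a naive Monte Carlo estimate that simply averages the likelihood over draws of θ from the prior. All function names here are illustrative:

```python
import math
import random

def log_beta(a, b):
    # log B(a, b) = log Γ(a) + log Γ(b) - log Γ(a + b)
    return math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b)

def exact_marginal_likelihood(data, a, b):
    # Closed form for Bernoulli data with a Beta(a, b) prior on θ:
    # p(X | a, b) = B(a + k, b + n - k) / B(a, b), with k successes in n trials
    n, k = len(data), sum(data)
    return math.exp(log_beta(a + k, b + n - k) - log_beta(a, b))

def mc_marginal_likelihood(data, a, b, draws=200_000, seed=0):
    # Monte Carlo estimate: average the likelihood p(X | θ) over θ ~ Beta(a, b)
    rng = random.Random(seed)
    n, k = len(data), sum(data)
    total = 0.0
    for _ in range(draws):
        theta = rng.betavariate(a, b)
        total += theta ** k * (1 - theta) ** (n - k)
    return total / draws

data = [1, 0, 1, 1, 0, 1, 1, 0]  # 5 successes in 8 trials
print(exact_marginal_likelihood(data, 1.0, 1.0))  # 1/504 ≈ 0.001984
print(mc_marginal_likelihood(data, 1.0, 1.0))     # close to the exact value
```

With the uniform prior Beta(1, 1), the exact value reduces to 1/((n + 1) C(n, k)); the Monte Carlo estimate converges to it as the number of draws grows.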
The definition above is formulated in the context of Bayesian statistics. In classical (frequentist) statistics, the concept of marginal likelihood appears instead in the context of a joint parameter θ = (ψ, λ), where ψ is the actual parameter of interest and λ is a nuisance parameter. If there exists a probability distribution p(λ | ψ) for the nuisance parameter, it is often desirable to consider the likelihood function only in terms of ψ, by marginalizing out λ:

L(ψ; X) = ∫ p(X | ψ, λ) p(λ | ψ) dλ
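To sketch the elimination of a nuisance parameter (an illustration under stated assumptions, not from the original text): take a single normal observation with mean of interest ψ and nuisance variance λ, and give λ an Inverse-Gamma(α, β) prior that, for simplicity, does not depend on ψ. Integrating out λ then yields a known closed form, a Student's t density, which a Monte Carlo average over prior draws can be checked against. Function names are illustrative:

```python
import math
import random

def normal_pdf(x, mean, var):
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def marginal_likelihood_mc(x, psi, alpha, beta, draws=200_000, seed=0):
    """Monte Carlo estimate of L(ψ; x) = ∫ N(x; ψ, λ) p(λ) dλ, with an
    Inverse-Gamma(α, β) prior on the nuisance variance λ: sample the
    precision τ = 1/λ from Gamma(shape=α, scale=1/β) and average."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(draws):
        tau = rng.gammavariate(alpha, 1.0 / beta)  # precision; λ = 1/τ
        total += normal_pdf(x, psi, 1.0 / tau)
    return total / draws

def marginal_likelihood_exact(x, psi, alpha, beta):
    """Closed form: integrating out λ gives a Student's t density with
    ν = 2α degrees of freedom, location ψ, and scale s = sqrt(β/α)."""
    nu, s = 2 * alpha, math.sqrt(beta / alpha)
    z = (x - psi) / s
    log_c = (math.lgamma((nu + 1) / 2) - math.lgamma(nu / 2)
             - 0.5 * math.log(nu * math.pi) - math.log(s))
    return math.exp(log_c - (nu + 1) / 2 * math.log1p(z * z / nu))

print(marginal_likelihood_exact(1.0, 0.0, 2.0, 2.0))  # ≈ 0.2147
print(marginal_likelihood_mc(1.0, 0.0, 2.0, 2.0))     # close to the exact value
```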
Unfortunately, marginal likelihoods are generally difficult to compute. Exact solutions are known for a small class of distributions, in particular when the marginalized-out parameter has the conjugate prior of the data distribution. In other cases, some numerical integration method is needed: either a general technique such as Gaussian quadrature or the Monte Carlo method, or a method developed specifically for statistical problems, such as the Laplace approximation, Gibbs/Metropolis sampling, or the EM algorithm.
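To make the Laplace approximation concrete (a sketch under illustrative assumptions, not a definitive implementation): for k successes in n Bernoulli trials under a uniform Beta(1, 1) prior, the log integrand f(θ) = k log θ + (n - k) log(1 - θ) is replaced by a quadratic expansion around its mode θ̂ = k/n, turning the integral into a Gaussian one:

```python
import math

def laplace_marginal_likelihood(n, k):
    """Laplace approximation to ∫ θ^k (1 - θ)^(n-k) dθ: approximate the
    integrand by a Gaussian centered at the mode θ̂ = k / n."""
    theta_hat = k / n
    # Negative second derivative of f(θ) at the mode: n³ / (k (n - k))
    neg_hessian = n ** 3 / (k * (n - k))
    log_f_max = k * math.log(theta_hat) + (n - k) * math.log(1 - theta_hat)
    return math.exp(log_f_max) * math.sqrt(2 * math.pi / neg_hessian)

def exact_marginal_likelihood(n, k):
    # Exact value: B(k + 1, n - k + 1) = k! (n - k)! / (n + 1)!
    return math.exp(math.lgamma(k + 1) + math.lgamma(n - k + 1)
                    - math.lgamma(n + 2))

print(laplace_marginal_likelihood(8, 5))  # ≈ 0.00216
print(exact_marginal_likelihood(8, 5))    # 1/504 ≈ 0.00198
```

For this small sample the approximation is off by roughly 9%; the error shrinks as n grows and the posterior becomes more nearly Gaussian.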
The above considerations also apply to a single random variable (data point) x rather than a set of observations. In a Bayesian context, this is equivalent to the prior predictive distribution of a data point.
Applications
Comparison of Bayesian models
When comparing Bayesian models, the marginalized variables are the parameters of a particular type of model, and the remaining variable is the identity of the model itself. In this case, the marginal likelihood is the probability of the data given the model type, without assuming any particular parameter values. The marginal likelihood function for model M is

p(X | M) = ∫ p(X | θ, M) p(θ | M) dθ
It is in this context that the term model evidence is commonly used. This quantity is important because the posterior odds ratio for a model M1 against another model M2 involves the ratio of their marginal likelihoods, the so-called Bayes factor:

p(M1 | X) / p(M2 | X) = [p(M1) / p(M2)] × [p(X | M1) / p(X | M2)]
which can be schematically formulated as
- posterior odds = prior odds × Bayes factor
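The scheme above can be sketched with a classic coin-tossing comparison (an illustration, not from the original text): model M1 fixes θ = 1/2 (a fair coin), while model M2 puts a uniform Beta(1, 1) prior on θ. Both marginal likelihoods have closed forms, so the Bayes factor is exact. Names are illustrative:

```python
import math

def marginal_likelihood_fair(n, k):
    # M1: fair coin, θ fixed at 1/2 — no free parameter to integrate out
    return 0.5 ** n

def marginal_likelihood_uniform(n, k):
    # M2: θ unknown, uniform Beta(1, 1) prior;
    # p(X | M2) = B(k + 1, n - k + 1) = 1 / ((n + 1) C(n, k))
    return math.exp(math.lgamma(k + 1) + math.lgamma(n - k + 1)
                    - math.lgamma(n + 2))

n, k = 20, 14  # 14 heads in 20 tosses
bayes_factor = marginal_likelihood_fair(n, k) / marginal_likelihood_uniform(n, k)
prior_odds = 1.0  # equal prior probability for the two models
posterior_odds = prior_odds * bayes_factor
print(bayes_factor)  # ≈ 0.776: the data mildly favor M2
```

A Bayes factor below 1 means the data shift the odds toward M2; note that M2 is penalized for its free parameter only through the integration itself, with no explicit complexity term.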
See also
- Marginal distribution
- Lindley's paradox
Literature
- Charles S. Bos. A comparison of marginal likelihood computation methods // COMPSTAT 2002: Proceedings in Computational Statistics / ed. W. Härdle, B. Ronz. - 2002. - P. 111-117.
- David JC MacKay. Information Theory, Inference, and Learning Algorithms . - Cambridge University Press, 2003. - ISBN 0521642981 .