A Markov network, Markov random field, or undirected graphical model is a graphical model in which a set of random variables has the Markov property described by an undirected graph. A Markov network differs from the other common graphical model, the Bayesian network, in its representation of dependencies between random variables: it can express some dependencies that a Bayesian network cannot (for example, cyclic dependencies); on the other hand, it cannot express certain others. The prototype of the Markov network was the Ising model of the magnetization of a material in statistical physics: the Markov network was introduced as a generalization of this model. [1]
Contents
- 1 Definition
- 2 Clique factorization
- 3 Example
- 3.1 Logistic model
- 3.2 Gaussian Markov random field
- 4 Notes
Definition
Given an undirected graph G = (V, E), a set of random variables X = (X_v)_{v ∈ V} indexed by V forms a Markov random field with respect to G if it satisfies the following equivalent Markov properties:
- Pairwise property: any two non-adjacent variables are conditionally independent given all the other variables: $X_u \perp\!\!\!\perp X_v \mid X_{V \setminus \{u, v\}}$ for every pair $\{u, v\} \notin E$.
- Local property: a variable is conditionally independent of all other variables given its neighbors: $X_v \perp\!\!\!\perp X_{V \setminus \operatorname{cl}(v)} \mid X_{\operatorname{ne}(v)}$,
- where ne(v) is the set of neighbors of v, and cl(v) = {v} ∪ ne(v) is the closed neighborhood of v.
- Global property: any two subsets of variables are conditionally independent given a separating subset: $X_A \perp\!\!\!\perp X_B \mid X_S$,
- where every path from a node in A to a node in B passes through S.
In other words, a graph G is a Markov random field with respect to a joint probability distribution P(X = x) over the set of random variables X if and only if separation in G implies conditional independence: if two nodes u and v are separated in G after removing a set of nodes Z from G, then P(X = x) must assert that X_u and X_v are conditionally independent given the random variables corresponding to Z. If this condition holds, G is said to be an independency map (or I-map) of the probability distribution.
Many definitions also require that G be a minimal I-map, that is, an I-map that ceases to be an I-map once any single edge is removed. (This is a reasonable requirement, since it leads to the most compact representation, one that includes as few dependencies as possible; note that a complete graph is a trivial I-map.) When G is not only an I-map (that is, it does not represent any independencies that do not hold in P(X = x)) but also does not represent any dependencies that do not hold in P(X = x), G is called a perfect map of P(X = x): it represents exactly the set of independencies of P(X = x).
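The global property can be checked purely graph-theoretically: A and B are conditionally independent given S whenever removing S disconnects A from B in the graph. Below is a minimal Python sketch of that separation test; the function name, the adjacency dictionary, and the 4-node path graph are illustrative assumptions, not part of the formal definition above.

```python
from collections import deque

def separated(adj, A, B, S):
    """Return True if every path from A to B in the undirected graph
    `adj` (an adjacency dict) passes through S, i.e. S separates A from B."""
    blocked = set(S)
    frontier = deque(set(A) - blocked)
    seen = set(frontier)
    while frontier:
        v = frontier.popleft()
        if v in B:
            return False              # reached B while avoiding S
        for w in adj[v]:
            if w not in seen and w not in blocked:
                seen.add(w)
                frontier.append(w)
    return True

# 4-node path graph 1 - 2 - 3 - 4: node 2 separates {1} from {3, 4},
# so X_1 is conditionally independent of (X_3, X_4) given X_2.
adj = {1: {2}, 2: {1, 3}, 3: {2, 4}, 4: {3}}
print(separated(adj, {1}, {3, 4}, {2}))   # True
print(separated(adj, {1}, {4}, set()))    # False
```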
Clique factorization
Since the Markov properties of an arbitrary probability distribution can be difficult to establish, a widely used class of Markov random fields consists of those that factorize according to the cliques of the graph. A set of random variables X = (X_v)_{v ∈ V} whose joint density factorizes over the cliques of G,

$P(X = x) = \prod_{C \in \operatorname{cl}(G)} \varphi_C(x_C),$
forms a Markov random field with respect to G, where cl(G) is the set of cliques of G (the definition is equivalent if only maximal cliques are used). The functions φ_C are often called factor potentials or clique potentials. Although there exist MRFs that do not factorize in this way (a simple example can be constructed on a cycle of 4 nodes [2]), in certain cases the two notions can be shown to coincide:
- if the density is strictly positive (by the Hammersley–Clifford theorem)
- if the graph is chordal
When such a factorization exists, it is possible to construct a factor graph for the network.
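As a concrete numerical sketch of such a factorization (the chain graph, the potential tables, and the function names below are illustrative choices, not taken from the text), the joint density of a small binary MRF can be written as a product of clique potentials and normalized by brute-force summation:

```python
import itertools

# Chain graph 1 - 2 - 3 with binary variables; its cliques are
# {1, 2} and {2, 3}, and the joint density is assumed to factorize as
# P(x) proportional to phi_12(x1, x2) * phi_23(x2, x3).
phi_12 = {(0, 0): 2.0, (0, 1): 1.0, (1, 0): 1.0, (1, 1): 3.0}
phi_23 = {(0, 0): 1.5, (0, 1): 1.0, (1, 0): 1.0, (1, 1): 2.0}

def unnormalized(x1, x2, x3):
    return phi_12[(x1, x2)] * phi_23[(x2, x3)]

# Normalizing constant: sum the product of potentials over all assignments.
Z = sum(unnormalized(*x) for x in itertools.product((0, 1), repeat=3))

def p(x1, x2, x3):
    return unnormalized(x1, x2, x3) / Z

# Because node 2 separates nodes 1 and 3, this factorization makes
# X_1 and X_3 conditionally independent given X_2.
print(Z, p(1, 1, 1))
```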
Example
Logistic Model
A logistic model of a Markov random field, written in terms of feature functions $f_k$, gives the full joint distribution as

$P(X = x) = \frac{1}{Z} \exp\left( \sum_k w_k^{\top} f_k(x_{C_k}) \right)$

with partition function

$Z = \sum_{x \in \mathcal{X}} \exp\left( \sum_k w_k^{\top} f_k(x_{C_k}) \right),$

where $\mathcal{X}$ is the set of all possible assignments of values to the random variables of the network.
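A minimal numerical sketch of this log-linear form follows; the chain graph, the two feature functions, and the weights are assumed purely for illustration. It computes the partition function Z by enumerating all assignments of a small binary network:

```python
import itertools
import math

# Chain graph 1 - 2 - 3 with binary variables in log-linear form:
# one feature function per clique and hypothetical weights.
features = [lambda x: x[0] * x[1],   # f_1 on clique {1, 2}
            lambda x: x[1] * x[2]]   # f_2 on clique {2, 3}
weights = [0.8, -0.5]

def score(x):
    # Inner sum of the exponent: sum_k w_k * f_k(x_{C_k})
    return sum(w * f(x) for w, f in zip(weights, features))

assignments = list(itertools.product((0, 1), repeat=3))

# Partition function Z: sum of exp(score) over every assignment.
Z = sum(math.exp(score(x)) for x in assignments)

def p(x):
    return math.exp(score(x)) / Z

print(Z, sum(p(x) for x in assignments))   # probabilities sum to 1.0
```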
Gaussian Markov Random Field
A multivariate normal distribution forms a Markov random field with respect to a graph G = (V, E) if the missing edges correspond to zeros in the precision matrix (the inverse covariance matrix):

- $X = (X_v)_{v \in V} \sim \mathcal{N}(\mu, \Sigma)$ such that $(\Sigma^{-1})_{uv} = 0$ whenever $\{u, v\} \notin E$. [3]
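A short numpy sketch of this relationship (the chain graph, the particular precision matrix Q, and the zero mean are assumed here only for illustration): the missing edge {1, 3} appears as a zero in the precision matrix, while the covariance matrix itself is dense.

```python
import numpy as np

# Gaussian MRF on the chain graph 1 - 2 - 3: the precision matrix
# Q = inverse of Sigma has a zero exactly at the missing edge {1, 3}.
Q = np.array([[ 2.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  2.0]])

Sigma = np.linalg.inv(Q)       # covariance matrix; generally dense

# Q[0, 2] == 0 encodes that X_1 and X_3 are conditionally independent
# given X_2, even though their marginal covariance Sigma[0, 2] is nonzero.
print(Q[0, 2], Sigma[0, 2])

# Drawing a few samples (a zero mean vector is assumed for simplicity).
rng = np.random.default_rng(0)
samples = rng.multivariate_normal(np.zeros(3), Sigma, size=5)
print(samples.shape)           # (5, 3)
```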
Notes
- ↑ Kindermann, Ross; Snell, J. Laurie. Markov Random Fields and Their Applications. American Mathematical Society, 1980. MR 0620955.
- ↑ Moussouris, John. Gibbs and Markov random systems with constraints // Journal of Statistical Physics. 1974. Vol. 10, no. 1. P. 11-33. DOI: 10.1007/BF01011714.
- ↑ Rue, Håvard; Held, Leonhard. Gaussian Markov Random Fields: Theory and Applications. CRC Press, 2005. ISBN 1584884320.