
Information inequality (mathematical statistics)

The information inequality (in mathematical statistics) is an inequality that sets a lower bound on the variance of an unbiased estimator with locally minimal variance. It plays an important role in the theory of asymptotically efficient estimators [1].

Statement

Denote by $X$ the observed data, by $\theta$ the parameter to be estimated from them, and by $p(X, \theta)$ the probability density of the distribution given $\theta$. Define the Fisher information as $I(\theta) = E\left[\frac{\partial}{\partial \theta} \ln p(X, \theta)\right]^{2}$. Suppose $I(\theta) > 0$, and let $\delta$ be any statistic with $E_{\theta}(\delta^{2}) < \infty$ for which the derivative with respect to $\theta$ of the expectation $E_{\theta}(\delta) = \int \delta \, p_{\theta} \, d\mu$ exists and can be obtained by differentiation under the integral sign. Then the information inequality holds [2]:

$$D_{\theta}(\delta) \geqslant \frac{\left[\frac{\partial}{\partial \theta} E_{\theta}(\delta)\right]^{2}}{I(\theta)},$$

where $D_{\theta}(\delta)$ denotes the variance of $\delta$.
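The bound can be checked numerically. The following is a minimal sketch (not from the source), assuming the concrete model of $n$ i.i.d. draws from $N(\theta, \sigma^{2})$ with known $\sigma$: there the Fisher information of the whole sample is $I(\theta) = n / \sigma^{2}$, and $\delta$ taken as the sample mean is unbiased, so the inequality reads $D_{\theta}(\delta) \geqslant \sigma^{2} / n$.

```python
import random
import statistics

# Sketch: simulate many samples from N(theta, sigma^2), estimate the
# variance of the sample mean, and compare it with the information
# bound 1 / I(theta) = sigma**2 / n. The model and all parameter
# values below are illustrative assumptions, not from the source.
random.seed(0)
theta, sigma, n, trials = 2.0, 1.5, 20, 20000

means = []
for _ in range(trials):
    sample = [random.gauss(theta, sigma) for _ in range(n)]
    means.append(sum(sample) / n)

empirical_var = statistics.variance(means)   # estimate of D_theta(delta)
cramer_rao_bound = sigma ** 2 / n            # 1 / I(theta) for this model

print(empirical_var, cramer_rao_bound)
```

Since the sample mean actually attains the bound in this model, the two printed numbers should agree up to simulation noise; for a statistic that is not efficient, the empirical variance would sit strictly above the bound.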

Notes

  1. Lehmann, 1991, p. 110.
  2. Lehmann, 1991, p. 116.

Literature

  • Lehmann E. Theory of Point Estimation. M.: Nauka, 1991. 448 p. ISBN 5-02-013941-6.
Source: https://ru.wikipedia.org/w/index.php?oldid=82191949
