The information inequality, in mathematical statistics, is an inequality that sets a lower bound on the variance of an unbiased estimator. It plays an important role in the theory of asymptotically efficient estimators.[1]
Statement
Denote by X the observed data, by \theta the parameter to be estimated from them, by p(x; \theta) the conditional probability density of the distribution, and by I(\theta) the Fisher information. Let \delta = \delta(X) be any statistic with finite variance for which the derivative of the expectation \mathrm{E}_\theta[\delta] with respect to \theta exists and can be obtained by differentiation under the integral sign. Then the information inequality holds:[2]

\operatorname{Var}_\theta(\delta) \ge \frac{\left( \tfrac{d}{d\theta} \mathrm{E}_\theta[\delta] \right)^2}{I(\theta)}.

In particular, if \delta is an unbiased estimator of \theta, then \tfrac{d}{d\theta} \mathrm{E}_\theta[\delta] = 1 and the right-hand side reduces to 1 / I(\theta).
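As a concrete illustration (not part of the original article), consider estimating the mean \theta of a normal distribution with known standard deviation \sigma from a sample of size n. The Fisher information is I(\theta) = n / \sigma^2, the sample mean is unbiased, and its variance \sigma^2 / n attains the lower bound exactly. A minimal Python sketch checking this numerically:

```python
import random
import statistics

# Check the information inequality for the mean of N(theta, sigma^2)
# with known sigma: Fisher information of a sample of size n is
# I(theta) = n / sigma**2, so the bound for an unbiased estimator
# is 1 / I(theta) = sigma**2 / n. The sample mean attains it.

random.seed(0)
theta, sigma, n, trials = 2.0, 1.5, 50, 20000

estimates = []
for _ in range(trials):
    sample = [random.gauss(theta, sigma) for _ in range(n)]
    estimates.append(statistics.fmean(sample))  # unbiased estimator of theta

empirical_var = statistics.pvariance(estimates)
cramer_rao_bound = sigma**2 / n  # lower bound from the information inequality

print(empirical_var, cramer_rao_bound)
```

The printed empirical variance of the sample mean should be close to the bound sigma**2 / n = 0.045, since the sample mean is an efficient estimator here.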
Notes
- ↑ Lehmann, 1991, p. 110.
- ↑ Lehmann, 1991, p. 116.
Literature
- Lehmann E. Theory of Point Estimation. Moscow: Nauka, 1991. 448 p. ISBN 5-02-013941-6.