In probability theory, the continuous uniform distribution is the distribution of a real-valued random variable taking values in the interval [a, b], characterized by a constant probability density on that interval.
| Continuous uniform distribution | |
|---|---|
| Notation | $\mathcal{U}(a, b)$ |
| Parameters | $a, b \in (-\infty, \infty)$; $a$ is the location (shift) parameter, $b - a$ the scale parameter |
| Support | $a \leqslant x \leqslant b$ |
| Probability density | $\dfrac{1}{b-a}$ for $a \leqslant x \leqslant b$; $0$ for $x < a$ or $x > b$ |
| Distribution function | $0$ for $x < a$; $\dfrac{x-a}{b-a}$ for $a \leqslant x < b$; $1$ for $x \geqslant b$ |
| Expected value | $\dfrac{a+b}{2}$ |
| Median | $\dfrac{a+b}{2}$ |
| Mode | any number in the segment $[a, b]$ |
| Variance | $\dfrac{(b-a)^2}{12}$ |
| Skewness | $0$ |
| Excess kurtosis | $-\dfrac{6}{5}$ |
| Differential entropy | $\ln(b-a)$ |
| Moment-generating function | $\dfrac{e^{tb}-e^{ta}}{t(b-a)}$ |
| Characteristic function | $\dfrac{e^{itb}-e^{ita}}{it(b-a)}$ |
## Definition

A random variable $X$ is said to have a continuous uniform distribution on the segment $[a, b]$, where $a, b \in \mathbb{R}$, if its density $f_X(x)$ has the form:

$$f_X(x) = \begin{cases} \dfrac{1}{b-a}, & x \in [a, b], \\ 0, & x \notin [a, b]. \end{cases}$$

One writes $X \sim U[a, b]$. Sometimes the density values at the boundary points $x = a$ and $x = b$ are changed to other values, for example $0$ or $\dfrac{1}{2(b-a)}$. Since the Lebesgue integral of the density does not depend on its behavior on sets of measure zero, these variations do not affect the computation of probabilities associated with this distribution.
## Distribution function

Integrating the density defined above, we obtain:

$$F_X(x) \equiv \mathbb{P}(X \leqslant x) = \begin{cases} 0, & x < a, \\ \dfrac{x-a}{b-a}, & a \leqslant x < b, \\ 1, & x \geqslant b. \end{cases}$$

Since the density of the uniform distribution is discontinuous at the boundary points of the segment $[a, b]$, the distribution function is not differentiable at these points. At all other points the standard equality holds:

$$\frac{d}{dx} F_X(x) = f_X(x), \quad \forall x \in \mathbb{R} \setminus \{a, b\}.$$
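The piecewise formulas above translate directly into code. The following is a minimal sketch (the function names `uniform_pdf` and `uniform_cdf` are illustrative, not from the source):

```python
def uniform_pdf(x, a, b):
    """Density of U[a, b]: constant 1/(b - a) on [a, b], zero elsewhere."""
    return 1.0 / (b - a) if a <= x <= b else 0.0

def uniform_cdf(x, a, b):
    """Distribution function of U[a, b], obtained by integrating the density."""
    if x < a:
        return 0.0
    if x < b:
        return (x - a) / (b - a)
    return 1.0

# The CDF rises linearly from 0 at x = a to 1 at x = b:
print(uniform_cdf(3.0, 2.0, 6.0))  # 0.25
```

Note that on $(a, b)$ the CDF's slope $1/(b-a)$ equals the density, matching the differentiation identity above.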
## Moment-generating function

By straightforward integration we obtain the moment-generating function:

$$M_X(t) = \frac{e^{tb} - e^{ta}}{t(b-a)},$$

from which we find all the moments of interest of the continuous uniform distribution:

$$\mathbb{E}\left[X\right] = \frac{a+b}{2},$$

$$\mathbb{E}\left[X^2\right] = \frac{a^2 + ab + b^2}{3},$$

$$\operatorname{D}\left[X\right] = \frac{(b-a)^2}{12}.$$

In general,

$$\mathbb{E}\left[X^n\right] = \frac{1}{n+1} \sum\limits_{k=0}^{n} a^k b^{n-k} = \frac{b^{n+1} - a^{n+1}}{(b-a)(n+1)}.$$
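The general moment formula is easy to check numerically; this sketch (helper name `uniform_moment` is illustrative) recovers the mean and the variance from the raw moments:

```python
def uniform_moment(n, a, b):
    """n-th raw moment of U[a, b]: (b^(n+1) - a^(n+1)) / ((b - a)(n + 1))."""
    return (b ** (n + 1) - a ** (n + 1)) / ((b - a) * (n + 1))

a, b = 1.0, 5.0
mean = uniform_moment(1, a, b)             # (a + b)/2 = 3.0
var = uniform_moment(2, a, b) - mean ** 2  # (b - a)^2 / 12
print(mean, var)
```

The variance computed this way agrees with the closed form $(b-a)^2/12$ from the table.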
## Standard uniform distribution

If $a = 0$ and $b = 1$, i.e. $X \sim U[0, 1]$, then the continuous uniform distribution is called standard.

There is an elementary statement:

- If a random variable $X \sim U[0, 1]$ and $Y = a + (b - a)X$, then $Y \sim U[\min(a, b), \max(a, b)]$.

Thus, given a generator of samples from the standard continuous uniform distribution, it is easy to construct a sample generator for any continuous uniform distribution.
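The affine transformation $Y = a + (b - a)X$ can be sketched as follows (the function name `uniform_ab` is illustrative):

```python
import random

def uniform_ab(a, b, rng=random):
    """Sample from U[min(a, b), max(a, b)] via Y = a + (b - a) * X, X ~ U[0, 1]."""
    return a + (b - a) * rng.random()

random.seed(0)
samples = [uniform_ab(2.0, 5.0) for _ in range(10_000)]
print(min(samples), max(samples))  # every sample lies in [2, 5]
```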
Moreover, given such a generator and the inverse of the distribution function of a random variable, one can construct a sample generator for any continuous distribution (not necessarily uniform) using the inverse transform method. For this reason, standard uniformly distributed random variables are sometimes called basic random variables.
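As a sketch of the inverse transform method, the exponential distribution (chosen here purely as an illustration; it is not discussed in the source) has CDF $F(x) = 1 - e^{-\lambda x}$, whose inverse $F^{-1}(u) = -\ln(1-u)/\lambda$ maps a standard uniform draw to an exponential one:

```python
import math
import random

def exponential_inverse_transform(lam, rng=random):
    """Exponential(lam) sample: apply F^{-1}(u) = -ln(1 - u)/lam to u ~ U[0, 1]."""
    u = rng.random()
    return -math.log(1.0 - u) / lam

random.seed(0)
draws = [exponential_inverse_transform(2.0) for _ in range(100_000)]
print(sum(draws) / len(draws))  # close to the theoretical mean 1/lam = 0.5
```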
There are also special transformations that produce random variables with other distributions from uniformly distributed ones. For example, to obtain a normal distribution, the Box–Muller transform is used.
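A minimal sketch of the basic (trigonometric) form of the Box–Muller transform, which turns two independent standard uniform draws into two independent standard normal draws:

```python
import math
import random

def box_muller(rng=random):
    """Map two independent U(0, 1) draws to two independent N(0, 1) draws."""
    u1 = 1.0 - rng.random()  # shift to (0, 1] so that log(u1) is defined
    u2 = rng.random()
    r = math.sqrt(-2.0 * math.log(u1))
    return r * math.cos(2.0 * math.pi * u2), r * math.sin(2.0 * math.pi * u2)

random.seed(1)
zs = [z for _ in range(50_000) for z in box_muller()]
print(sum(zs) / len(zs))  # sample mean, near 0 for a standard normal
```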
## See also

- Discrete uniform distribution
- Inverse transform method