Let $X_1, \ldots, X_n$ be independent identically distributed random variables whose common distribution is given by the probability function
- $\mathbb{P}(X_i = j) = p_j, \; j = 1, \ldots, k$.
Intuitively, the event $\{X_i = j\}$ means that trial number $i$ resulted in outcome $j$. Let the random variable $Y_j$ be the number of trials that resulted in outcome $j$:
- $Y_j = \sum_{i=1}^{n} \mathbf{1}_{\{X_i = j\}}, \; j = 1, \ldots, k$.
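As a quick illustration, here is a minimal Python sketch of this setup; the values of `p` and `n` are arbitrary example choices, not anything prescribed above:

```python
import numpy as np

rng = np.random.default_rng(0)

p = np.array([0.2, 0.5, 0.3])  # p_1, ..., p_k (arbitrary example values)
k = len(p)
n = 1000                       # number of trials

# Draw X_1, ..., X_n, each equal to j in {1, ..., k} with probability p_j.
x = rng.choice(np.arange(1, k + 1), size=n, p=p)

# Y_j = sum of indicators 1{X_i = j}, i.e. the count of outcome j.
y = np.array([(x == j).sum() for j in range(1, k + 1)])

print(y, y.sum())  # the counts always sum to n
```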
Then the distribution of the vector $\mathbf{Y} = (Y_1, \ldots, Y_k)^\top$ has the probability function
- $p_{\mathbf{Y}}(\mathbf{y}) = \begin{cases} \binom{n}{y_1, \ldots, y_k} p_1^{y_1} \cdots p_k^{y_k}, & \sum\limits_{j=1}^{k} y_j = n, \\ 0, & \sum\limits_{j=1}^{k} y_j \neq n, \end{cases} \quad \mathbf{y} = (y_1, \ldots, y_k)^\top \in \mathbb{N}_0^{k}$,
where
- $\binom{n}{y_1, \ldots, y_k} \equiv \frac{n!}{y_1! \cdots y_k!}$ is the multinomial coefficient.
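This probability function can be evaluated directly from the definition. A minimal sketch (the helper name `multinomial_pmf` is ours, not a library function):

```python
from math import factorial, prod

def multinomial_pmf(y, p, n):
    """p_Y(y) for counts y = (y_1, ..., y_k), probabilities p, and n trials."""
    if sum(y) != n:
        return 0.0  # the zero branch of the case analysis above
    # Multinomial coefficient n! / (y_1! ... y_k!), computed exactly in integers.
    coeff = factorial(n) // prod(factorial(yj) for yj in y)
    return coeff * prod(pj ** yj for pj, yj in zip(p, y))

# Example: n = 4 trials, k = 3 outcomes.
print(multinomial_pmf((2, 1, 1), (0.2, 0.5, 0.3), 4))  # 12 * 0.2^2 * 0.5 * 0.3 = 0.072
```

If SciPy is available, `scipy.stats.multinomial(n, p).pmf(y)` should give the same values.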
The mathematical expectation of the random variable $Y_j$ is $\mathbb{E}[Y_j] = np_j$. Each $Y_j$ has a binomial distribution with parameters $n$ and $p_j$, so the diagonal elements of the covariance matrix $\Sigma = (\sigma_{ij})$ are variances of binomial random variables:
- $\sigma_{jj} = \mathrm{Var}[Y_j] = np_j(1 - p_j), \; j = 1, \ldots, k$.
For the off-diagonal entries we have
- $\sigma_{ij} = \mathrm{cov}(Y_i, Y_j) = -np_i p_j, \; i \neq j$.
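Together these give $\Sigma = n(\operatorname{diag}(p) - pp^\top)$. A sketch checking this formula against the sample covariance of simulated count vectors (sample size and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)

p = np.array([0.2, 0.5, 0.3])
n = 50

# Theoretical covariance: sigma_jj = n p_j (1 - p_j), sigma_ij = -n p_i p_j.
sigma = n * (np.diag(p) - np.outer(p, p))

# Empirical covariance of many simulated count vectors Y.
samples = rng.multinomial(n, p, size=200_000)
print(np.round(sigma, 3))
print(np.round(np.cov(samples, rowvar=False), 3))  # close to sigma
```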
The rank of the covariance matrix of the multinomial distribution is $k - 1$: the components satisfy the linear constraint $Y_1 + \cdots + Y_k = n$, so $\Sigma$ is singular, and no further linear dependence exists when all $p_j > 0$.
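A quick numerical check of the rank claim, reusing the $\Sigma$ construction from the previous sketch:

```python
import numpy as np

p = np.array([0.2, 0.5, 0.3])  # k = 3, all p_j > 0
n = 50
sigma = n * (np.diag(p) - np.outer(p, p))

# Each row of sigma sums to zero because Y_1 + ... + Y_k = n is constant,
# so sigma is singular; its rank is k - 1.
print(np.linalg.matrix_rank(sigma))  # 2, i.e. k - 1
print(sigma.sum(axis=1))             # zeros (up to floating-point error)
```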