Clever Geek Handbook

Random variable

A random variable is a variable whose values are the outcomes of some random phenomenon or experiment. In simple terms, it is a numerical expression of the result of a random event. The random variable is one of the basic concepts of probability theory. [1]

In mathematics it is customary to denote a random variable by the capital letter X. Defined more strictly, the random variable is no longer a variable X but a function y = X(ω), whose values y numerically express the outcomes ω of a random phenomenon. One requirement placed on this function is measurability, which serves to screen out pathological cases in which the values X(ω) are infinitely sensitive to the slightest change in the outcome of the random phenomenon or experiment.

It is important to understand that, as a function, a random variable X(ω) does not return the probability of the outcome ω, but a numerical expression of that outcome. For example, suppose an experimenter draws one card at random from a deck of playing cards. Then ω represents the drawn card; clearly ω is not a number here but a card, a physical object whose name the symbol ω denotes. The function X(ω), taking the "name" of this object as its argument, returns the number that we will henceforth associate with the card ω. Suppose the experimenter drew the King of Clubs, that is, ω = K♣; then, substituting this outcome into the function, X(K♣) yields a number, say 13. This number is not the probability of drawing the king, or any other card, from the deck. It is the result of transferring an object of the physical world into an object of the mathematical world: with the number 13 we can carry out mathematical operations, whereas with the object K♣ such operations were impossible.
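This card-to-number encoding can be sketched in a few lines of Python. The deck representation and the 1..13 rank numbering are illustrative assumptions, chosen only so that X(King of Clubs) = 13 as in the example above:

```python
# Sketch of the card example: a random variable X maps a drawn card
# (a physical object, the outcome ω) to a number. The deck layout and
# the 1..13 rank numbering are illustrative assumptions.
RANKS = ["A", "2", "3", "4", "5", "6", "7", "8", "9", "10", "J", "Q", "K"]
SUITS = ["clubs", "diamonds", "hearts", "spades"]
DECK = [(rank, suit) for suit in SUITS for rank in RANKS]  # 52 outcomes ω

def X(omega):
    """Random variable: numeric expression of the drawn card ω."""
    rank, _suit = omega
    return RANKS.index(rank) + 1  # A -> 1, 2 -> 2, ..., K -> 13

# The outcome "King of Clubs" becomes the number 13, which can now
# participate in arithmetic, unlike the physical card itself:
assert X(("K", "clubs")) == 13
```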

Examples of objects whose state requires the use of random variables are the microscopic objects described by quantum mechanics. Random variables also describe the transmission of hereditary traits from parent organisms to their descendants (see Mendel's laws). Random events include the radioactive decay of atomic nuclei. [1]

It should also be noted that there are a number of problems of mathematical analysis and number theory in which the functions appearing in their formulations should be regarded as random variables defined on suitable probability spaces [2].

History

The role of the random variable as one of the basic concepts of probability theory was first clearly recognized by P. L. Chebyshev, who substantiated the now generally accepted view of this concept (1867) [3]. The understanding of a random variable as a special case of the general concept of a function came much later, in the first third of the 20th century. The first complete formalization of the foundations of probability theory on the basis of measure theory was developed by A. N. Kolmogorov (1933) [4], after which it became clear that a random variable is a measurable function defined on a probability space. In the educational literature this point of view was first consistently carried through by W. Feller (see the preface to [5], where the exposition is built on the concept of the space of elementary events and it is emphasized that only in this case does the notion of a random variable become meaningful).

Definition

The formal mathematical definition is as follows: let (Ω, F, P) be a probability space; then a function X: Ω → ℝ is called a random variable if it is measurable with respect to F and the Borel σ-algebra on ℝ. The probabilistic behavior of a single random variable (considered independently of others) is completely described by its distribution.

The random variable can also be defined in another, equivalent way [6]. A function X: Ω → ℝ is called a random variable if, for any real numbers a and b, the set of outcomes ω such that X(ω) ∈ (a, b) belongs to F.
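On a finite outcome space this measurability condition can be checked directly by brute force. A minimal sketch, assuming a hypothetical two-outcome toy space with the full power set as its σ-algebra (so every function is trivially measurable):

```python
from itertools import chain, combinations

# Toy probability space: a two-outcome Ω with the full power set as its
# σ-algebra F (a hypothetical example; the check works for any F).
omega_space = frozenset({"heads", "tails"})
F = {frozenset(s) for s in chain.from_iterable(
    combinations(sorted(omega_space), r)
    for r in range(len(omega_space) + 1))}

X = {"heads": 10.0, "tails": -33.0}  # a payoff function on Ω

def preimage_in_F(a, b):
    """Check the defining condition: {ω : X(ω) ∈ (a, b)} must lie in F."""
    event = frozenset(w for w in omega_space if a < X[w] < b)
    return event in F

# With F the power set, every preimage is an event, so X is measurable:
assert all(preimage_in_F(a, b)
           for a, b in [(-100.0, 0.0), (0.0, 100.0), (5.0, 15.0)])
```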

Examples

Discrete Random Variables

Coin Flip

All possible outcomes of a coin toss are described by the space of elementary events Ω = {heads, tails}. Let the random variable X equal the payoff we receive as a result of a bet on the outcome of the coin toss: say, we win 10 rubles every time the coin ω lands heads, and lose 33 rubles every time it lands tails. Mathematically this payoff function X can be represented as follows:

X(ω) = 10, if ω = heads,
       −33, if ω = tails.

If the coin is fair, then the payoff X has the probability distribution:

P(y) = 1/2, if y = 10,
       1/2, if y = −33,
 
where P(y) is the probability of receiving y rubles of payoff on the coin toss.

(Figure: if the outcome space is the set of all possible combinations of points on two dice, and the random variable equals the sum of those points, then S is a discrete random variable whose distribution is described by a probability mass function, shown as the height of the corresponding column.)
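The payoff distribution above can be checked by simulation; a minimal Monte-Carlo sketch, assuming a fair coin:

```python
import random

random.seed(0)  # fixed seed, purely for a reproducible illustration

def payoff():
    """One toss of a fair coin: +10 rubles on heads, -33 on tails."""
    return 10 if random.random() < 0.5 else -33

n = 100_000
wins = sum(1 for _ in range(n) if payoff() == 10)
# By the law of large numbers the relative frequency of y = 10
# approaches P(10) = 1/2:
assert abs(wins / n - 0.5) < 0.01
```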

Dice Throwing

A random variable can also be used to describe the process of throwing dice and to calculate the probability of a specific outcome of such throws. In one classic example of this experiment two dice are thrown, n1 and n2, each of which takes values from the set {1, 2, 3, 4, 5, 6} (the number of points on the faces of a die). The total number of points rolled is the value of our random variable X, which is given by the function:

X((n1, n2)) = n1 + n2

and (if the dice are fair) the probability function for X is given by:

P(S) = min(S − 1, 13 − S) / 36, for S ∈ {2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12},
where S is the sum of points on the thrown dice.
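The closed-form probability function above can be verified by enumerating all 36 equally likely outcomes:

```python
from fractions import Fraction
from itertools import product

# Enumerate all 36 equally likely outcomes (n1, n2) of two fair dice
# and count how often each sum S = n1 + n2 occurs.
counts = {}
for n1, n2 in product(range(1, 7), repeat=2):
    counts[n1 + n2] = counts.get(n1 + n2, 0) + 1

# The counts reproduce the closed form P(S) = min(S - 1, 13 - S) / 36:
for s in range(2, 13):
    assert Fraction(counts[s], 36) == Fraction(min(s - 1, 13 - s), 36)
```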

Continuous Random Variables

Height of a Random Passer-by

Suppose that in an experiment we must select one person at random (denote this person by ω) from a group of subjects, and let the random variable X express the height of the chosen person. Mathematically, the random variable X is then interpreted as a function y = X(ω) that transforms the subject ω into a number, his or her height y. To calculate the probability that this height falls between 180 cm and 190 cm, or the probability that it exceeds 150 cm, one needs to know the probability distribution of X, which, together with X, allows one to calculate the probabilities of particular outcomes of random experiments.
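If, purely for illustration, we assume heights follow a normal distribution with mean 170 cm and standard deviation 10 cm (these parameters are not from the source), the two probabilities just mentioned can be computed from the distribution function:

```python
from math import erf, sqrt

def normal_cdf(x, mu=170.0, sigma=10.0):
    """Distribution function of a normal law; N(170, 10^2) is an
    assumed, purely illustrative model of human height."""
    return 0.5 * (1.0 + erf((x - mu) / (sigma * sqrt(2.0))))

# P(180 <= X <= 190) and P(X > 150) under the assumed distribution:
p_180_190 = normal_cdf(190) - normal_cdf(180)   # about 0.136
p_above_150 = 1.0 - normal_cdf(150)             # about 0.977
```

Under this assumption roughly 13.6% of passers-by would fall between 180 and 190 cm; with a different assumed distribution the numbers would of course differ.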

Description Methods

A random variable can be described, fully or in part, through its probabilistic properties by means of the distribution function, the probability density, and the characteristic function, which determine the probabilities of its possible values. The distribution function F(x) is the probability that the value of the random variable is less than the real number x. It follows from this definition that the probability of the random variable falling into the half-open interval [a, b) equals F(b) − F(a). The advantage of using the distribution function is that it gives a uniform mathematical description of discrete, continuous, and mixed discrete-continuous random variables. Note, however, that different random variables can have the same distribution function.
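For the coin-payoff example above, the distribution function and the interval rule P(a ≤ X < b) = F(b) − F(a) can be sketched as follows (using the left-continuous convention F(x) = P(X < x) adopted in this article):

```python
def F(x):
    """Distribution function F(x) = P(X < x) of the coin payoff
    (values 10 and -33, each with probability 1/2), written with the
    left-continuous convention F(x) = P(X < x)."""
    if x <= -33:
        return 0.0
    if x <= 10:
        return 0.5
    return 1.0

# P(a <= X < b) = F(b) - F(a):
assert F(10) - F(-33) == 0.5   # only the value -33 lies in [-33, 10)
assert F(11) - F(-34) == 1.0   # both values lie in [-34, 11)
```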

If the random variable is discrete, then a complete and unambiguous mathematical description of its distribution is obtained by specifying the probabilities pk = P(ξ = xk) of all possible values xk of this random variable. As examples, consider the binomial and Poisson distribution laws.

The binomial distribution law describes random variables whose values give the number of "successes" and "failures" when an experiment is repeated n times. In each trial, "success" occurs with probability p and "failure" with probability q = 1 − p. The distribution law in this case is given by the Bernoulli formula:

P(k, n) = C(n, k) · p^k · q^(n−k).

If, as n tends to infinity, the product np remains equal to a constant λ > 0, then the binomial distribution law converges to the Poisson law, which is described by the formula:

p(k) ≡ P(Y = k) = (λ^k / k!) · e^(−λ),

Where

  • the symbol "!" denotes the factorial,
  • e = 2.718281828… is the base of the natural logarithm.
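The convergence of the binomial law to the Poisson law with np = λ held fixed can be observed numerically; a small sketch:

```python
from math import comb, exp, factorial

def binom_pmf(k, n, p):
    """Bernoulli formula: C(n, k) * p**k * (1 - p)**(n - k)."""
    return comb(n, k) * p**k * (1.0 - p)**(n - k)

def poisson_pmf(k, lam):
    """Poisson law: lam**k / k! * exp(-lam)."""
    return lam**k / factorial(k) * exp(-lam)

# Hold np = λ = 3 fixed while n grows large: the binomial probabilities
# approach their Poisson limits term by term.
lam = 3.0
n = 10_000
for k in range(8):
    assert abs(binom_pmf(k, n, lam / n) - poisson_pmf(k, lam)) < 1e-3
```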

Simple Generalizations

A random variable can, generally speaking, take values in any measurable space. In that case it is more often called a random vector or a random element. For example:

  • A measurable function X: Ω → ℝ^n is called an n-dimensional random vector (with respect to the Borel σ-algebra on ℝ^n).
  • A measurable function X: Ω → ℂ^n is called an n-dimensional complex random vector (also with respect to the corresponding Borel σ-algebra).
  • A measurable function mapping a probability space to the space of subsets of some (finite) set is called a (finite) random set.

See also

  • Random process
  • Distribution function
  • Expected value

Notes

  1. ↑ 1 2 Prokhorov Yu. V. Random variable // Mathematical Encyclopedia / Ed. I. M. Vinogradov. - M.: Soviet Encyclopedia, 1985. - Vol. 5. - P. 9. - 623 p.
  2. ↑ Kac M. Statistical Independence in Probability, Analysis and Number Theory. Trans. from English. M., 1963.
  3. ↑ Chebyshev P. L. On mean values // Complete Collected Works. Vol. 2. M.-L., 1947.
  4. ↑ Kolmogorov A. N. Foundations of the Theory of Probability. 2nd ed. M., 1974.
  5. ↑ Feller W. An Introduction to Probability Theory and Its Applications. Trans. from English. 2nd ed. Vol. 1. M., 1967.
  6. ↑ Chernova N. I. Probability Theory. Novosibirsk: Novosibirsk State Univ., 2007.

Literature

  • Gnedenko B. V. A Course in Probability Theory. - 8th ed., rev. and enl. - M.: Editorial URSS, 2005. - 448 p. - ISBN 5-354-01091-8.
  • Mathematical Encyclopedic Dictionary / Ch. ed. Yu. V. Prokhorov. - 2nd ed. - M.: Soviet Encyclopedia, 1998. - 847 p.
  • Tikhonov V. I., Kharisov V. N. Statistical Analysis and Synthesis of Radio Engineering Devices and Systems. Textbook for universities. - M.: Radio i Svyaz, 1991. - 608 p. - ISBN 5-256-00789-0.
  • Chernova N. I. Probability Theory. Tutorial. - Novosibirsk: Novosibirsk State Univ., 2007. - 160 p.

Links

  • Multidimensional Random Variables
Source - https://ru.wikipedia.org/w/index.php?title=Random_value&oldid=99827083


