Mathematical expectation of a discrete random variable (population mean)

2. Basics of probability theory

Expected value

Consider a random variable with numerical values. It is often useful to associate a number with this function: its "mean value", or, as one also says, "average value" or "measure of central tendency". For a number of reasons, some of which will become clear later, the mathematical expectation is usually used as this "average value".

Definition 3. The mathematical expectation of a random variable X is the number

M(X) = X(ω 1)P(ω 1) + X(ω 2)P(ω 2) + … + X(ω N)P(ω N), (4)

where the sum runs over all elementary events ω 1, ω 2, …, ω N; i.e. the mathematical expectation of a random variable is a weighted sum of the values of the random variable with weights equal to the probabilities of the corresponding elementary events.

Example 6. Let us calculate the mathematical expectation of the number that appears on the top face of a die. It follows directly from Definition 3 that

M(X) = 1×(1/6) + 2×(1/6) + … + 6×(1/6) = (1 + 2 + 3 + 4 + 5 + 6)/6 = 3.5.
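This computation is easy to check with a short script (a minimal Python sketch; the variable names are ours):

```python
# Expectation of the top face of a fair die, as in Definition 3:
# a weighted sum over the six equally likely elementary events.
faces = [1, 2, 3, 4, 5, 6]
M = sum(x * (1 / 6) for x in faces)   # each elementary event has probability 1/6
print(M)  # ≈ 3.5
```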

Statement 2. Let the random variable X take the values x 1, x 2, …, x m. Then the equality

M(X) = x 1 P(X = x 1) + x 2 P(X = x 2) + … + x m P(X = x m) (5)

holds, i.e. the mathematical expectation of a random variable is a weighted sum of the values of the random variable with weights equal to the probabilities that the random variable takes these values.

Unlike (4), where the summation runs directly over elementary events, here it runs over the distinct values of the random variable; a random event such as {X = x j} can consist of several elementary events.

Sometimes relation (5) is taken as the definition of mathematical expectation. However, using Definition 3, as shown below, it is easier to establish the properties of the mathematical expectation necessary for constructing probabilistic models of real phenomena than using relation (5).

To prove relation (5), we group in (4) the terms with identical values of the random variable:

M(X) = Σ j Σ {ω: X(ω) = x j} x j P(ω).

Since the constant factor can be taken out of the sign of the sum,

M(X) = Σ j x j Σ {ω: X(ω) = x j} P(ω).

By the definition of the probability of an event,

P(X = x j) = Σ {ω: X(ω) = x j} P(ω).

Using the last two relations, we obtain the required result:

M(X) = Σ j x j P(X = x j), j = 1, 2, …, m.

The concept of mathematical expectation in probability and statistics corresponds to the concept of the center of gravity in mechanics. Place at the points x 1, x 2, …, x m on the number axis the masses P(X = x 1), P(X = x 2), …, P(X = x m) respectively. Then equality (5) shows that the center of gravity of this system of material points coincides with the mathematical expectation, which demonstrates the naturalness of Definition 3.

Statement 3. Let X be a random variable, M(X) its mathematical expectation, and a an arbitrary number. Then

1) M(a) = a; 2) M(X - M(X)) = 0; 3) M[(X - a) 2 ] = M[(X - M(X)) 2 ] + (a - M(X)) 2 .

To prove this, first consider a random variable that is constant, i.e. a function that maps the space of elementary events to a single point a. Since a constant factor can be taken out of the sign of the sum,

M(a) = Σ ω a P(ω) = a Σ ω P(ω) = a×1 = a.

If each term of a sum splits into two terms, then the whole sum splits into two sums, the first made up of the first terms and the second of the second terms. Therefore, the mathematical expectation of the sum of two random variables X + Y defined on the same space of elementary events is equal to the sum of the mathematical expectations M(X) and M(Y) of these random variables:

M(X+Y) = M(X) + M(Y).

Therefore M(X - M(X)) = M(X) - M(M(X)). As shown above, the expectation of a constant equals that constant, so M(M(X)) = M(X). Hence M(X - M(X)) = M(X) - M(X) = 0.

Since (X - a) 2 = ((X - M(X)) + (M(X) - a)) 2 = (X - M(X)) 2 + 2(X - M(X))(M(X) - a) + (M(X) - a) 2 , it follows that M[(X - a) 2 ] = M[(X - M(X)) 2 ] + M[2(X - M(X))(M(X) - a)] + M[(M(X) - a) 2 ]. Let us simplify the last equality. As shown at the beginning of the proof of Statement 3, the mathematical expectation of a constant is the constant itself, so M[(M(X) - a) 2 ] = (M(X) - a) 2 . Since the constant factor can be taken out of the sign of the sum, M[2(X - M(X))(M(X) - a)] = 2(M(X) - a)M(X - M(X)). The right-hand side of the last equality is 0 because, as shown above, M(X - M(X)) = 0. Hence M[(X - a) 2 ] = M[(X - M(X)) 2 ] + (a - M(X)) 2 , which was to be proven.

From the above it follows that M[(X - a) 2 ] attains its minimum over a, equal to M[(X - M(X)) 2 ], at a = M(X), since the second term in equality 3) is always non-negative and equals 0 only for this value of a.
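The identity in item 3) is easy to verify numerically. A small Python sketch (the die distribution is just an illustrative choice of ours):

```python
# Checking identity 3): M[(X - a)^2] = M[(X - M(X))^2] + (a - M(X))^2,
# which shows the mean squared deviation about a is smallest at a = M(X).
values = [1, 2, 3, 4, 5, 6]       # illustrative distribution: a fair die
probs = [1 / 6] * 6

def m(f):
    """Expectation of f(X) as a probability-weighted sum."""
    return sum(f(x) * p for x, p in zip(values, probs))

mx = m(lambda x: x)                    # M(X)
var = m(lambda x: (x - mx) ** 2)       # M[(X - M(X))^2]

for a in [0.0, 2.0, mx, 5.0]:
    lhs = m(lambda x: (x - a) ** 2)
    assert abs(lhs - (var + (a - mx) ** 2)) < 1e-9   # identity holds for every a
```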

Statement 4. Let the random variable X take the values x 1, x 2, …, x m, and let f be some function of a numerical argument. Then

M[f(X)] = f(x 1)P(X = x 1) + f(x 2)P(X = x 2) + … + f(x m)P(X = x m).

To prove this, group on the right-hand side of equality (4), which defines the mathematical expectation, the terms with identical values of the random variable:

M[f(X)] = Σ j Σ {ω: X(ω) = x j} f(x j) P(ω).

Using the fact that the constant factor can be taken out of the sign of the sum, and the definition (2) of the probability of a random event, we obtain

M[f(X)] = Σ j f(x j) P(X = x j).

Q.E.D.

Statement 5. Let X and Y be random variables defined on the same space of elementary events, and let a and b be numbers. Then M(aX + bY) = aM(X) + bM(Y).

Using the definition of the mathematical expectation and the properties of the summation symbol, we obtain a chain of equalities:

M(aX + bY) = Σ ω (aX(ω) + bY(ω))P(ω) = a Σ ω X(ω)P(ω) + b Σ ω Y(ω)P(ω) = aM(X) + bM(Y).

The required has been proven.

The above shows how the mathematical expectation behaves under a change of reference point and of measurement unit (the transition Y = aX + b), as well as under functions of random variables. These results are constantly used in technical and economic analysis, in assessing the financial and economic activities of an enterprise, in converting between currencies in foreign trade calculations, in normative and technical documentation, etc. They allow the same calculation formulas to be used for various scale and shift parameters.
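Statement 5 can be illustrated with a short computation. A hedged Python sketch (the two-dice sample space and the coefficients a, b are our illustrative choices, not from the text):

```python
# Linearity M(aX + bY) = aM(X) + bM(Y) on a shared space of elementary events.
# Illustrative (hypothetical) setup: two dice, X = first face, Y = second face.
from itertools import product

omega = list(product(range(1, 7), repeat=2))  # 36 equally likely outcomes
p = 1 / 36
a, b = 2.0, 3.0

MX = sum(x * p for x, y in omega)             # M(X) = 3.5
MY = sum(y * p for x, y in omega)             # M(Y) = 3.5
M_comb = sum((a * x + b * y) * p for x, y in omega)
assert abs(M_comb - (a * MX + b * MY)) < 1e-9  # 2*3.5 + 3*3.5 = 17.5
```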


Probability theory is a branch of mathematics usually studied at higher educational institutions. Do you like calculations and formulas? Are you not scared by the prospect of getting acquainted with the normal distribution, ensemble entropy, and the mathematical expectation and dispersion of a discrete random variable? Then this subject will be very interesting to you. Let's get acquainted with several of the most important basic concepts of this branch of science.

Let's remember the basics

Even if you remember the simplest concepts of probability theory, do not neglect the first paragraphs of the article. The point is that without a clear understanding of the basics, you will not be able to work with the formulas discussed below.

So, some experiment is carried out and a random event occurs. As a result of the actions we take, we can get several outcomes: some occur more often, others less often. The probability of an event is the ratio of the number of outcomes of one type actually obtained to the total number of possible ones. Only by knowing the classical definition of this concept can you begin to study the mathematical expectation and dispersion of random variables.

Average

Back in school, during math lessons, you started working with the arithmetic mean. This concept is widely used in probability theory, and therefore cannot be ignored. The main thing for us at the moment is that we will encounter it in the formulas for the mathematical expectation and dispersion of a random variable.

We have a sequence of numbers and want to find the arithmetic mean. All that is required is to sum everything up and divide by the number of elements in the sequence. Suppose we have the numbers from 1 to 9. Their sum is 45, and we divide this value by 9. Answer: 5.

Dispersion

In scientific terms, dispersion is the average square of deviations of the obtained values ​​of a characteristic from the arithmetic mean. It is denoted by one capital Latin letter D. What is needed to calculate it? For each element of the sequence, we calculate the difference between the existing number and the arithmetic mean and square it. There will be exactly as many values ​​as there can be outcomes for the event we are considering. Next, we sum up everything received and divide by the number of elements in the sequence. If we have five possible outcomes, then divide by five.

Dispersion also has properties that need to be remembered for solving problems. For example, when a random variable is multiplied by a constant k, the variance is multiplied by k squared (i.e. k×k). The variance is never less than zero and does not change when all values are shifted up or down by an equal amount. Additionally, for independent random variables, the variance of the sum is equal to the sum of the variances.

Now we definitely need to consider examples of the variance of a discrete random variable and the mathematical expectation.

Let's say we ran 21 experiments and got 7 different outcomes. We observed each of them 1, 2, 2, 3, 4, 4 and 5 times, respectively. What will the variance be equal to?

First, let's calculate the arithmetic mean: the sum of the elements, of course, is 21. Divide it by 7, getting 3. Now subtract 3 from each number in the original sequence, square each value, and add the results together. The result is 12. Now all we have to do is divide the number by the number of elements, and, it would seem, that’s all. But there's a catch! Let's discuss it.

Dependence on the number of experiments

It turns out that when calculating variance, the denominator can contain one of two numbers: either N or N-1. Here N is the number of experiments performed or the number of elements in the sequence (which is essentially the same thing). What does this depend on?

Dividing by N gives the variance of the data themselves (the population variance), while dividing by N-1 (Bessel's correction) gives the unbiased estimate of the variance when the data are a sample. For large N the two results are almost identical, which is why a common classroom rule of thumb draws the border, quite symbolically, at the number 30: if we conducted fewer than 30 experiments, we divide the sum by N-1, and if more, by N.

Task

Let's return to our example of calculating the variance and the mathematical expectation. We obtained the intermediate number 12, which needs to be divided by N or N-1. Since we conducted 21 experiments, which is fewer than 30, we choose the second option and divide by the number of elements minus one. So the answer is: the variance is 12 / 6 = 2.
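The whole worked example can be reproduced in a few lines (a Python sketch; the variable names are ours):

```python
# The worked example: the sequence 1, 2, 2, 3, 4, 4, 5 summarizing 21 experiments.
# Sample variance with Bessel's correction (denominator N - 1).
data = [1, 2, 2, 3, 4, 4, 5]
n = len(data)
mean = sum(data) / n                               # 21 / 7 = 3.0
squared_devs = sum((x - mean) ** 2 for x in data)  # 12.0
variance = squared_devs / (n - 1)                  # 12 / 6 = 2.0
print(mean, variance)  # 3.0 2.0
```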

Expected value

Let's move on to the second concept, which we must consider in this article. The mathematical expectation is the result of adding all possible outcomes multiplied by the corresponding probabilities. It is important to understand that the obtained value, as well as the result of calculating the variance, is obtained only once for the entire problem, no matter how many outcomes are considered in it.

The formula for the mathematical expectation is quite simple: take an outcome, multiply it by its probability, add the same for the second, third outcome, and so on. Everything related to this concept is easy to compute. For example, the sum of expected values is equal to the expected value of the sum; the same holds for the product of independent variables. Not every quantity in probability theory allows such simple operations. Let's take a problem and calculate both of the concepts we have studied at once. Besides, we have been distracted by theory; it's time to practice.

One more example

We ran 50 trials and got 10 types of outcomes - numbers from 0 to 9 - appearing in different percentages. These are, respectively: 2%, 10%, 4%, 14%, 2%, 18%, 6%, 16%, 10%, 18%. Recall that to obtain probabilities, you divide the percentage values by 100, giving 0.02, 0.1, and so on. Here is an example of solving the problem for the variance of a random variable and the mathematical expectation.

We calculate the arithmetic mean of the outcome counts using the formula we remember from elementary school: 50/10 = 5.

Now let’s convert the probabilities into the number of outcomes “in pieces” to make it easier to count. We get 1, 5, 2, 7, 1, 9, 3, 8, 5 and 9. From each value obtained, we subtract the arithmetic mean, after which we square each of the results obtained. See how to do this using the first element as an example: 1 - 5 = (-4). Next: (-4) * (-4) = 16. For other values, do these operations yourself. If you did everything correctly, then after adding them all up you will get 90.

Let's continue calculating the variance and expected value by dividing 90 by N. Why do we choose N rather than N-1? Because the number of experiments performed exceeds 30 (and with 50 trials the difference between the two divisors is small anyway). So: 90/10 = 9. We have the variance. If you get a different number, don't despair; most likely you made a simple arithmetic slip. Double-check what you wrote, and everything will probably fall into place.

Finally, recall the formula for the mathematical expectation. We will not give all the calculations; we will only state the answer you can check against after completing all the required steps. The expected value is 5.48. Let us only recall how to carry out the operations, using the first elements as an example: 0×0.02 + 1×0.1 + … and so on. As you can see, we simply multiply each outcome value by its probability.
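Both answers of this example can be checked with a short script (a Python sketch; variable names are ours):

```python
# The 50-trial example: outcomes 0..9 with the probabilities given in the text.
probs = [0.02, 0.10, 0.04, 0.14, 0.02, 0.18, 0.06, 0.16, 0.10, 0.18]
counts = [round(p * 50) for p in probs]   # 1, 5, 2, 7, 1, 9, 3, 8, 5, 9

# Expectation: sum of outcome values times their probabilities.
expectation = sum(x * p for x, p in enumerate(probs))      # ≈ 5.48

# Variance of the count sequence about its mean 5, divided by N = 10, as in the text.
mean_count = sum(counts) / len(counts)
variance = sum((c - mean_count) ** 2 for c in counts) / len(counts)  # 9.0
```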

Deviation

Another concept closely related to dispersion and mathematical expectation is the standard deviation. It is denoted either by the Latin letters sd or by the lowercase Greek letter sigma (σ). It shows how much, on average, the values deviate from the mean. To find it, compute the square root of the variance.

If you plot a normal distribution curve, the standard deviation can be seen directly on it: it is the distance along the horizontal axis from the center of the distribution (the mean, which for a normal curve coincides with the mode) to the inflection point of the curve, where the curve changes from convex to concave.

Software

As can be seen from the descriptions of the formulas and the examples presented, calculating variance and mathematical expectation is not the simplest procedure from an arithmetic point of view. In order not to waste time, it makes sense to use the software used in higher education institutions: the language R. It has functions for computing many quantities from statistics and probability theory.

For example, you specify a vector of values. This is done as follows: vector <- c(1,5,2…). Then, whenever you need to compute something for this vector, you write a function and pass the vector as its argument. To find the variance, use the function var, for example var(vector); then you just press Enter and get the result. Note that R's var divides by N-1, i.e. it computes the sample variance.
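If R is not at hand, the Python standard library offers the same calculations (a sketch; the two variance functions correspond to the two divisors discussed above, and the data vector is the count sequence from the earlier example):

```python
# A standard-library Python counterpart of the R session above.
# statistics.pvariance divides by N, statistics.variance by N - 1 (like R's var).
import statistics

vector = [1, 5, 2, 7, 1, 9, 3, 8, 5, 9]   # the counts from the earlier example
print(statistics.mean(vector))        # mean is 5
print(statistics.pvariance(vector))   # population variance: 90 / 10 = 9
print(statistics.variance(vector))    # sample variance: 90 / 9 = 10
```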

Finally

Dispersion and mathematical expectation are basic quantities without which it is difficult to calculate anything further. In the main lecture course at universities they are covered in the first months of studying the subject. It is precisely the lack of understanding of these simple concepts and the inability to calculate them that causes many students to fall behind immediately and later receive bad grades at the end of the term, depriving them of scholarships.

Practice for at least one week, half an hour a day, solving tasks similar to those presented in this article. Then, on any test in probability theory, you will be able to cope with the examples without extraneous tips and cheat sheets.

The most complete characteristic of a random variable is its distribution law. However, it is not always known, and in such cases one has to be content with less information. Such information may include: the range of variation of the random variable, its largest (smallest) value, and other characteristics that describe the random variable in some summary way. All such quantities are called numerical characteristics of a random variable. These are non-random numbers that characterize the random variable in one way or another. The main purpose of numerical characteristics is to express in concise form the most significant features of a particular distribution.

The simplest numerical characteristic of a random variable X is its expected value:

M(X)=x 1 p 1 +x 2 p 2 +…+x n p n. (1.3.1)

Here x 1, x 2, …, x n are the possible values of the random variable X, and p 1, p 2, …, p n are their probabilities.

Example 1. Find the mathematical expectation of a random variable if its distribution law is known:

X 2 3 5
p 0.3 0.1 0.6

Solution. M(X)=2×0.3+3×0.1+5×0.6=3.9.

Example 2. Find the mathematical expectation of the number of occurrences of an event A in one trial, if the probability of this event is equal to p.

Solution. If X is the number of occurrences of the event A in one trial, then, obviously, the distribution law of X has the form:

X 0 1
p 1 - p p

Then M(X)=0×(1-p)+1×p=p.

So: the mathematical expectation of the number of occurrences of an event in one trial is equal to its probability.

Probabilistic meaning of mathematical expectation

Suppose n trials are performed, in which the random variable X took the value x 1 m 1 times, the value x 2 m 2 times, …, the value x k m k times. Then the sum of all values in the n trials is equal to:

x 1 m 1 +x 2 m 2 +…+x k m k.

Let's find the arithmetic mean of all values taken by the random variable:

(x 1 m 1 +x 2 m 2 +…+x k m k)/n = x 1 (m 1 /n)+x 2 (m 2 /n)+…+x k (m k /n).

The ratios m i /n are the relative frequencies of occurrence of the values x i (i=1, …, k). If n is large enough (n→∞), these frequencies are approximately equal to the probabilities: m i /n ≈ p i. But then

(x 1 m 1 +x 2 m 2 +…+x k m k)/n ≈ x 1 p 1 +x 2 p 2 +…+x k p k =M(X).

Thus, the mathematical expectation is approximately equal (the more accurately, the greater the number of tests) to the arithmetic mean of the observed values ​​of the random variable. This is the probabilistic meaning of mathematical expectation.
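This probabilistic meaning is easy to see in a simulation (a Python sketch; the die and the fixed seed are our illustrative choices):

```python
# Simulation of the probabilistic meaning: for large n the arithmetic mean of
# observed values approaches M(X). Illustration: rolls of a fair die, M(X) = 3.5.
import random

random.seed(42)   # fixed seed, so the run is reproducible
n = 100_000
rolls = [random.randint(1, 6) for _ in range(n)]
sample_mean = sum(rolls) / n
assert abs(sample_mean - 3.5) < 0.05   # close to the expectation
```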

Properties of mathematical expectation

1. The mathematical expectation of a constant is equal to the constant itself.

M(C)=C×1=C.

2. The constant factor can be taken out of the mathematical expectation sign

M(CX)=C×M(X).

Proof. Let the distribution law of X be given by the table:

X x 1 x 2 … x n
p p 1 p 2 … p n

Then the random variable CX takes the values Cx 1, Cx 2, …, Cx n with the same probabilities, i.e. the distribution law of CX has the form:

CX Cx 1 Cx 2 … Cx n
p p 1 p 2 … p n

M(CX)=Cx 1 ×p 1 +Cx 2 ×p 2 +…+Cx n ×p n =

=C(x 1 p 1 +x 2 p 2 +…+x n p n)=CM(X).

3. The mathematical expectation of the product of two independent random variables is equal to the product of their mathematical expectations:

M(XY)=M(X)×M(Y).

This statement is given without proof (the proof is based on the definition of mathematical expectation).

Corollary. The mathematical expectation of the product of several mutually independent random variables is equal to the product of their mathematical expectations.

In particular, for three independent random variables

M(XYZ)=M(X)×M(Y)×M(Z).

Example. Find the mathematical expectation of the product of the numbers of points that can appear when throwing two dice.

Solution. Let X i be the number of points on the i-th die. It can take the values 1, 2, …, 6, each with probability 1/6. Then

M(X i)=1×(1/6)+2×(1/6)+…+6×(1/6)=(1+2+…+6)/6=21/6=3.5.

Let X=X 1 ×X 2. Then

M(X)=M(X 1)×M(X 2)=3.5×3.5=12.25.
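The same answer comes out of a direct computation over all 36 equally likely outcomes (a Python sketch):

```python
# Direct check: M(X1*X2) over all 36 equally likely pairs equals 3.5 * 3.5 = 12.25.
from itertools import product

M_prod = sum(x * y * (1 / 36) for x, y in product(range(1, 7), repeat=2))
print(M_prod)  # ≈ 12.25
```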

4. The mathematical expectation of the sum of two random variables (independent or dependent) is equal to the sum of the mathematical expectations of the terms:

M(X+Y)=M(X)+M(Y).

This property is generalized to the case of an arbitrary number of terms.

Example. Three shots are fired, with probabilities of hitting the target equal to p 1 =0.4, p 2 =0.3 and p 3 =0.6. Find the mathematical expectation of the total number of hits.

Solution. Let X i be the number of hits on the i-th shot. Then

M(X i)=1×p i +0×(1-p i)=p i.

Thus,

M(X 1 +X 2 +X 3)=M(X 1)+M(X 2)+M(X 3)=0.4+0.3+0.6=1.3.
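A one-line check of this example (a Python sketch):

```python
# The three-shot example: X_i is the indicator of a hit on shot i, M(X_i) = p_i,
# so the expected total number of hits is p_1 + p_2 + p_3.
ps = [0.4, 0.3, 0.6]
expected_hits = sum(ps)
print(expected_hits)  # ≈ 1.3
```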

Characteristics of DSVs and their properties. Expectation, variance, standard deviation

The distribution law fully characterizes the random variable. However, when it is impossible to find the distribution law, or this is not required, you can limit yourself to finding values ​​called numerical characteristics of a random variable. These values ​​determine some average value around which the values ​​of the random variable are grouped, and the degree to which they are scattered around this average value.

The mathematical expectation of a discrete random variable is the sum of the products of all possible values of the random variable and their probabilities:

M(X)=x 1 p 1 +x 2 p 2 +… .

The mathematical expectation exists if the series on the right side of the equality converges absolutely.

From the point of view of probability, we can say that the mathematical expectation is approximately equal to the arithmetic mean of the observed values ​​of the random variable.

Example. The distribution law of a discrete random variable is known. Find the mathematical expectation.

X x 1 x 2 x 3 x 4
p 0.2 0.3 0.1 0.4

Solution: M(X)=x 1 ×0.2+x 2 ×0.3+x 3 ×0.1+x 4 ×0.4.

9.2 Properties of mathematical expectation

1. The mathematical expectation of a constant value is equal to the constant itself.

2. The constant factor can be taken out of the sign of the mathematical expectation.

3. The mathematical expectation of the product of two independent random variables is equal to the product of their mathematical expectations.

This property is true for an arbitrary number of random variables.

4. The mathematical expectation of the sum of two random variables is equal to the sum of the mathematical expectations of the terms.

This property is also true for an arbitrary number of random variables.

Let n independent trials be performed, the probability of occurrence of event A in which is equal to p.

Theorem. The mathematical expectation M(X) of the number of occurrences of event A in n independent trials is equal to the product of the number of trials and the probability of the occurrence of the event in each trial.

Example. Find the mathematical expectation of the random variable Z if the mathematical expectations of X and Y are known: M(X)=3, M(Y)=2, Z=2X+3Y.

Solution: M(Z)=M(2X+3Y)=2M(X)+3M(Y)=2×3+3×2=12.

9.3 Dispersion of a discrete random variable

However, the mathematical expectation cannot fully characterize the random process. In addition to the mathematical expectation, it is necessary to enter a value that characterizes the deviation of the values ​​of the random variable from the mathematical expectation.

This deviation is equal to the difference between the random variable and its mathematical expectation. In this case, the mathematical expectation of the deviation is zero. This is explained by the fact that some possible deviations are positive, others are negative, and as a result of their mutual cancellation, zero is obtained.



The dispersion (scattering) of a discrete random variable is the mathematical expectation of the squared deviation of the random variable from its mathematical expectation:

D(X)=M[(X-M(X)) 2 ].

In practice, this method of calculating variance is inconvenient, because leads to cumbersome calculations for a large number of random variable values.

Therefore, another method is used.

Theorem. The variance is equal to the difference between the mathematical expectation of the square of the random variable X and the square of its mathematical expectation: D(X)=M(X 2)-M 2 (X).

Proof. Taking into account that the mathematical expectation M(X) and the square of the mathematical expectation M 2 (X) are constant quantities, we can write:

D(X)=M[(X-M(X)) 2 ]=M[X 2 -2X×M(X)+M 2 (X)]=M(X 2)-2M(X)×M(X)+M 2 (X)=M(X 2)-M 2 (X).
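The shortcut formula is easy to verify numerically, e.g. on the distribution used in Example 1 earlier, M(X)=2×0.3+3×0.1+5×0.6=3.9 (a Python sketch):

```python
# Numerical check of D(X) = M(X^2) - M(X)^2 on the distribution from Example 1
# above (values 2, 3, 5 with probabilities 0.3, 0.1, 0.6).
xs = [2, 3, 5]
ps = [0.3, 0.1, 0.6]

mx = sum(x * p for x, p in zip(xs, ps))                  # M(X) = 3.9
mx2 = sum(x * x * p for x, p in zip(xs, ps))             # M(X^2) = 17.1
d_def = sum((x - mx) ** 2 * p for x, p in zip(xs, ps))   # definition of D(X)
d_shortcut = mx2 - mx ** 2                               # shortcut formula
assert abs(d_def - d_shortcut) < 1e-9                    # both give 1.89
```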

Example. Find the variance of a discrete random variable given by the distribution law.

X x 1 x 2 x 3 x 4
X 2 x 1 2 x 2 2 x 3 2 x 4 2
p 0.2 0.3 0.1 0.4

Solution: D(X)=M(X 2)-M 2 (X)=0.2x 1 2 +0.3x 2 2 +0.1x 3 2 +0.4x 4 2 -(0.2x 1 +0.3x 2 +0.1x 3 +0.4x 4) 2 .

9.4 Dispersion properties

1. The variance of a constant value is zero: D(C)=0.

2. The constant factor can be taken out of the dispersion sign by squaring it: D(CX)=C 2 D(X).

3. The variance of the sum of two independent random variables is equal to the sum of the variances of these variables: D(X+Y)=D(X)+D(Y).

4. The variance of the difference between two independent random variables is equal to the sum of the variances of these variables: D(X-Y)=D(X)+D(Y).

Theorem. The variance of the number of occurrences of event A in n independent trials, in each of which the probability p of the occurrence of the event is constant, is equal to the product of the number of trials by the probabilities of the occurrence and non-occurrence of the event in each trial: D(X)=npq, where q=1-p.

9.5 Standard deviation of a discrete random variable

The standard deviation of a random variable X is the square root of its variance: σ(X)=√D(X).

Theorem. The standard deviation of the sum of a finite number of mutually independent random variables is equal to the square root of the sum of the squares of the standard deviations of these variables.

Basic numerical characteristics of random variables

The distribution law (density) fully characterizes a random variable. But often it is unknown, and one has to make do with less information. Sometimes it is even more convenient to use numbers that describe the random variable in summary form. Such numbers are called numerical characteristics of a random variable. Let's look at the main ones.

Definition: The mathematical expectation M(X) of a discrete random variable is the sum of the products of all possible values of this variable and their probabilities:

M(X)=x 1 p 1 +x 2 p 2 +…+x n p n.

If a discrete random variable X takes countably many possible values, then

M(X)=x 1 p 1 +x 2 p 2 +…+x n p n +…

Moreover, the mathematical expectation exists if this series converges absolutely.

From the definition it follows that M(X) of a discrete random variable is a non-random (constant) quantity.

Example: Let X be the number of occurrences of the event A in one trial, with P(A) = p. We need to find the mathematical expectation of X.

Solution: Let's write the distribution law of X as a table:

X 0 1
P 1 - p p

Let's find the mathematical expectation:

M(X)=0×(1 - p)+1×p=p.

Thus, the mathematical expectation of the number of occurrences of an event in one trial is equal to the probability of this event.

The origin of the term expected value is associated with the initial period of the emergence of probability theory (16th-17th centuries), when its scope of application was limited to gambling. The player was interested in the average value of the expected win, i.e. the mathematical expectation of winning.

Let's consider probabilistic meaning of mathematical expectation.

Suppose n trials are performed, in which the random variable X took the value x 1 m 1 times, the value x 2 m 2 times, and so on, and finally took the value x k m k times, where m 1 + m 2 +…+ m k = n.

Then the sum of all values taken by the random variable X equals x 1 m 1 +x 2 m 2 +…+x k m k.

The arithmetic mean of all values taken by the random variable X equals:

(x 1 m 1 +x 2 m 2 +…+x k m k)/n = x 1 (m 1 /n)+x 2 (m 2 /n)+…+x k (m k /n),

since m i /n is the relative frequency of the value x i for every i = 1, …, k.

As is known, if the number of trials n is sufficiently large, then the relative frequency is approximately equal to the probability of the event occurring: m i /n ≈ p i. Therefore,

(x 1 m 1 +x 2 m 2 +…+x k m k)/n ≈ x 1 p 1 +x 2 p 2 +…+x k p k = M(X).

Conclusion:The mathematical expectation of a discrete random variable is approximately equal (the more accurately, the greater the number of tests) to the arithmetic mean of the observed values ​​of the random variable.

Let's consider the basic properties of mathematical expectation.

Property 1: The mathematical expectation of a constant value is equal to the constant value itself:

M(C) = C.

Proof: The constant C can be considered a discrete random variable that has one possible value C and takes it with probability p = 1. Hence, M(C) = C×1 = C.



Let's define the product of a constant C and a discrete random variable X as the discrete random variable CX whose possible values are the products of the constant C and the possible values of X, and whose probabilities are the probabilities of the corresponding possible values of X:

CX Cx 1 Cx 2 … Cx n
p p 1 p 2 … p n

Property 2:The constant factor can be taken out of the mathematical expectation sign:

M(CX) = CM(X).

Proof: Let the random variable X be given by the probability distribution law:

X x 1 x 2 … x n
P p 1 p 2 … p n

Let's write the probability distribution law of the random variable CX:

CX Cx 1 Cx 2 … Cx n
P p 1 p 2 … p n

M(CX) = Cx 1 p 1 + Cx 2 p 2 + … + Cx n p n = C(x 1 p 1 + x 2 p 2 + … + x n p n) = C×M(X).

Definition:Two random variables are called independent if the distribution law of one of them does not depend on what possible values ​​the other variable took. Otherwise, the random variables are dependent.

Definition:Several random variables are said to be mutually independent if the distribution laws of any number of them do not depend on what possible values ​​the remaining variables took.

Let's define product of independent discrete random variables X and Y as a discrete random variable XY, the possible values ​​of which are equal to the products of each possible value X for every possible value Y. Probabilities of possible values XY are equal to the products of the probabilities of possible values ​​of the factors.

Let the distributions of the random variables X and Y be given (for brevity, with two possible values each):

X x 1 x 2
P p 1 p 2
Y y 1 y 2
q q 1 q 2

Then the distribution of the random variable XY has the form:

XY x 1 y 1 x 2 y 1 x 1 y 2 x 2 y 2
P p 1 q 1 p 2 q 1 p 1 q 2 p 2 q 2

Some of the products may be equal. In this case, the probability of that possible value of the product is equal to the sum of the corresponding probabilities. For example, if x 1 y 2 = x 2 y 1, then the probability of this value is p 1 q 2 + p 2 q 1.

Property 3:The mathematical expectation of the product of two independent random variables is equal to the product of their mathematical expectations:

M(XY) = M(X) M(Y).

Proof: Let the independent random variables X and Y be specified by their probability distribution laws:

X x 1 x 2
P p 1 p 2
Y y 1 y 2
q q 1 q 2

To simplify the calculations, we limit ourselves to a small number of possible values. In the general case the proof is similar.

Let's write the distribution law of the random variable XY:

XY x 1 y 1 x 2 y 1 x 1 y 2 x 2 y 2
P p 1 q 1 p 2 q 1 p 1 q 2 p 2 q 2

M(XY) = x 1 y 1 p 1 q 1 + x 2 y 1 p 2 q 1 + x 1 y 2 p 1 q 2 + x 2 y 2 p 2 q 2 = y 1 q 1 (x 1 p 1 + x 2 p 2) + y 2 q 2 (x 1 p 1 + x 2 p 2) =

= (x 1 p 1 + x 2 p 2)(y 1 q 1 + y 2 q 2) = M(X)×M(Y).

Corollary: The mathematical expectation of the product of several mutually independent random variables is equal to the product of their mathematical expectations.

Proof: Let us prove it for three mutually independent random variables X, Y, Z. The random variables XY and Z are independent, so we get:

M(XYZ) = M((XY)Z) = M(XY)×M(Z) = M(X)×M(Y)×M(Z).

For an arbitrary number of mutually independent random variables, the proof is carried out by the method of mathematical induction.

Example: The independent random variables X and Y are given by the distribution laws:

X 5 2 4
P 0.6 0.1 0.3
Y 7 9
q 0.8 0.2

We need to find M(XY).

Solution: Since the random variables X and Y are independent, M(XY)=M(X)×M(Y)=(5×0.6+2×0.1+4×0.3)×(7×0.8+9×0.2)=4.4×7.4=32.56.
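The same example can be verified by building the joint distribution explicitly (a Python sketch; variable names are ours):

```python
# The example above: independent X (values 5, 2, 4) and Y (values 7, 9).
# Build the joint distribution and check M(XY) = M(X) * M(Y).
xs, px = [5, 2, 4], [0.6, 0.1, 0.3]
ys, py = [7, 9], [0.8, 0.2]

MX = sum(x * p for x, p in zip(xs, px))   # 4.4
MY = sum(y * q for y, q in zip(ys, py))   # 7.4
MXY = sum(x * y * p * q
          for x, p in zip(xs, px)
          for y, q in zip(ys, py))        # joint probabilities are products p*q
assert abs(MXY - MX * MY) < 1e-9          # ≈ 32.56
```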

Let's define the sum of discrete random variables X and Y as the discrete random variable X+Y whose possible values are the sums of each possible value of X with each possible value of Y. The probabilities of the possible values of X+Y for independent random variables X and Y are equal to the products of the probabilities of the terms; for dependent random variables, to the product of the probability of one term by the conditional probability of the other.

If x 1 + y 2 = x 2 + y 1 and the probabilities of these values are p 12 and p 21 respectively, then the probability of x 1 + y 2 (equal to the probability of x 2 + y 1) is p 12 + p 21.

Property 4:The mathematical expectation of the sum of two random variables (dependent or independent) is equal to the sum of the mathematical expectations of the terms:

M(X+Y) = M(X) + M(Y).

Proof: Let two random variables X and Y be given by the following distribution laws:

X x 1 x 2
P p 1 p 2
Y y 1 y 2
q q 1 q 2

To simplify the conclusion, we will limit ourselves to two possible values ​​of each quantity. In the general case the proof is similar.

Let's compose all possible values of the random variable X+Y (assume, for simplicity, that these values are different; if not, the proof is similar); denote by p ij the probability that X takes the value x i and Y takes the value y j:

X+Y x 1 +y 1 x 1 +y 2 x 2 +y 1 x 2 +y 2
P p 11 p 12 p 21 p 22

Let's find the mathematical expectation of this variable.

M(X+Y) = (x 1 +y 1)p 11 + (x 1 +y 2)p 12 + (x 2 +y 1)p 21 + (x 2 +y 2)p 22 =

= x 1 (p 11 + p 12) + x 2 (p 21 + p 22) + y 1 (p 11 + p 21) + y 2 (p 12 + p 22).

Let's prove that p 11 + p 12 = p 1.

The event X = x 1 (its probability is P(X = x 1) = p 1) entails the event that the random variable X+Y takes the value x 1 +y 1 or x 1 +y 2 (the probability of this event, by the addition theorem, is p 11 + p 12), and vice versa. Then p 11 + p 12 = p 1.

The equalities p 21 + p 22 = p 2, p 11 + p 21 = q 1 and p 12 + p 22 = q 2 are proved similarly.

Substituting the right-hand sides of these equalities into the resulting formula for the mathematical expectation, we obtain:

M(X + Y) = (x 1 p 1 + x 2 p 2) + (y 1 q 1 + y 2 q 2) = M(X) + M(Y).

Corollary: The mathematical expectation of the sum of several random variables is equal to the sum of the mathematical expectations of the terms.

Proof: Let us prove it for three random variables X, Y, Z. Applying Property 4 to X+Y and Z, we find:

M(X+Y+Z) = M((X+Y)+Z) = M(X+Y)+M(Z) = M(X)+M(Y)+M(Z).

For an arbitrary number of random variables, the proof is carried out by the method of mathematical induction.

Example: Find the average of the sum of the number of points that can be obtained when throwing two dice.

Solution: Let X be the number of points that can appear on the first die, Y on the second. Obviously, the random variables X and Y have the same distribution. Let's write the distribution data of X and Y in one table:

X 1 2 3 4 5 6
Y 1 2 3 4 5 6
P 1/6 1/6 1/6 1/6 1/6 1/6

M(X) = M(Y) = (1/6)(1+2+3+4+5+6) = 21/6 = 3.5

M(X + Y) = M(X) + M(Y) = 3.5 + 3.5 = 7.

So, the average value of the sum of the number of points that can appear when throwing two dice is 7 .
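A direct check over all 36 equally likely outcomes (a Python sketch):

```python
# Direct check: M(X + Y) over all 36 equally likely pairs equals 7.
from itertools import product

M_sum = sum((x + y) * (1 / 36) for x, y in product(range(1, 7), repeat=2))
print(M_sum)  # ≈ 7.0
```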

Theorem:The mathematical expectation M(X) of the number of occurrences of event A in n independent trials is equal to the product of the number of trials and the probability of the occurrence of the event in each trial: M(X) = np.

Proof: Let X be the number of occurrences of the event A in n independent trials. Obviously, the total number X of occurrences of the event A in these trials is the sum of the numbers of occurrences of the event in the individual trials. Then, if X 1 is the number of occurrences of the event in the first trial, X 2 in the second, and so on, and finally X n is the number of occurrences of the event in the n-th trial, then the total number of occurrences of the event is given by:

X = X 1 + X 2 + … + X n.

By property 4 of mathematical expectation we have:

M(X) = M(X 1) + M(X 2) + … + M(X n).

Since the mathematical expectation of the number of occurrences of an event in one trial is equal to the probability of the event, then

M(X 1) = M(X 2) = … = M(X n) = p.

Hence, M(X) = np.

Example: The probability of hitting the target when firing a gun is p = 0.6. Find the average number of hits if 10 shots are fired.

Solution: A hit on each shot does not depend on the outcomes of the other shots, so the events under consideration are independent and the required mathematical expectation equals:

M(X) = np = 10×0.6 = 6.

So the average number of hits is 6.
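The theorem M(X) = np can also be checked against the exact binomial distribution (a Python sketch):

```python
# Exact check of M(X) = n*p for the number of hits in n independent shots,
# using the binomial distribution P(X = k) = C(n, k) p^k (1-p)^(n-k).
from math import comb

n, p = 10, 0.6
M = sum(k * comb(n, k) * p ** k * (1 - p) ** (n - k) for k in range(n + 1))
assert abs(M - n * p) < 1e-9   # ≈ 6.0
```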

Now consider the mathematical expectation of a continuous random variable.

Definition: The mathematical expectation of a continuous random variable X whose possible values belong to the interval [a, b] is the definite integral:

M(X) = ∫[a, b] x f(x) dx,

where f(x) is the probability distribution density.

If the possible values of the continuous random variable X belong to the entire Ox axis, then

M(X) = ∫[-∞, +∞] x f(x) dx.

It is assumed that this improper integral converges absolutely, i.e. that the integral ∫[-∞, +∞] |x| f(x) dx converges. If this requirement were not met, the value of the integral would depend on the rates at which (separately) the lower limit tends to -∞ and the upper limit tends to +∞.

It can be proven that all properties of the mathematical expectation of a discrete random variable are preserved for a continuous random variable. The proof is based on the properties of definite and improper integrals.

Obviously, the mathematical expectation M(X) is greater than the smallest and less than the largest possible value of the random variable X. That is, on the number axis, the possible values of the random variable lie both to the left and to the right of its mathematical expectation. In this sense, the mathematical expectation M(X) characterizes the location of the distribution and is therefore often called the distribution center.
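For a concrete continuous case, the defining integral can be approximated numerically. A Python sketch using the midpoint rule, with the uniform density on [0, 1] as our illustrative choice (its mean is 0.5):

```python
# Midpoint-rule approximation of M(X) = integral of x * f(x) dx for a
# continuous random variable. Illustration: uniform density f(x) = 1 on [0, 1].
def expectation(f, a, b, steps=100_000):
    h = (b - a) / steps
    midpoints = (a + (i + 0.5) * h for i in range(steps))  # subinterval centers
    return sum(x * f(x) * h for x in midpoints)

M = expectation(lambda x: 1.0, 0.0, 1.0)
assert abs(M - 0.5) < 1e-6   # the mean of the uniform distribution on [0, 1]
```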