
Probability inequalities (Chebyshev, Markov, Jensen), CSIR-NET Mathematical Sciences

The Markov and Chebyshev inequalities

As you have probably seen on today’s front page: the upper tenth percentile earns 12 times more than the average salary. The following theorem will show that this is not possible.

Theorem 6.1 (Markov inequality) Let X be a random variable assuming non-negative values. Then for every a > 0,

P(X ≥ a) ≤ E[X]/a.

Comment: Note first that this is a vacuous statement for a ≤ E[X], since then the bound is at least one. For a > E[X] this inequality limits the probability that X assumes values larger than its mean.
This is the first time in this course that we derive an inequality. Inequalities, in general, are an important tool for analysis, where estimates (rather than exact identities) are needed.

Proof : We will assume a continuous variable. A similar proof holds in the discrete case.

E[X] = ∫₀^∞ x fX(x) dx ≥ ∫ₐ^∞ x fX(x) dx ≥ a ∫ₐ^∞ fX(x) dx = a P(X ≥ a),

and dividing by a > 0 gives the result.
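The opening salary claim can now be settled numerically: Markov’s inequality gives P(X ≥ 12 E[X]) ≤ 1/12 ≈ 0.083 < 0.10, so a full tenth of the population cannot all earn 12 times the average. A minimal sketch, using a hypothetical lognormal salary model purely for illustration (the bound itself assumes nothing about the distribution):

```python
import math
import random

random.seed(0)

# Hypothetical salary model (lognormal) -- an illustration only;
# Markov's inequality holds for ANY non-negative distribution.
salaries = [math.exp(random.gauss(0.0, 1.0)) for _ in range(200_000)]
mean = sum(salaries) / len(salaries)

# Fraction of the population earning at least 12 times the average.
frac = sum(s >= 12 * mean for s in salaries) / len(salaries)

markov_bound = 1 / 12  # P(X >= 12 E[X]) <= E[X] / (12 E[X]) = 1/12
print(frac, markov_bound)
```

Since 1/12 < 0.10, no non-negative distribution can place a full tenth of its mass at or above 12 times its mean.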

Theorem 6.2 (Chebyshev inequality) Let X be a random variable with mean value µ and variance σ2. Then, for every a > 0

P(|X − µ| ≥ a) ≤ σ²/a².

Comment: If we write a = kσ, this theorem states that the probability that a random variable assumes a value whose absolute distance from its mean is more than k times its standard deviation is less than 1/k².
The notable thing about these inequalities is that they make no assumption about the distribution. As a result, of course, they may provide very loose estimates.

Proof : Since |X − µ|² is a non-negative random variable we may apply the Markov inequality,

P(|X − µ| ≥ a) = P(|X − µ|² ≥ a²) ≤ E[|X − µ|²]/a² = σ²/a².

Example: On average, an alcoholic drinks 25 liters of wine every week. What is the probability that he drinks more than 50 liters of wine on a given week?
Here we apply the Markov inequality. If X is the amount of wine he drinks on a given week, then

P(X ≥ 50) ≤ E[X]/50 = 25/50 = 1/2.
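The bound 1/2 holds for any non-negative distribution with mean 25. To see how loose it can be, a small sketch comparing it with the exact tail under an assumed Exponential distribution with mean 25 (an assumption made only for illustration):

```python
import math
import random

random.seed(1)
mean = 25.0

# Markov: P(X >= 50) <= E[X]/50 = 1/2, for ANY non-negative X with mean 25.
markov_bound = mean / 50

# Assumed model: X ~ Exponential with mean 25, so P(X >= 50) = exp(-50/25).
exact_tail = math.exp(-50 / mean)

# Monte Carlo check of the assumed model against both numbers.
n = 100_000
tail = sum(random.expovariate(1 / mean) >= 50 for _ in range(n)) / n
print(tail, exact_tail, markov_bound)
```

The simulated tail sits near exp(−2) ≈ 0.135, well below the distribution-free Markov bound of 0.5.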

Example: Let X ∼ U(0, 10). What is the probability that |X − E[X]| > 4?
Since E[X] = 5, the answer is 0.2. The Chebyshev inequality, on the other hand, gives

Var(X) = 10²/12 = 25/3, hence P(|X − 5| > 4) ≤ (25/3)/4² = 25/48 ≈ 0.52,

a much looser estimate than the exact value.
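The gap between the exact value 0.2 and the Chebyshev bound 25/48 ≈ 0.52 is easy to confirm numerically; a minimal sketch:

```python
import random

random.seed(2)
n = 200_000

# X ~ U(0, 10): E[X] = 5, Var(X) = 10²/12 = 25/3.
hits = sum(abs(random.uniform(0, 10) - 5) > 4 for _ in range(n)) / n

chebyshev_bound = (25 / 3) / 4**2  # 25/48 ≈ 0.52
print(hits, chebyshev_bound)  # exact probability is 0.2, well under the bound
```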


Jensen’s inequality 

Recall that a function f : R → R is called convex if it is always below its secants, i.e., if for every x, y and 0 < λ < 1,

f (λ x + (1 − λ)y) ≤ λ f ( x) + (1 − λ) f (y).

Proposition 6.1 (Jensen’s inequality) If g is a convex real valued function and X is a real valued random variable, then

g(E[X ]) ≤ E[g(X )],

provided that the expectations exist.

Proof : Let’s start with an easy proof for a particular case. Consider a continuous random variable and assume that g is twice differentiable (therefore g''(x) ≥ 0).
Taylor expanding g about µ = E[X],

g(x) = g(µ) + g'(µ)(x − µ) + ½ g''(ξ)(x − µ)² ≥ g(µ) + g'(µ)(x − µ),

where ξ is some point between µ and x.

Multiplying both sides by the non-negative density fX(x) and integrating over x we get

E[g(X)] ≥ g(µ) + g'(µ) E[X − µ] = g(µ) = g(E[X]),

which is precisely what we need to show.

What about the more general case? Any convex function is continuous, and has one-sided derivatives with

g'₋(x) ≤ g'₊(x).

For every m ∈ [g'₋(µ), g'₊(µ)],

g( x) ≥ g(µ) + m( x − µ),

so the same proof holds with m replacing g'(µ). If X is a discrete variable, we use summation instead of integration.

Example: Since exp is a convex function,

exp(t E[X]) ≤ E[e^{tX}] = MX(t),

or,

E[X] ≤ (1/t) log MX(t),

for all t > 0.
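For a standard normal X this inequality can be checked in closed form: E[X] = 0 and MX(t) = e^{t²/2}, so exp(t E[X]) = 1 ≤ e^{t²/2} for all t. A sketch that also verifies it by simulation:

```python
import math
import random

random.seed(3)
t = 0.7
n = 100_000

xs = [random.gauss(0.0, 1.0) for _ in range(n)]
mean = sum(xs) / n

# Jensen with g = exp: exp(t E[X]) <= E[exp(tX)] = M_X(t).
lhs = math.exp(t * mean)
mgf = sum(math.exp(t * x) for x in xs) / n  # should be near exp(t²/2)

print(lhs, mgf)
```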

Example: Consider a discrete random variable assuming the positive values x1, . . . , xn with equal probability 1/n. Jensen’s inequality for g(x) = −log(x) gives,

−log((x1 + · · · + xn)/n) ≤ −(1/n)(log x1 + · · · + log xn).

Reversing signs, and exponentiating, we get

(x1 x2 · · · xn)^(1/n) ≤ (x1 + · · · + xn)/n,

which is the classical arithmetic mean - geometric mean inequality. In fact, this inequality can be generalized to arbitrary distributions, pX(xi) = pi, yielding

x1^p1 x2^p2 · · · xn^pn ≤ p1 x1 + p2 x2 + · · · + pn xn.
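Both forms of the AM-GM inequality can be checked numerically; a small sketch with positive values and weights chosen at random, purely for illustration:

```python
import math
import random

random.seed(4)
xs = [random.uniform(0.1, 10.0) for _ in range(8)]
weights = [random.random() for _ in range(8)]
total = sum(weights)
ps = [w / total for w in weights]  # pᵢ > 0 with Σpᵢ = 1

# Unweighted AM-GM: (x₁⋯xₙ)^(1/n) <= (x₁+⋯+xₙ)/n
geo = math.prod(xs) ** (1 / len(xs))
arith = sum(xs) / len(xs)

# Weighted AM-GM: Π xᵢ^pᵢ <= Σ pᵢ xᵢ
w_geo = math.prod(x ** p for x, p in zip(xs, ps))
w_arith = sum(p * x for p, x in zip(xs, ps))

print(geo, arith, w_geo, w_arith)
```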


Kolmogorov’s inequality

The Kolmogorov inequality may at first seem to be of a similar flavor to Chebyshev’s inequality, but it is considerably stronger. I have decided to include it here because its proof involves some interesting subtleties. First, a lemma:

Lemma 6.1 If X, Y, Z are random variables such that Y is independent of X and Z, then
E[XY | Z] = E[X | Z] E[Y].

Proof : The fact that Y is independent of both X and Z implies that (in the case of discrete variables) pX,Y,Z(x, y, z) = pX,Z(x, z) pY(y).
Now, for every z,

E[XY | Z = z] = Σx,y x y pX,Y|Z(x, y | z) = Σx,y x y pX|Z(x | z) pY(y) = (Σx x pX|Z(x | z)) (Σy y pY(y)) = E[X | Z = z] E[Y].
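The lemma can be sanity-checked on a tiny discrete example. The pmfs below are hypothetical, chosen only so that X and Z are dependent while Y is independent of the pair (X, Z); exact rational arithmetic makes the identity hold exactly:

```python
import itertools
from fractions import Fraction as F

# Hypothetical pmfs (illustration only): joint pmf of (X, Z), and pmf of Y,
# with Y independent of (X, Z), so p(x, y, z) = p_xz[(x, z)] * p_y[y].
p_xz = {(0, 0): F(1, 4), (0, 1): F(1, 8), (1, 0): F(1, 8), (1, 1): F(1, 2)}
p_y = {0: F(1, 3), 1: F(2, 3)}

e_y = sum(y * p for y, p in p_y.items())

results = {}
for z in (0, 1):
    p_z = sum(p for (x, zz), p in p_xz.items() if zz == z)
    e_x_given_z = sum(x * p for (x, zz), p in p_xz.items() if zz == z) / p_z
    e_xy_given_z = sum(x * y * p_xz[(x, zz)] * p_y[y] / p_z
                       for (x, zz), y in itertools.product(p_xz, p_y)
                       if zz == z)
    results[z] = (e_xy_given_z, e_x_given_z * e_y)

print(results)  # for each z, E[XY | Z=z] equals E[X | Z=z] E[Y]
```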

Theorem 6.3 (Kolmogorov’s inequality) Let X1, . . . , Xn be independent random variables with E[Xi] = 0 and Var(Xi) = σi² < ∞. Then, for all a > 0,

P( max1≤k≤n |X1 + · · · + Xk| > a ) ≤ (σ1² + · · · + σn²)/a².

Comment: For n = 1 this is nothing but the Chebyshev inequality. For n > 1 it would still be Chebyshev’s inequality if the maximum over 1 ≤ k ≤ n was replaced by k = n, since by independence

Var(X1 + · · · + Xn) = Var(X1) + · · · + Var(Xn) = σ1² + · · · + σn².

Proof : We introduce the notation Sk = X1 + · · · + Xk. This theorem is concerned with the probability that |Sk| > a for some k. We define the random variable N(ω) to be the smallest integer k for which |Sk| > a; if there is no such number we set N(ω) = n. We observe the equivalence of events,

{ max1≤k≤n |Sk| > a } = { |SN| > a },

and from the Markov inequality

P(|SN| > a) = P(SN² > a²) ≤ E[SN²]/a².

We need to estimate the right-hand side. If we could replace

E[SN²] by E[Sn²] = σ1² + · · · + σn²,

then we would be done. The trick is to show that E[SN²] ≤ E[Sn²] by using conditional expectations. If

E[Sn² | N = k] ≥ E[SN² | N = k]

for all 1 ≤ k ≤ n then the inequality holds, since we then have an inequality between the random variables E[Sn² | N] ≥ E[SN² | N], and applying expectations on both sides gives the desired result.

For k = n, the identity

E[Sn² | N = n] = E[SN² | N = n]

holds trivially, since SN = Sn on the event {N = n}. Otherwise, we write

E[Sn² | N = k] = E[Sk² | N = k] + E[(Xk+1 + · · · + Xn)² | N = k] + 2 E[Sk (Xk+1 + · · · + Xn) | N = k].

The first term on the right-hand side equals E[SN² | N = k], since SN = Sk on the event {N = k}, whereas the second term is non-negative. There remains the third term, for which we remark that Xk+1 + · · · + Xn is independent of both Sk and N, and by the previous lemma,

E[Sk (Xk+1 + · · · + Xn) | N = k] = E[Sk | N = k] E[Xk+1 + · · · + Xn] = 0.

Putting it all together,

E[Sn² | N = k] ≥ E[Sk² | N = k] = E[SN² | N = k].

Since this holds for all k, we have thus shown that

E[SN²] ≤ E[Sn²] = σ1² + · · · + σn²,

which completes the proof.
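The strengthening over Chebyshev can be seen in simulation. Below, a sketch with Xi = ±1 (so E[Xi] = 0 and σi² = 1), for which the bound reads P(max1≤k≤n |Sk| > a) ≤ n/a²; the choices n = 100, a = 20 are arbitrary:

```python
import random

random.seed(5)
n_steps, a, trials = 100, 20, 20_000

# With Var(Xi) = 1 for each i, the Kolmogorov bound is n/a² = 100/400 = 0.25.
bound = n_steps / a**2

count = 0
for _ in range(trials):
    s, max_abs = 0, 0
    for _ in range(n_steps):
        s += random.choice((-1, 1))        # Xi = ±1 with equal probability
        max_abs = max(max_abs, abs(s))     # track max over all partial sums Sk
    count += max_abs > a

prob = count / trials
print(prob, bound)  # empirical probability stays below the Kolmogorov bound
```

Note that the bound controls the maximum over all n partial sums at once, not just the final sum, which is what Chebyshev alone would give.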


FAQs on Probability inequalities (Chebyshev, Markov, Jensen)

1. What are probability inequalities and how are they used in mathematical sciences?
Ans. Probability inequalities are mathematical tools that provide bounds on the probabilities of certain events or random variables. They are used to quantify the likelihood of events or outcomes, and to establish relationships between different probabilistic quantities. In mathematical sciences, probability inequalities have various applications, such as in analyzing the behavior of random processes, estimating the convergence rates of algorithms, and assessing the reliability of statistical estimators.
2. What is Chebyshev's inequality and how is it applied in probability theory?
Ans. Chebyshev's inequality is a fundamental probability inequality that relates the probability of a random variable deviating from its mean to the variance of the random variable. It states that for any random variable with finite variance, the probability that the random variable deviates from its mean by more than a certain distance is bounded by the ratio of the variance to the square of that distance. This inequality is often used in probability theory to establish bounds on the tails of a distribution and to quantify the concentration of a random variable around its mean.
3. How does Markov's inequality provide an upper bound on the probability of a random variable exceeding a certain threshold?
Ans. Markov's inequality is a probability inequality that provides an upper bound on the probability of a non-negative random variable exceeding a certain threshold. It states that for any non-negative random variable X and any positive constant c, the probability that X is greater than or equal to c is bounded by the expectation of X divided by c. This inequality is useful when the exact distribution of the random variable is unknown or difficult to compute, as it allows for the estimation of the probability of large deviations from the mean based solely on the first moment of the random variable.
4. How does Jensen's inequality relate to probability theory and convex functions?
Ans. Jensen's inequality is a powerful inequality that relates the expectation of a function of a random variable to the function of the expectation of the random variable. In the context of probability theory, Jensen's inequality is often used to establish bounds on expectations and to quantify the convexity or concavity of certain functions. The inequality states that for any random variable X and any convex function f, the expectation of f(X) is greater than or equal to f applied to the expectation of X. This inequality provides a bridge between the properties of a function and the probabilistic behavior of a random variable.
5. Can probability inequalities be used to analyze the convergence rates of algorithms in mathematical sciences?
Ans. Yes, probability inequalities can be used to analyze the convergence rates of algorithms in mathematical sciences. By bounding the probabilities of certain events or deviations, probability inequalities provide insights into the behavior and convergence properties of algorithms. For example, Chebyshev's inequality can be used to assess the rate of convergence of an iterative algorithm by bounding the probability that the algorithm's estimate deviates from the true value. Similarly, Markov's inequality can be applied to analyze the convergence of stochastic algorithms by bounding the probability of large deviations from the desired solution.