What Is Chebyshev's Inequality?
Chebyshev's inequality enables one to bound the probability of values falling far from the mean without assuming a normal distribution. It is one of the most widely used inequalities in probability theory and is also known as the Bienaymé-Chebyshev inequality.

The inequality is used when the full distribution is unknown but the mean and variance are known. These two values are enough to bound the probability. Even when an exact probability cannot be computed, this inequality helps find upper and lower limits on it.
Key Takeaways
- Chebyshev's inequality gives the minimum percentage of values that lie within a specified number of standard deviations of the mean. The inequality uses two parameters to find answers: a finite mean and a finite variance.
- It is named after P. L. Chebyshev, a Russian mathematician from the nineteenth century.
- It asserts that no more than a specified percentage of values (1/k²) will deviate from the distribution's average by more than a specified number of standard deviations (k).
- The application of Chebyshev's inequality helps in describing the characteristics of probability distributions. It is also helpful for grouping data into clusters and identifying outliers.
Chebyshev's Inequality Explained
Chebyshev's inequality gives the minimum percentage of values that lie within a specified number of standard deviations of the mean. The inequality uses two parameters to find answers: a finite mean and a finite variance. It is named after P. L. Chebyshev, a Russian mathematician who lived in the nineteenth century. This inequality is very useful because it applies to any probability distribution whose mean and variance are known. The application of Chebyshev's inequality helps in describing the characteristics of probability distributions. It is also helpful for grouping data into clusters and identifying outliers.
It asserts that no more than a specified percentage of values (1/k²) will deviate from the distribution's average by more than a specified number of standard deviations (k). Given that the mean and variance are known, the theorem can analyze any probability distribution, including those that defy normality. The representative equation is:
Pr(|X − μ| ≥ kσ) ≤ 1/k²
According to the formula, the probability that a value of the random variable "X" deviates by more than k standard deviations "σ" from the mean value "μ" is less than or equal to 1/k². The term k in the formula can take any value above zero. It denotes a distance from the mean given in standard deviation units.
Formula
This formula can solve Chebyshev's inequality problems:
Chebyshev's inequality: Pr(|X − μ| ≥ kσ) ≤ 1/k²
In the formula, X is a random variable, σ is the standard deviation, μ is the mean, and k > 0.
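As a sketch, the bound on the right-hand side can be computed directly (an illustrative helper, not a standard library function):

```python
def chebyshev_bound(k: float) -> float:
    """Upper bound on P(|X - mu| >= k*sigma) for any distribution
    with a finite mean and finite variance (Chebyshev's inequality)."""
    if k <= 0:
        raise ValueError("k must be positive")
    return 1.0 / (k ** 2)

# For k = 2: at most 1/4 of values deviate by 2 or more standard
# deviations, so at least 75% lie within 2 sigma of the mean.
print(chebyshev_bound(2))  # -> 0.25
```

Note that the bound is informative only for k > 1; for k ≤ 1 it exceeds or equals 1, which any probability satisfies trivially.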
Examples
Example #1
Let us take E(X) = 13 and Var(X) = 25, and find a bound on P(5 < X < 21).
Since the mean is 13, both interval endpoints are 8 units from the mean (5 − 13 = −8 and 21 − 13 = 8), so:
P(5 < X < 21) = P(|X − 13| < 8) = 1 − P(|X − 13| ≥ 8).
The standard deviation is σ = √25 = 5, so a deviation of 8 corresponds to k = 8/5 = 1.6. Applying Chebyshev's formula Pr(|X − μ| ≥ kσ) ≤ 1/k²:
P(|X − 13| ≥ 8) ≤ 1/1.6² ≈ 0.391,
implying that P(5 < X < 21) ≥ 1 − 0.391 = 0.609.
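The arithmetic of this example can be checked with a short script (a minimal sketch of the same steps, using the values from the example):

```python
mean, var = 13.0, 25.0
sigma = var ** 0.5          # standard deviation: sqrt(25) = 5
k = 8.0 / sigma             # a deviation of 8 in sigma units: k = 1.6
upper = 1.0 / k ** 2        # Chebyshev upper bound on P(|X - 13| >= 8)
lower = 1.0 - upper         # lower bound on P(5 < X < 21)
print(round(upper, 3), round(lower, 3))  # -> 0.391 0.609
```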
Example #2
Chebyshev's inequality was used by NIST (the National Institute of Standards and Technology) to determine sample size in a biometric evaluation of a population's fingerprint data.
The sample size for biometric applications was established using Chebyshev's inequality and simple random sampling. This inequality was chosen for two major reasons:
- As long as the mean and variance are known, Chebyshev's inequality is valid regardless of any assumptions about the shape of the population distribution.
- Using this inequality, a lower bound can be calculated on the probability that observed values of the random variable fall within a given interval around the population mean, expressed in standard deviation units.
Chebyshev's Inequality Vs. Markov's Inequality
Key Points | Chebyshev's Inequality | Markov's Inequality
---|---|---
Concept | Chebyshev's inequality gives the minimum percentage of values that lie within a specified number of standard deviations of the mean. | Markov's inequality provides an upper bound for the probability that a non-negative random variable is greater than or equal to a positive constant.
Requirements | It requires knowing the variance in addition to the mean. | Markov's inequality needs only the mean value and a non-negative random variable.
Strength of the Bound | It often provides a far stronger bound than Markov's inequality, since knowing the variance gives better control over deviation from the mean, and it bounds deviation in both directions. | Markov's inequality gives a weaker bound than Chebyshev's.
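The difference in bound strength can be seen numerically. The sketch below bounds the same tail probability P(X ≥ a) both ways for a hypothetical non-negative variable with assumed mean 1 and variance 1 (for a > μ, Chebyshev applies because P(X ≥ a) ≤ P(|X − μ| ≥ a − μ)):

```python
def markov_bound(mean: float, a: float) -> float:
    """P(X >= a) <= E[X] / a for non-negative X and a > 0 (Markov)."""
    return mean / a

def chebyshev_tail_bound(mean: float, var: float, a: float) -> float:
    """For a > mean: P(X >= a) <= P(|X - mu| >= a - mu) <= var/(a - mu)^2."""
    return var / (a - mean) ** 2

# Hypothetical example: mean 1, variance 1, threshold a = 5.
print(markov_bound(1.0, 5.0))               # -> 0.2
print(chebyshev_tail_bound(1.0, 1.0, 5.0))  # -> 0.0625
```

Here Chebyshev's bound (0.0625) is much tighter than Markov's (0.2), at the cost of also needing the variance.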