Let H(X) denote the entropy of a discrete random variable taking K possible distinct real values. Which of the following statements is/are necessarily true?
  • a) H(X) ≤ log2 K bits
  • b) H(X) ≤ H(2X)
  • c) H(X) ≤ H(2^X)
  • d) H(X) ≤ H(X^2)
Correct answer is option 'A,B,C'. Can you explain this answer?
Most Upvoted Answer
Let H(X) denote the entropy of a discrete random variable taking K possible distinct real values.
Solution:

  • Statement A: H(X) ≤ log2 K bits

  • This statement is necessarily true. For a random variable taking K distinct values, entropy is maximized by the uniform distribution, which gives H(X) = log2 K bits; any non-uniform distribution gives strictly less. Therefore H(X) ≤ log2 K bits.

  • Statement B: H(X) ≤ H(2X)

  • This statement is necessarily true. The map x ↦ 2x is one-to-one, so 2X carries exactly the same set of probabilities as X, and entropy depends only on the probabilities. Hence H(2X) = H(X), and in particular H(X) ≤ H(2X).

  • Statement C: H(X) ≤ H(2^X)

  • This statement is necessarily true for the same reason: x ↦ 2^x is strictly increasing and therefore one-to-one, so 2^X carries the same probabilities as X. Hence H(2^X) = H(X), and H(X) ≤ H(2^X).

  • Statement D: H(X) ≤ H(X^2)

  • This statement is not necessarily true, because squaring is not one-to-one when X takes both a value and its negative. For example, if X takes the values {-1, 1, 2}, then X^2 takes only the values {1, 4}: the probabilities of -1 and 1 merge, so H(X^2) < H(X). (If instead X is uniform on {1, 2, 3}, squaring is one-to-one on those values and H(X^2) = H(X) = log2 3 bits, so equality is possible, but the inequality can fail.) A short numerical check of all four statements follows below.
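A minimal sketch in Python checking all four statements numerically. The distribution pX and the helper names entropy/pushforward are illustrative assumptions, not part of the original question:

```python
from collections import defaultdict
import math

def entropy(pmf):
    # Shannon entropy in bits of a dict {value: probability}.
    return -sum(p * math.log2(p) for p in pmf.values() if p > 0)

def pushforward(pmf, f):
    # PMF of f(X): probabilities of x-values sharing the same f(x) add up.
    out = defaultdict(float)
    for x, p in pmf.items():
        out[f(x)] += p
    return dict(out)

# X takes a negative and a positive value with the same square.
pX = {-1: 0.2, 1: 0.5, 2: 0.3}
K = len(pX)

print(entropy(pX) <= math.log2(K))                               # A: True
print(entropy(pX) <= entropy(pushforward(pX, lambda x: 2 * x)))  # B: True (equality)
print(entropy(pX) <= entropy(pushforward(pX, lambda x: 2 ** x))) # C: True (equality)
print(entropy(pX) <= entropy(pushforward(pX, lambda x: x * x)))  # D: False here
```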

Community Answer
Let H(X) denote the entropy of a discrete random variable taking K possible distinct real values.
Given
H(X) is the entropy of a discrete random variable X taking K possible distinct real values.
Let X take values xi, so the set of possible values is {x1, x2, ..., xK}.
The entropy is
H(X) = -Σ P(xi) log2 P(xi), with the sum running over i = 1, ..., K.
Case-1: When all values are equiprobable, i.e. P(xi) = 1/K for each of the K distinct values, the entropy is
H(X) = -Σ (1/K) log(1/K) = log K
When we choose base 2,
H(X) = log2 K bits
Entropy is highest in this equiprobable case, so for any distribution
H(X) ≤ log2 K bits is always true.
Hence, option (A) is correct.
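As a quick sanity check of this bound, here is a small Python sketch (the random PMFs and the helper name entropy_bits are illustrative assumptions):

```python
import math, random

def entropy_bits(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

K = 4
# Uniform case: the bound is attained with equality.
print(entropy_bits([1.0 / K] * K), math.log2(K))  # 2.0 2.0

# Any other PMF over K values stays at or below log2(K).
for _ in range(5):
    w = [random.random() for _ in range(K)]
    pmf = [x / sum(w) for x in w]                  # normalize to a valid PMF
    assert entropy_bits(pmf) <= math.log2(K) + 1e-12
```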
Case-2: Define a new random variable Y = 2X, obtained from X by the mapping y = 2x.
The mapping is one-to-one, since Y is a linear (strictly increasing) function of X.
If the random variable X takes values {x1, x2, ..., xK}, then Y takes values
Y = {y1 = 2x1, y2 = 2x2, ..., yK = 2xK}
Along with the values, the probabilities are carried over: each yi has the same probability as the corresponding xi, i.e. P(yi) = P(xi).
So P(x1) = P(y1), P(x2) = P(y2), ..., P(xK) = P(yK).
Because the mapping is one-to-one, X and Y have identical probability distributions, and therefore identical entropies: H(Y = 2X) = H(X).
Since H(X) = H(2X), the situation H(X) > H(2X) can never occur, and H(X) ≤ H(2X) always holds.
Hence, option (B) is correct.
Case-3: Define a new random variable Y = 2^X. The map x ↦ 2^x is strictly increasing, so different values of X give different values of Y and the mapping is again one-to-one. The probabilities of Y are therefore the same as those of X, and the entropies are equal:
H(Y = 2^X) = H(X)
So H(X) > H(2^X) can never occur, and H(X) ≤ H(2^X) always holds.
So, option (C) is also correct.
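Both Case-2 and Case-3 rest on the same fact: an injective (one-to-one) map leaves the multiset of probabilities, and hence the entropy, unchanged. A small Python sketch (the values and probabilities of pX are assumed examples):

```python
import math

def entropy(pmf):
    return -sum(p * math.log2(p) for p in pmf.values())

pX = {1: 0.2, 2: 0.5, 3: 0.3}

for f in (lambda x: 2 * x, lambda x: 2 ** x):  # Y = 2X and Y = 2^X
    pY = {f(x): p for x, p in pX.items()}      # injective: no collisions
    assert sorted(pY.values()) == sorted(pX.values())
    print(entropy(pX), entropy(pY))            # identical: H(X) = H(Y)
```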
Case-4: Define a new random variable Y = X^2.
Possibility-01: Suppose X takes three positive values x1 = 1, x2 = 2, x3 = 3, with some probabilities P(x1), P(x2), P(x3).
i.e. X = {1, 2, 3}
Then the random variable Y = X^2 takes values {1, 4, 9}.
Different values of X give different values of Y, so the mapping between X and Y is one-to-one, and the probabilities of Y are the same as those of X.
Thus H(X) = H(Y = X^2).
Possibility-02: Suppose X takes three negative values x1 = -1, x2 = -2, x3 = -3, with some probabilities P(x1), P(x2), P(x3).
i.e. X = {-1, -2, -3}
Then Y = X^2 again takes values {1, 4, 9}.
Different values of X still give different values of Y, so the mapping is one-to-one and the probabilities of Y are the same as those of X.
Thus H(X) = H(Y = X^2).
Possibility-03: Suppose X takes both positive and negative values, x1 = -1, x2 = 1, x3 = 2, with probabilities 1/5, 1/2 and 3/10 respectively.
i.e. X = {-1, 1, 2}
Then Y = X^2 maps both -1 and 1 to the same value 1, so the mapping is not one-to-one.
The random variable Y takes only two values: Y = {1, 4}.
The probability P(y1 = 1) is the sum of P(x1 = -1) and P(x2 = 1): P(y1 = 1) = 1/5 + 1/2 = 7/10.
The probability P(y2 = 4) remains the same as P(x3 = 2), i.e. P(y2 = 4) = 3/10.
The entropy of Y is
H(Y) = -(7/10) log2(7/10) - (3/10) log2(3/10) ≈ 0.3602 + 0.5211 = 0.8813 bits
The entropy of X is
H(X) = -(1/5) log2(1/5) - (1/2) log2(1/2) - (3/10) log2(3/10)
= 0.4643 + 0.5 + 0.5211
≈ 1.4854 bits
So H(X) > H(Y = X^2).
From Case-4 it is clear that H(X) = H(X^2) holds only when the squaring map is one-to-one on the values of X (for example, when the values are all positive or all negative). When X takes a mix of positive and negative values whose squares coincide, H(X) > H(X^2). That is why option (D) is not necessarily true.
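The Possibility-03 numbers can be reproduced directly; a minimal Python sketch (the helper name H is an illustrative assumption):

```python
import math

def H(probs):
    # Shannon entropy in bits.
    return -sum(p * math.log2(p) for p in probs)

hX = H([1/5, 1/2, 3/10])  # X in {-1, 1, 2}
hY = H([7/10, 3/10])      # Y = X^2 in {1, 4}; P(Y=1) = 1/5 + 1/2
print(hX, hY)             # ~1.4854 vs ~0.8813, so H(X) > H(X^2)
```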
Hence, the correct options are (A), (B) & (C).