Let H(X) denote the entropy of a discrete random variable X taking K possible distinct real values.
Given:
H(X) is the entropy of a discrete random variable X taking K possible distinct real values.
Let X take values xi, so the set of possible values is {x1, x2, x3, …, xK}.
The entropy in each case will be as follows.
Case-1: When all values are equiprobable, i.e. P(xi) = 1/K for each of the K distinct values, the entropy is
H(X) = log K
Choosing base 2,
H(X) = log2 K bits
We know that entropy is maximum when all values are equally probable,
so H(X) ≤ log2 K always holds (the same inequality holds in any base, with matching units).
Hence, option (A) is correct.
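As a quick numeric illustration of Case-1 (a sketch, not part of the original solution; the skewed probabilities below are arbitrary assumptions), the equiprobable distribution attains the bound log2 K while any other distribution falls below it:

```python
from math import log2

def entropy(probs):
    """Entropy in bits of a list of probabilities (zero-probability terms skipped)."""
    return -sum(p * log2(p) for p in probs if p > 0)

K = 4
print(log2(K))                        # 2.0, the bound log2 K
print(entropy([1 / K] * K))           # 2.0, attained by the equiprobable case
print(entropy([0.7, 0.1, 0.1, 0.1]))  # ~1.3568, below the bound
```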
Case-2: Define a new random variable Y = 2X, obtained by mapping each value of X as shown below.
The mapping is one-to-one, since X and Y are related linearly.
If X takes the values {x1, x2, x3, …, xK}, then Y takes the values
Y = {y1 = 2x1, y2 = 2x2, …, yK = 2xK}
Each probability is carried over along with its value, so each xi and the corresponding yi have identical probability, i.e. P(xi) = P(yi):
P(x1) = P(y1), P(x2) = P(y2), …, P(xK) = P(yK)
Because the mapping is one-to-one, Y has exactly the same set of probabilities as X, and entropy depends only on those probabilities. Therefore H(2X) = H(X).
So the situation H(X) > H(2X) can never occur; H(X) ≤ H(2X) always holds.
Hence, option (B) is correct.
Case-3: Now define Y = 2^X. This mapping is also one-to-one, because different values of X give different values of 2^X. The probabilities of Y are therefore the same as those of X, so the entropy satisfies
H(2^X) = H(X)
So the situation H(X) > H(2^X) can never occur; H(X) ≤ H(2^X) always holds.
Hence, option (C) is also correct.
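The following minimal sketch illustrates Cases 2 and 3 numerically (the distribution is an arbitrary assumption, not taken from the question): entropy depends only on the probability values, so any one-to-one relabelling of the values, such as x → 2x or x → 2^x, leaves it unchanged.

```python
from math import log2

def entropy(dist):
    """Entropy in bits of a {value: probability} distribution."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

X = {1: 0.2, 2: 0.5, 3: 0.3}                # arbitrary example distribution
for f in (lambda x: 2 * x, lambda x: 2 ** x):
    Y = {f(x): p for x, p in X.items()}     # one-to-one: no mass is merged
    print(entropy(X), entropy(Y))           # both lines print equal values
```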
Case-4: Now define a new random variable Y = X^2.
Possibility-01: Suppose X takes 3 positive values, x1 = 1, x2 = 2, x3 = 3, with corresponding probabilities P(x1), P(x2), P(x3) respectively,
i.e. X = {1, 2, 3}
Thus the random variable Y = X^2 = {1, 4, 9}.
Here different values of X give different values of Y, so a one-to-one mapping exists between X and Y.
So the probabilities of Y are the same as those of X, i.e. P(yi) = P(xi) for each i.
Thus H(X) = H(X^2).
Possibility-02: Suppose X takes 3 negative values, x1 = -1, x2 = -2, x3 = -3, again with corresponding probabilities P(x1), P(x2), P(x3) respectively,
i.e. X = {-1, -2, -3}
Thus the random variable Y = X^2 = {1, 4, 9}.
Again different values of X give different values of Y, so the mapping is one-to-one and the probabilities of Y are the same as those of X.
Thus H(X) = H(X^2). A short sketch verifying both possibilities follows.
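Here is a short self-contained check of Possibilities 01 and 02 (the probabilities 0.2, 0.5, 0.3 are assumed for illustration, since the original values are not specified): on an all-positive or all-negative support, x → x^2 causes no collisions, so the entropy is preserved.

```python
from math import log2

def entropy(dist):
    """Entropy in bits of a {value: probability} distribution."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

for X in ({1: 0.2, 2: 0.5, 3: 0.3},        # Possibility-01: all positive
          {-1: 0.2, -2: 0.5, -3: 0.3}):    # Possibility-02: all negative
    Y = {x * x: p for x, p in X.items()}   # squaring is one-to-one here
    print(entropy(X), entropy(Y))          # equal in both cases
```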
Possibility-03: Suppose X takes both positive and negative values, x1 = -1, x2 = 1, x3 = 2, with corresponding probabilities 1/5, 1/2 and 3/10 respectively,
i.e. X = {-1, 1, 2}
Thus the random variable Y = X^2 = {1, 1, 4}.
Here different values of X (namely -1 and 1) give the same value of Y, so a one-to-one mapping is not possible.
So Y takes only 2 values, i.e. Y = {1, 4}.
The probability P(y1 = 1) is the sum of P(x1 = -1) and P(x2 = 1), so P(y1 = 1) = 1/5 + 1/2 = 7/10.
The probability P(y2 = 4) remains the same as P(x3 = 2),
i.e. P(y2 = 4) = P(x3 = 2) = 3/10.
So the entropy of Y is
H(Y) = (7/10) log2(10/7) + (3/10) log2(10/3) ≈ 0.3602 + 0.5211 = 0.8813 bits
and the entropy of X is
H(X) = (1/5) log2 5 + (1/2) log2 2 + (3/10) log2(10/3)
= 0.4644 + 0.5 + 0.5211
≈ 1.4855 bits
So H(X^2) = H(Y) ≈ 0.8813 bits < H(X) ≈ 1.4855 bits.
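A self-contained numeric check of Possibility-03 (a sketch; it only re-verifies the arithmetic above): squaring merges x = -1 and x = 1 into y = 1, piling probability mass 1/5 + 1/2 = 7/10 onto a single outcome, and the entropy drops.

```python
from math import log2

def entropy(dist):
    """Entropy in bits of a {value: probability} distribution."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

X = {-1: 1/5, 1: 1/2, 2: 3/10}
Y = {}
for x, p in X.items():                     # push X through x -> x**2,
    Y[x * x] = Y.get(x * x, 0.0) + p       # merging any collided values

print(Y)                                   # {1: 0.7, 4: 0.3}
print(entropy(X))                          # ~1.4855 bits
print(entropy(Y))                          # ~0.8813 bits, less than H(X)
```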
From Case-4 it is clear that H(X) = H(X^2) is guaranteed only when squaring is one-to-one on the values of X (for example, when X takes all positive or all negative values); when X takes a combination of positive and negative values, H(X) > H(X^2) can occur. That is why option (D) is incorrect.
Hence, the correct options are (A), (B) & (C).