Let U and V be two independent and identically distributed random variables with P(U=1) = P(U=-1) = 1/2. Find the entropy H(U,V) in bits.
Solution:
Given:
- U and V are two independent and identically distributed random variables
- P(U=1) = P(U=-1) = 1/2
To find: Entropy H(U,V) in bits
Entropy formula:
H(X) = - Σ p(x) log2 p(x)
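For concreteness, here is a minimal Python sketch of this formula (the function name entropy_bits is my own choice, not part of the problem):

```python
from math import log2

def entropy_bits(probs):
    """Shannon entropy H = -sum(p * log2(p)) in bits, skipping zero-probability outcomes."""
    return -sum(p * log2(p) for p in probs if p > 0)

print(entropy_bits([0.5, 0.5]))  # 1.0, the entropy of a fair coin
```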
Joint probability distribution of U and V:
P(U=u, V=v) = P(U=u) x P(V=v) [since U and V are independent]
= (1/2) x (1/2)
= 1/4 for each of the four possible (u, v) pairs
This gives the following table of possible (U, V) pairs and their joint probabilities:
| U | V | P(U,V) |
|---|---|--------|
| 1 | 1 | 1/4 |
| 1 | -1| 1/4 |
| -1| 1 | 1/4 |
| -1| -1| 1/4 |
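As a sketch, the same table can be generated in Python by multiplying the marginal probabilities (the dictionary names p_u, p_v, p_uv are illustrative only):

```python
from itertools import product

p_u = {1: 0.5, -1: 0.5}   # P(U = u)
p_v = {1: 0.5, -1: 0.5}   # P(V = v)

# By independence, P(U = u, V = v) = P(U = u) * P(V = v)
p_uv = {(u, v): p_u[u] * p_v[v] for u, v in product(p_u, p_v)}

for (u, v), p in p_uv.items():
    print(u, v, p)   # each of the four pairs has probability 0.25
```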
Now, we can calculate the marginal probabilities of U and V:
P(U=1) = P(U=-1) = 1/2
P(V=1) = P(V=-1) = 1/2
In table form:
| U | P(U) |
|---|------|
| 1 | 1/2 |
| -1| 1/2 |
| V | P(V) |
|---|------|
| 1 | 1/2 |
| -1| 1/2 |
Now we can calculate the marginal entropies of U and V:
H(U) = - [ P(U=1) log2 P(U=1) + P(U=-1) log2 P(U=-1) ]
= - [ (1/2) log2 (1/2) + (1/2) log2 (1/2) ]
= - [ (-1/2) + (-1/2) ]
= 1 bit
H(V) = - [ P(V=1) log2 P(V=1) + P(V=-1) log2 P(V=-1) ]
= - [ (1/2) log2 (1/2) + (1/2) log2 (1/2) ]
= - [ (-1/2) + (-1/2) ]
= 1 bit
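A quick numerical check of these two values (standard Python only; the variable names are illustrative):

```python
from math import log2

p_u = [0.5, 0.5]                      # P(U = 1), P(U = -1)
h_u = -sum(p * log2(p) for p in p_u)  # H(U)
print(h_u)                            # 1.0 bit; H(V) is identical by symmetry
```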
Now, we can calculate the joint entropy of U and V:
H(U,V) = - Σ Σ P(U,V) log2 P(U,V)
= - [ (1/4) log2 (1/4) + (1/4) log2 (1/4) + (1/4) log2 (1/4) + (1/4) log2 (1/4) ]
= - [ (-1/2) + (-1/2) + (-1/2) + (-1/2) ]
= 2 bits
Therefore, the entropy H(U,V) in bits is 2 bits.
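The joint entropy can be checked the same way, summing over the four equally likely (u, v) pairs:

```python
from math import log2

p_uv = [0.25, 0.25, 0.25, 0.25]         # P(U, V) for the four (u, v) pairs
h_uv = -sum(p * log2(p) for p in p_uv)  # H(U, V)
print(h_uv)                             # 2.0 bits
```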
As a check, we can also use the identity:
H(U,V) = H(U) + H(V) - I(U,V)
where I(U,V) is the mutual information between U and V.
Since U and V are independent, I(U,V) = 0, so H(U,V) = H(U) + H(V) = 1 + 1 = 2 bits, which agrees with the direct calculation.
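As a final sketch, the code below computes I(U,V) from the joint and marginal distributions and confirms that the identity gives the same answer (variable names are illustrative, not from the problem):

```python
from math import log2

p_u = {1: 0.5, -1: 0.5}
p_v = {1: 0.5, -1: 0.5}
p_uv = {(u, v): p_u[u] * p_v[v] for u in p_u for v in p_v}

# Mutual information I(U,V) = sum over (u, v) of p(u,v) * log2( p(u,v) / (p(u) * p(v)) )
i_uv = sum(p * log2(p / (p_u[u] * p_v[v])) for (u, v), p in p_uv.items())

h_u = -sum(p * log2(p) for p in p_u.values())
h_v = -sum(p * log2(p) for p in p_v.values())

print(i_uv)              # 0.0, since U and V are independent
print(h_u + h_v - i_uv)  # 2.0 bits, matching the direct calculation
```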