Let U and V be two independent and identically distributed random variables such
that P(U = 1) = P(U = -1) = 1/2. The entropy H(U + V) in bits is
  • a)
    3/4
  • b)
    1
  • c)
    3/2
  • d)
    log23
Correct answer is option 'C'. Can you explain this answer?
Solution:

Given,
- U and V are two independent and identically distributed random variables
- P(U=1) = P(U=-1) = 1/2

To find: Entropy H(U + V) in bits

Entropy formula:
H(X) = - Σ p(x) log2 p(x)

Since U and V are independent, each pair (u, v) occurs with probability
P(U = u) x P(V = v) = (1/2) x (1/2) = 1/4.

Table of the four equally likely outcomes and the corresponding sums:

| U  | V  | U + V | P   |
|----|----|-------|-----|
| 1  | 1  | 2     | 1/4 |
| 1  | -1 | 0     | 1/4 |
| -1 | 1  | 0     | 1/4 |
| -1 | -1 | -2    | 1/4 |

Distribution of the sum W = U + V (the two middle rows both give W = 0):

P(W = 2) = 1/4
P(W = 0) = 1/4 + 1/4 = 1/2
P(W = -2) = 1/4

Entropy of the sum:

H(U + V) = - [ P(W=2) log2 P(W=2) + P(W=0) log2 P(W=0) + P(W=-2) log2 P(W=-2) ]
= - [ (1/4) log2 (1/4) + (1/2) log2 (1/2) + (1/4) log2 (1/4) ]
= - [ (1/4)(-2) + (1/2)(-1) + (1/4)(-2) ]
= 1/2 + 1/2 + 1/2
= 3/2 bits

Therefore, the entropy H(U + V) is 3/2 bits, and the correct answer is option C.

Note that H(U + V) is strictly less than the joint entropy H(U, V) = H(U) + H(V) = 1 + 1 = 2 bits: the map (U, V) → U + V is not one-to-one, since both (1, -1) and (-1, 1) produce the sum 0, so information is lost in forming the sum.
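The distribution of U + V and its entropy can be checked numerically. A minimal sketch in Python (not part of the original solution), which enumerates the four equally likely (u, v) pairs and applies the Shannon entropy formula:

```python
from itertools import product
from collections import Counter
from math import log2

# U and V each take values +1 and -1 with probability 1/2.
values = [1, -1]

# Enumerate all four equally likely (u, v) pairs and tally the sums.
counts = Counter(u + v for u, v in product(values, values))
total = sum(counts.values())  # 4 equally likely outcomes

# P(U+V = w) for each sum w, then Shannon entropy in bits.
probs = {w: c / total for w, c in counts.items()}
H = -sum(p * log2(p) for p in probs.values())

print(probs)  # {2: 0.25, 0: 0.5, -2: 0.25}
print(H)      # 1.5
```

This confirms P(U+V) = (1/4, 1/2, 1/4) over the values (2, 0, -2), giving H(U + V) = 3/2 bits, option C.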