
Test: Mutual Information - Electronics and Communication Engineering (ECE) MCQ


Test Description

An 8-question MCQ test on mutual information, part of the Communication System course. Time allowed: 30 minutes.

*Multiple options can be correct
Test: Mutual Information - Question 1

Read the following expressions regarding mutual information I(X;Y). Which of the expressions is/are correct?

Detailed Solution for Test: Mutual Information - Question 1

Mutual information measures the amount of information that one random variable contains about another random variable.
The mutual information between two jointly distributed discrete random variables X and Y is given by:
I(X ; Y) = Σx Σy p(x, y) log2 [p(x, y) / (p(x) p(y))]
In terms of entropy, this is written as:
I(X ; Y) = H(X) – H(X/Y)        ---(1)
(Option (b) is correct)
Also, by the chain rule for joint entropy:
H(X, Y) = H(Y/X) + H(X)
H(X, Y) = H(X/Y) + H(Y)
From the above equations, we can write:
H(X/Y) = H(X, Y) – H(Y)        ---(2)
Using Equations (1) and (2), we can write:
I(X ; Y) = H(X) – [H(X, Y) – H(Y)]
I(X ; Y) = H(X) + H(Y) – H(X, Y)        ---(3)
(Option (c) is correct)
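These identities can be verified numerically. A minimal Python sketch, using an arbitrary 2×2 joint pmf (an assumption for illustration, not part of the question):

```python
import numpy as np

# Hypothetical joint pmf p(x, y), chosen only for illustration.
p_xy = np.array([[0.30, 0.20],
                 [0.10, 0.40]])

p_x = p_xy.sum(axis=1)  # marginal p(x)
p_y = p_xy.sum(axis=0)  # marginal p(y)

def H(p):
    """Entropy in bits, with 0 * log 0 treated as 0."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

H_xy = H(p_xy.ravel())        # joint entropy H(X, Y)
H_x_given_y = H_xy - H(p_y)   # H(X/Y) = H(X, Y) - H(Y)   ...Eq. (2)

I_eq1 = H(p_x) - H_x_given_y          # Eq. (1): H(X) - H(X/Y)
I_eq3 = H(p_x) + H(p_y) - H_xy        # Eq. (3): H(X) + H(Y) - H(X, Y)
print(I_eq1, I_eq3)                   # both print the same value
```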

Test: Mutual Information - Question 2

Which of the following statements are correct?
(A) A given source will have maximum entropy if the messages produced are statistically independent
(B) As the bandwidth approaches infinity, the channel capacity becomes zero.
(C) For binary transmission the baud rate is always equal to bit rate
(D) The mutual information of a channel with independent input and output is constant
(E) Nat is a unit of information
Choose the correct answer from the options given below:
(1) (A) and (E) only
(2) (A) and (B) only
(3) (C) and (E) only
(4) (A), (D) and (E) only

Detailed Solution for Test: Mutual Information - Question 2

Statement (A): A given source will have maximum entropy if the messages produced are equally probable, not merely statistically independent. So statement (A) is wrong.
Statement (B): From Shannon's channel capacity theorem:
C = B log2(1 + S/(N0 B))
As B → ∞, C → (S/N0) log2 e ≈ 1.44 S/N0, which is a finite non-zero limit. The capacity does not become zero, so statement (B) is wrong.
Statement (C):
Baud rate = Bit rate / log2 M
For binary transmission, M = 2, so log2 M = 1 and:
Baud rate = Bit rate
Statement (C) is correct.
Statement (D): It is false.
Mutual information I(X ; Y) = H(X) – H(X/Y) = H(Y) – H(Y/X)
For a binary symmetric channel, I(X ; Y) depends on the source probabilities, so it is not constant.
Statement (E): It is correct.
The nat, the bit, and the hartley are all units of information.
Option (3) is correct.
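To see statement (B) numerically, here is a short Python sketch; the S/N0 value is an arbitrary assumption:

```python
import numpy as np

# Shannon capacity C = B * log2(1 + S / (N0 * B)) for an AWGN channel.
S_over_N0 = 1000.0  # assumed S/N0 ratio in Hz (illustrative value)

for B in [1e3, 1e5, 1e7, 1e9]:
    C = B * np.log2(1 + S_over_N0 / B)
    print(f"B = {B:.0e} Hz -> C = {C:.1f} bit/s")

# As B -> infinity, C -> (S/N0) * log2(e) ~ 1.44 * (S/N0): finite, not zero.
print("limit:", S_over_N0 * np.log2(np.e))
```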

*Answer can only contain numeric values
Test: Mutual Information - Question 3

Let (X1, X2) be independent random variables. X1 has mean 0 and variance 1, while X2 has mean 1 and variance 4. The mutual information I(X1; X2) between X1 and X2, in bits, is _______.


Detailed Solution for Test: Mutual Information - Question 3

Mutual information is a measure of how much one random variable tells about the other.
It is mathematically defined as:
I(X1, X2) = H(X1) – H(X1/X2)
Application:
Since X1 and X2 are independent, we can write:
H(X1/X2) = H(X1)
I(X1, X2) = H(X1) – H(X1)
= 0
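The same result can be checked numerically: any product distribution p(x1, x2) = p(x1) p(x2) gives zero mutual information, whatever the means and variances. A Python sketch with arbitrary assumed marginals:

```python
import numpy as np

def mutual_information(p_xy):
    """I(X1; X2) in bits from a joint pmf matrix."""
    p_x = p_xy.sum(axis=1, keepdims=True)
    p_y = p_xy.sum(axis=0, keepdims=True)
    mask = p_xy > 0
    return np.sum(p_xy[mask] * np.log2(p_xy[mask] / (p_x @ p_y)[mask]))

# Arbitrary marginals (assumptions); independence is what forces I = 0.
p_x1 = np.array([0.2, 0.5, 0.3])
p_x2 = np.array([0.6, 0.4])
p_joint = np.outer(p_x1, p_x2)       # p(x1, x2) = p(x1) p(x2)
print(mutual_information(p_joint))   # 0.0 (up to floating-point rounding)
```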

*Answer can only contain numeric values
Test: Mutual Information - Question 4

For the channel shown below, the source generates two symbols m0 and m1 with probabilities 0.6 and 0.4 respectively. The probability of error if the receiver uses MAP decoding will be _______ (correct up to two decimal places).

[Channel transition probabilities: P(r0/m0) = 0.6, P(r1/m0) = 0.3, P(r2/m0) = 0.1; P(r0/m1) = 0, P(r1/m1) = 0.7, P(r2/m1) = 0.3]


Detailed Solution for Test: Mutual Information - Question 4

If r0 is received:
P(m0) P(r0/m0) = 0.6 × 0.6 = 0.36
P(m1) P(r0/m1) = 0.4 × 0 = 0
If r1 is received:
P(m0) P(r1/m0) = (0.6)(0.3) = 0.18
P(m1) P(r1/m1) = (0.4)(0.7) = 0.28
If r2 is received:
P(m0) P(r2/m0) = (0.6)(0.1) = 0.06
P(m1) P(r2/m1) = (0.4)(0.3) = 0.12
The MAP receiver picks, for each received symbol, the message with the larger joint probability, so the probability of correct detection is:
Pc = 0.36 + 0.28 + 0.12 = 0.76
Pe = 1 – 0.76 = 0.24
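The MAP computation above mechanizes neatly; a Python sketch using the transition probabilities from this solution:

```python
import numpy as np

priors = np.array([0.6, 0.4])            # P(m0), P(m1)
# Rows: m0, m1; columns: r0, r1, r2 (transition probabilities from above)
P = np.array([[0.6, 0.3, 0.1],
              [0.0, 0.7, 0.3]])

joint = priors[:, None] * P              # P(m_i) P(r_j / m_i)
# MAP rule: for each received r_j, decide the m_i with the largest joint.
p_correct = joint.max(axis=0).sum()      # 0.36 + 0.28 + 0.12 = 0.76
print(1 - p_correct)                     # Pe = 0.24
```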

*Answer can only contain numeric values
Test: Mutual Information - Question 5

For the channel shown below, the source generates symbols M0 and M1 with equal probability. The probability of error using ML decoding is _______.

[Channel transition probabilities: P(r0/m0) = 0.6, P(r1/m0) = 0.3, P(r2/m0) = 0.1; P(r0/m1) = 0, P(r1/m1) = 0.7, P(r2/m1) = 0.3]


Detailed Solution for Test: Mutual Information - Question 5

If r0 is received:
P(m0) P(r0/m0) = 0.5 × 0.6 = 0.30
P(m1) P(r0/m1) = 0.5 × 0 = 0
If r1 is received:
P(m0) P(r1/m0) = (0.5)(0.3) = 0.15
P(m1) P(r1/m1) = (0.5)(0.7) = 0.35
If r2 is received:
P(m0) P(r2/m0) = (0.5)(0.1) = 0.05
P(m1) P(r2/m1) = (0.5)(0.3) = 0.15
Since the priors are equal, the ML receiver picks, for each received symbol, the message with the larger likelihood:
Probability of correct detection = 0.30 + 0.35 + 0.15 = 0.80
Probability of error = 1 – 0.80 = 0.20
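The ML rule differs from MAP only in ignoring the priors when deciding; a Python sketch, assuming the same channel matrix as in Question 4 (consistent with the numbers above):

```python
import numpy as np

priors = np.array([0.5, 0.5])            # equiprobable M0, M1
P = np.array([[0.6, 0.3, 0.1],           # assumed: same channel as Question 4
              [0.0, 0.7, 0.3]])

# ML rule: for each received r_j, decide argmax_i P(r_j / m_i).
decisions = P.argmax(axis=0)
p_correct = sum(priors[m] * P[m, j] for j, m in enumerate(decisions))
print(1 - p_correct)                     # Pe = 0.20
```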

*Answer can only contain numeric values
Test: Mutual Information - Question 6

Consider a binary channel with
P(x1) = 0.5
P(x2) = 0.5

[Channel transition probabilities: P(y1/x1) = 1/2, P(y2/x1) = 1/2, P(y1/x2) = 1/4, P(y2/x2) = 3/4]

Find the mutual information in bits/symbol.


Detailed Solution for Test: Mutual Information - Question 6


With the channel transition probabilities P(y1/x1) = 1/2, P(y2/x1) = 1/2, P(y1/x2) = 1/4, P(y2/x2) = 3/4:
P(y1) = (0.5)(1/2) + (0.5)(1/4) = 3/8
P(y2) = (0.5)(1/2) + (0.5)(3/4) = 5/8
Mutual information:
I(x ; y) = H(x) – H(x/y) = H(y) – H(y/x)
H(y) = –Σj P(yj) log2 P(yj) = –[0.375 log2(0.375) + 0.625 log2(0.625)] = 0.954 bits/symbol
H(y/x) = –Σi Σj P(xi) P(yj/xi) log2 P(yj/xi) = 0.5 × 1 + 0.5 × 0.811 = 0.906 bits/symbol
I(x ; y) = 0.954 – 0.906 ≈ 0.05 bits/symbol
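The same steps work for any channel matrix; the Python sketch below uses the transition matrix inferred above (an inference from the stated marginals, since the original figure is not reproduced):

```python
import numpy as np

def H(p):
    """Entropy in bits of a probability vector."""
    p = np.asarray(p)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

p_x = np.array([0.5, 0.5])
# Inferred channel matrix P(y_j / x_i); rows x1, x2, columns y1, y2.
P = np.array([[0.50, 0.50],
              [0.25, 0.75]])

p_y = p_x @ P                                        # [3/8, 5/8]
H_y_given_x = np.sum(p_x * np.array([H(r) for r in P]))
print(round(H(p_y) - H_y_given_x, 2))                # I(x; y) ~ 0.05
```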

*Answer can only contain numeric values
Test: Mutual Information - Question 7

In data communication using an error-detecting code, as soon as an error is detected, an automatic repeat request (ARQ) enables retransmission of the data. Such a binary erasure channel can be modeled as shown:

[Binary erasure channel: P(y1/x1) = 1 – p, P(e/x1) = p; P(y2/x2) = 1 – p, P(e/x2) = p, where e is the erasure output]

If p = 0.2 and both symbols are generated with equal probability, then the mutual information I(x; y) is _______.


Detailed Solution for Test: Mutual Information - Question 7


For a binary erasure channel with erasure probability p:
I(x ; y) = (1 – p) H(x)
With p = 0.2 and equiprobable inputs, H(x) = –2 × 0.5 log2(0.5) = 1 bit/symbol (logs are base 2), so:
I(x ; y) = 0.8 × 1 = 0.8 bits/symbol
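The closed form I = (1 – p) H(x) can be cross-checked by computing H(y) – H(y/x) directly from the erasure-channel model; a Python sketch:

```python
import numpy as np

def H(p):
    """Entropy in bits of a probability vector."""
    p = np.asarray(p)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

p = 0.2                                  # erasure probability
p_x = np.array([0.5, 0.5])

# Binary erasure channel: columns y1, erasure e, y2.
P = np.array([[1 - p, p, 0.0],
              [0.0,   p, 1 - p]])

p_y = p_x @ P
H_y_given_x = np.sum(p_x * np.array([H(r) for r in P]))
print(H(p_y) - H_y_given_x, (1 - p) * H(p_x))   # both 0.8
```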

*Answer can only contain numeric values
Test: Mutual Information - Question 8

A binary channel matrix is given by

Given P(x1) = 1/3 and P(x2) = 2/3, the value of H(Y) is ________ bit/symbol.


Detailed Solution for Test: Mutual Information - Question 8

The output probabilities follow from the channel matrix as P(yj) = Σi P(xi) P(yj/xi); the output entropy is then:
H(Y) = –Σj P(yj) log2 P(yj) bit/symbol
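Since the question's channel matrix is not reproduced above, the Python sketch below only illustrates the method; the 2×2 matrix in it is a hypothetical placeholder:

```python
import numpy as np

def H(p):
    """Entropy in bits of a probability vector."""
    p = np.asarray(p)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

p_x = np.array([1/3, 2/3])
# Placeholder channel matrix P(y_j / x_i) -- NOT the one from the question.
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])

p_y = p_x @ P          # P(y_j) = sum_i P(x_i) P(y_j / x_i)
print(H(p_y))          # H(Y) in bit/symbol for the placeholder matrix
```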