
Test: Channel Capacity - Electronics and Communication Engineering (ECE) MCQ


Test Description

10 Questions MCQ Test - Test: Channel Capacity

Test: Channel Capacity | 10 questions in 30 minutes | Mock test for Electronics and Communication Engineering (ECE) preparation.
Test: Channel Capacity - Question 1

Match List I with List II:

Choose the correct answer from the options given below:

Detailed Solution for Test: Channel Capacity - Question 1

Shannon–Hartley theorem:
It states the channel capacity C, i.e. the tightest theoretical upper bound on the rate at which information can be communicated at an arbitrarily low error rate using an average received signal power S through an analog communication channel that is subject to additive white Gaussian noise (AWGN) of power N.
Mathematically, it is defined as:
C = B log2(1 + S/N)
Where,
C = Channel capacity (bits/sec)
B = Bandwidth of the channel (Hz)
S = Signal power (W)
N = Noise power (W)
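As a quick numeric check of the formula, it can be evaluated directly in a few lines of Python (a minimal sketch; the 3.1 kHz bandwidth and 30 dB SNR are hypothetical example values, not taken from any question here):

```python
import math

def shannon_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Channel capacity C = B * log2(1 + S/N), in bits per second."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Hypothetical example: a 3.1 kHz voice channel at 30 dB SNR
snr = 10 ** (30 / 10)                    # 30 dB -> linear ratio of 1000
capacity = shannon_capacity(3100, snr)   # about 30.9 kbit/s
```

Note that the SNR must be converted from dB to a linear ratio before it is used in the formula.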
Bayes' theorem:
It states that the conditional probability of an event, based on the occurrence of another event, is equal to the likelihood of the second event given the first event multiplied by the probability of the first event, divided by the probability of the second event:
P(A/B) = [P(B/A) × P(A)]/P(B)
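A short numeric sketch of Bayes' theorem (all probabilities below are hypothetical example values chosen only for illustration):

```python
# Hypothetical numbers: a detector that fires with probability 0.95 when the
# event occurs (prior probability 0.01) and 0.05 when it does not.
p_event = 0.01
p_pos_given_event = 0.95
p_pos_given_no_event = 0.05

# Total probability of a positive result
p_pos = p_pos_given_event * p_event + p_pos_given_no_event * (1 - p_event)

# Bayes' theorem: P(event | positive) = P(positive | event) P(event) / P(positive)
p_event_given_pos = p_pos_given_event * p_event / p_pos
```

Despite the detector's high accuracy, the small prior keeps the posterior probability modest, which is exactly what the theorem quantifies.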
Parseval's Theorem:
For a continuous-time periodic signal x(t) with period T, Parseval's relation gives the average power as:
P = (1/T) ∫T |x(t)|² dt = Σk |ak|²
Where ak is the Fourier series coefficient of x(t), and T is the period of the signal.
The left-hand side is the average power in one period of the periodic signal x(t).
∴ |ak|² is the average power in the kth harmonic of x(t).
∴ Parseval's relation states that the total average power in a periodic signal equals the sum of the average powers in all of its harmonic components.
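Parseval's relation can be verified numerically for a simple periodic signal; the sketch below uses x(t) = 1 + 2cos(2πt), chosen because its Fourier coefficients are known exactly (a0 = 1, a±1 = 1, all others zero):

```python
import cmath
import math

# Sample one period of x(t) = 1 + 2*cos(2*pi*t)
N = 4096
xs = [1 + 2 * math.cos(2 * math.pi * n / N) for n in range(N)]

# Left side: average power over one period, (1/T) ∫T |x(t)|² dt
power_time = sum(v * v for v in xs) / N

# Right side: Σ|a_k|², with the Fourier coefficients approximated by a DFT
def coeff(k):
    return sum(xs[n] * cmath.exp(-2j * math.pi * k * n / N) for n in range(N)) / N

power_freq = sum(abs(coeff(k)) ** 2 for k in (-1, 0, 1))
# Both sides equal 3: |a0|² + |a1|² + |a-1|² = 1 + 1 + 1
```

Only the harmonics k = −1, 0, 1 contribute here, so summing over those three is exact for this signal.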

Test: Channel Capacity - Question 2

The information capacity (bits/sec) of a channel with bandwidth B and transmission time T is given by 

Detailed Solution for Test: Channel Capacity - Question 2

Shannon–Hartley theorem: 
It states the channel capacity C, i.e. the tightest theoretical upper bound on the rate at which information can be communicated at an arbitrarily low error rate using an average received signal power S through an analog communication channel that is subject to additive white Gaussian noise (AWGN) of power N.
Mathematically, it is defined as:
C = B log2(1 + S/N)
Where,
C = Channel capacity
B = Bandwidth of the channel
S = Signal power
N = Noise power
Since the capacity is in bits per second and the transmission lasts T seconds, the total information that can be conveyed is I = B·T·log2(1 + S/N) bits.

Test: Channel Capacity - Question 3

Two binary channels have been connected in a cascade as shown in the figure.

It is given that P(x1) = 0.7 and P(x2) = 0.3. Choose the correct option from below.

Detailed Solution for Test: Channel Capacity - Question 3

For a binary channel as shown below

P(yn) = Σm P(xm)P(yn/xm) , n = 1, 2 and m = 1, 2
Where,
P(yn) → Probability of receiving output yn
P(xm) → Probability of transmitting signal xm
P(yn/xm) → Probability of receiving output yn given that xm was transmitted
P(xm/yn) = [P(yn/xm)× P(xm)]/P(yn)
Where;
P(xm/yn) → Probability of xm transmitted given that yn was received
Calculation of P(z1);

The marked line shows the path from x1 to z1 (abc and aec) and x2 to z1 (dec and dbc)
Following these paths, we can calculate P(z1)
P(z1) = [P(x1){(0.6 × 0.4) or (0.4 × 0.7)}] or [P(x2){(0.3 × 0.4) or (0.7 × 0.7)}]
⇒ P(z1) = [0.7{(0.6 × 0.4) + (0.4 × 0.7)}] + [0.3{(0.3 × 0.4) + (0.7 × 0.7)}]
∴ P(z1) = 0.547
Similarly, P(z2) can be calculated:
P(z2) = [P(x1){(0.6 × 0.6) + (0.4 × 0.3)}] + [P(x2){(0.3 × 0.6) + (0.7 × 0.3)}] = 0.453
Calculation of P(x2/y1):
P(y1) = (0.7 × 0.6) + (0.3 × 0.3) = 0.51
By Bayes' theorem, P(x2/y1) = [P(y1/x2) × P(x2)]/P(y1) = (0.3 × 0.3)/0.51 ≈ 0.176
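The path arithmetic above can be reproduced with a short script; the transition probabilities below are the same ones used in the solution's calculation:

```python
# Transition probabilities as used in the solution's arithmetic:
# first channel P(y|x), second channel P(z|y); rows index the input symbol.
p_x = [0.7, 0.3]
p_y_given_x = [[0.6, 0.4],
               [0.3, 0.7]]
p_z_given_y = [[0.4, 0.6],
               [0.7, 0.3]]

# Total probability at each stage: P(y_n) = Σm P(x_m) P(y_n|x_m), then
# P(z_k) = Σn P(y_n) P(z_k|y_n)
p_y = [sum(p_x[m] * p_y_given_x[m][n] for m in range(2)) for n in range(2)]
p_z = [sum(p_y[n] * p_z_given_y[n][k] for n in range(2)) for k in range(2)]

# Bayes' theorem for P(x2 | y1)
p_x2_given_y1 = p_y_given_x[1][0] * p_x[1] / p_y[0]
```

Chaining the two transition matrices this way is equivalent to enumerating all the signal paths marked in the figure.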

Test: Channel Capacity - Question 4

The maximum rate at which nearly error-free data can be theoretically transmitted over a communication channel is defined as

Detailed Solution for Test: Channel Capacity - Question 4
  • Channel capacity is the maximum rate at which the data can be transmitted through a channel without errors.
  • The capacity of a channel can be increased by increasing channel bandwidth as well as by increasing signal to noise ratio.
  • Channel capacity (C) is given as,
C = B log2(1 + S/N)
Where,
B: Bandwidth
S/N: Signal to noise ratio
Entropy:
The entropy of a probability distribution is the average amount of information obtained when drawing from that distribution.
It is calculated as:
H = −Σi pi log2(pi)
Where pi is the probability of the occurrence of the ith symbol.
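A minimal sketch of the entropy calculation (the example distribution is hypothetical):

```python
import math

def entropy_bits(probs):
    """Shannon entropy H = -Σ p_i log2(p_i), in bits; 0·log(0) taken as 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

h = entropy_bits([0.5, 0.25, 0.25])   # 1.5 bits
```

A deterministic source (one symbol with probability 1) has zero entropy, while a uniform distribution over 2ⁿ symbols has the maximum of n bits.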

Test: Channel Capacity - Question 5

The Shannon limit for information capacity I is

Where:
N = Noise power (W)
B = Bandwidth (Hz)
S = Signal power (W) 

Detailed Solution for Test: Channel Capacity - Question 5

Shannon–Hartley theorem
It states the channel capacity C, i.e. the tightest theoretical upper bound on the rate at which information can be communicated at an arbitrarily low error rate using an average received signal power S through an analog communication channel that is subject to additive white Gaussian noise (AWGN) of power N.
Mathematically, it is defined as:
C = B log2(1 + S/N)
C = Channel capacity
B = Bandwidth of the channel
S = Signal power
N = Noise power
∴ C is a measure of the capacity of a channel: it is impossible to transmit information at a rate faster than C without error.

Test: Channel Capacity - Question 6

What is the capacity of an additive white Gaussian noise channel with a bandwidth of 1 MHz, signal power of 10 W, and noise power spectral density No/2 = 10⁻⁹ W/Hz?

Detailed Solution for Test: Channel Capacity - Question 6

Additive white Gaussian noise (AWGN) is a basic noise model used in information theory to mimic the effect of many random processes that occur in nature.
The modifiers denote specific characteristics: additive because it is added to any noise that might be intrinsic to the information system, white because its power spectral density is uniform across the band, and Gaussian because its amplitude follows a normal distribution.
The capacity of an additive white Gaussian noise channel by Shannon's formula:
C = B log2(1 + SNR)
Where B refers to the bandwidth of the channel, and SNR is the signal-to-noise ratio, i.e. the ratio of signal power to noise power in the channel.
SNR can now be calculated as:
SNR = S/(N0B)
In the above problem, S is the signal power, which is 10 W.
Bandwidth B = 1 MHz
Noise power spectral density No/2 = 10⁻⁹ W/Hz, so N0 = 2 × 10⁻⁹ W/Hz
Noise power N = N0B = 2 × 10⁻⁹ × 10⁶ = 2 × 10⁻³ W
SNR = 10/(2 × 10⁻³) = 5000
∴ C = 10⁶ × log2(1 + 5000) ≈ 12.29 Mbit/s
Test: Channel Capacity - Question 7

Shannon's theorem sets a limit on the

Detailed Solution for Test: Channel Capacity - Question 7

Shannon–Hartley theorem: 
It states the channel capacity C, i.e. the tightest theoretical upper bound on the rate at which information can be communicated at an arbitrarily low error rate using an average received signal power S through an analog communication channel that is subject to additive white Gaussian noise (AWGN) of power N.
Mathematically, it is defined as:
C = B log2(1 + S/N)
C = Channel capacity
B = Bandwidth of the channel
S = Signal power
N = Noise power
∴ C is a measure of the capacity of a channel: it is impossible to transmit information at a rate faster than C without error.
Thus, Shannon's theorem sets a limit on the maximum capacity of a channel with a given noise level.
The Shannon–Hartley theorem establishes the channel capacity of a finite-bandwidth continuous-time channel subject to Gaussian noise.
It connects Hartley's result with Shannon's channel capacity theorem in a form that is equivalent to specifying the M in Hartley's line rate formula in terms of a signal-to-noise ratio, but achieving reliability through error-correction coding rather than through reliably distinguishable pulse levels.
Bandwidth and noise affect the rate at which information can be transmitted over an analog channel.
Bandwidth limitations alone do not impose a cap on the maximum information rate because it is still possible for the signal to take on an indefinitely large number of different voltage levels on each symbol pulse, with each slightly different level being assigned a different meaning or bit sequence.
Taking into account both noise and bandwidth limitations, however, there is a limit to the amount of information that can be transferred by a signal of a bounded power, even when sophisticated multi-level encoding techniques are used.
In the channel considered by the Shannon–Hartley theorem, noise and signal are combined by addition. That is, the receiver measures a signal that is equal to the sum of the signal encoding the desired information and a continuous random variable that represents the noise.

Test: Channel Capacity - Question 8

Channel capacity is a measure of -

Detailed Solution for Test: Channel Capacity - Question 8

Channel capacity:

  • Channel capacity is the maximum rate at which the data can be transmitted through a channel without errors.
  • The capacity of a channel can be increased by increasing channel bandwidth as well as by increasing signal to noise ratio.
  • Channel capacity (C) is given as, 

Where,
B: Bandwidth
S/N: Signal to noise ratio

Entropy:
The entropy of a probability distribution is the average amount of information obtained when drawing from that distribution.
It is calculated as:
H = −Σi pi log2(pi)
Where pi is the probability of the occurrence of the ith symbol.

Test: Channel Capacity - Question 9

In a communication system, a given rate of information transmission requires channel bandwidth B1 and signal-to-noise ratio SNR1. If the channel bandwidth is doubled for the same rate of information, then the new signal-to-noise ratio will be

Detailed Solution for Test: Channel Capacity - Question 9

Shannon–Hartley theorem: It tells the maximum rate at which information can be transmitted over a communications channel of a specified bandwidth in the presence of noise.
C = B log2(1 + S/N)
Where C is the channel capacity in bits per second
B is the bandwidth of the channel in hertz
S is the average received signal power over the bandwidth
N is the average noise power
S/N is the signal-to-noise ratio (SNR)
For transmitting data without error, R ≤ C, where R = information rate.
From the Shannon–Hartley theorem, the maximum rate is Rmax = C = B log2(1 + SNR).
When the channel bandwidth is B1 and the signal-to-noise ratio is SNR1:
C1 = B1 log2(1 + SNR1)
When the bandwidth is doubled to 2B1 with a new signal-to-noise ratio SNR2, the same information rate requires:
B1 log2(1 + SNR1) = 2B1 log2(1 + SNR2)
⇒ 1 + SNR1 = (1 + SNR2)²
∴ SNR2 = √(1 + SNR1) − 1
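The bandwidth-doubling relation can be checked numerically (SNR1 = 1000 is a hypothetical example value):

```python
import math

snr1 = 1000.0                      # hypothetical starting SNR (linear)
snr2 = math.sqrt(1 + snr1) - 1     # SNR needed after doubling the bandwidth

# Both configurations carry the same rate (normalised per hertz of B1):
rate1 = math.log2(1 + snr1)        # bandwidth B1, SNR1
rate2 = 2 * math.log2(1 + snr2)    # bandwidth 2*B1, SNR2
```

Note how sharply the required SNR drops (from 1000 to about 30.6): extra bandwidth trades off against signal power, which is the practical message of the theorem.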

Test: Channel Capacity - Question 10

An ideal power-limited communication channel with additive white Gaussian noise has a 4 kHz bandwidth and a signal-to-noise ratio of 255. The channel capacity is:

Detailed Solution for Test: Channel Capacity - Question 10

Shannon's channel capacity is the maximum rate at which bits can be transferred error-free. Mathematically, this is defined as:
C = B log2(1 + S/N)
B = Bandwidth of the channel
S/N = Signal to noise ratio
Note: In the expression of channel capacity, S/N is not in dB.
Calculation:
Given B = 4 kHz and SNR = 255
Channel capacity will be:
C = 4000 × log2(1 + 255) = 4000 × log2(256) = 4000 × 8
C = 32 kbits/sec

