Shannon's Theorem gives an upper bound on the capacity of a link, in bits per second (bps), as a function of the available bandwidth and the signal-to-noise ratio of the link.
The Theorem can be stated as:
C = B * log2(1 + S/N)
where C is the achievable channel capacity, B is the bandwidth of the link, S is the average signal power, and N is the average noise power.
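The formula can be sketched directly in Python (the function name is illustrative, not from a particular library):

```python
from math import log2

def shannon_capacity(bandwidth_hz, snr_linear):
    """Shannon capacity C = B * log2(1 + S/N), in bits per second.

    bandwidth_hz: channel bandwidth B in hertz
    snr_linear:   signal-to-noise ratio S/N as a plain ratio (not dB)
    """
    return bandwidth_hz * log2(1 + snr_linear)
```

Note that the S/N argument here is the plain power ratio; a ratio given in dB must be converted first.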
The signal-to-noise ratio (S/N) is usually expressed in decibels (dB), given by the formula:
S/N (dB) = 10 * log10(S/N)
so, for example, a signal-to-noise ratio of 1000 is commonly expressed as
10 * log10(1000) = 30 dB.
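The dB conversion, in both directions, can be sketched as follows (helper names are illustrative):

```python
from math import log10

def snr_to_db(snr_linear):
    """Convert a plain S/N power ratio to decibels: 10 * log10(S/N)."""
    return 10 * log10(snr_linear)

def db_to_snr(snr_db):
    """Convert a decibel value back to a plain S/N power ratio."""
    return 10 ** (snr_db / 10)
```

For example, `snr_to_db(1000)` gives 30 dB, matching the example above, and `db_to_snr(30)` recovers the ratio 1000.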
[Graph: the relationship between C/B and S/N (in dB)]
Here are two examples of the use of Shannon's Theorem.
Telephone Line
For a typical telephone line with a signal-to-noise ratio of 30 dB and an audio bandwidth of 3 kHz, we get a maximum data rate of:
C = 3000 * log2(1001)
which is a little less than 30 kbps.
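The telephone-line figure can be checked numerically; a 30 dB ratio converts to S/N = 1000, so the argument of the logarithm is 1001:

```python
from math import log2

bandwidth_hz = 3_000          # 3 kHz audio bandwidth
snr_linear = 10 ** (30 / 10)  # 30 dB -> S/N = 1000
capacity = bandwidth_hz * log2(1 + snr_linear)  # a little under 30,000 bps
```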
Satellite TV Channel
For a satellite TV channel with a signal-to-noise ratio of 20 dB and a video bandwidth of 10 MHz, we get a maximum data rate of:
C = 10,000,000 * log2(101)
which is about 66 Mbps.
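The same check works for the satellite channel; 20 dB converts to S/N = 100:

```python
from math import log2

bandwidth_hz = 10_000_000     # 10 MHz video bandwidth
snr_linear = 10 ** (20 / 10)  # 20 dB -> S/N = 100
capacity = bandwidth_hz * log2(1 + snr_linear)  # roughly 66-67 Mbps
```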
In electrical engineering, computer science, and information theory, channel capacity is the tightest upper bound on the amount of information that can be reliably transmitted over a communications channel. By the noisy-channel coding theorem, the channel capacity of a given channel is the limiting information rate (in units of information per unit time) that can be achieved with arbitrarily small error probability.
Information theory, developed by Claude E. Shannon during World War II, defines the notion of channel capacity and provides a mathematical model by which one can compute it. The key result states that the capacity of the channel, as defined above, is given by the maximum of the mutual information between the input and output of the channel, where the maximization is with respect to the input distribution:
C = max over p(X) of I(X; Y)