SHANNON LIMIT FOR INFORMATION CAPACITY
The information capacity of a communications system
represents the number of independent symbols that can be carried through the system in a
given unit of time.
The theoretical data-rate limit of a communication channel was derived
by Claude Shannon in the 1940s:
"The maximum theoretical speed is dependent on the end-to-end
bandwidth of the system and on the signal-to-noise ratio of the channel being used."
Here C is the information capacity in bits per second, B is the channel bandwidth in hertz,
and S/N is the signal-to-noise ratio as an absolute power ratio (an S/N given in dB must
first be converted to a linear ratio):
C = B log2(1 + S/N)
or, equivalently,
C = 3.32 B log10(1 + S/N)
(Hartley: C = 3.32 log10(N), where N is the number of encoding levels per time
interval)
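To make the units concrete, here is a minimal Python sketch of the capacity formula (the function name shannon_capacity is our own label, not from any library; it assumes the S/N is supplied in dB and converts it to a linear power ratio first):

    import math

    def shannon_capacity(bandwidth_hz: float, snr_db: float) -> float:
        # Shannon limit: C = B log2(1 + S/N), where S/N must be a
        # linear power ratio, so convert the dB figure first.
        snr_linear = 10 ** (snr_db / 10)
        return bandwidth_hz * math.log2(1 + snr_linear)

    # A typical 3 kHz analog phone line with a 30 dB S/N:
    print(shannon_capacity(3000, 30))   # about 29,902 bps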
TRIVIA: In the 1940s, during World War II, Shannon established five criteria for how
difficult it is for code-breakers to break an encryption system. He said it depends on:
1. The amount of secrecy offered
2. The size of the encryption key
3. The simplicity of the enciphering and deciphering operations
4. The propagation of errors
5. The extent (length) of the message
(well, it probably looked more profound at the time)
On this basis, a typical analog phone line with 3 kHz of bandwidth should be able to carry data at rates up to about 32 kbps, while a really good analog line might reach about 38 kbps.
Ex. For a standard voice-band communications channel with a signal-to-noise power ratio of 30 dB and a bandwidth of 2.7 kHz, what is the Shannon limit?
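Worked solution (a sketch, following the formula above): an S/N of 30 dB corresponds to a power ratio of 10^(30/10) = 1000, so
C = 2700 log2(1 + 1000) ≈ 2700 × 9.97 ≈ 26.9 kbps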
Digital Modulation
1. Frequency Shift Keying (FSK)