Shannon capacity
From Wikipedia, the free encyclopedia.

In 1949, Claude Elwood Shannon and Warren Weaver wrote The Mathematical Theory of Communication (ISBN 0252725484), which founded the modern discipline of information theory. Information theory provides quantitative tools to measure the amount of information in a message, as well as a way to quantify the fundamental information-carrying capacity of a noisy channel.

The most famous formula to come from Shannon's work gives the capacity of an analog channel whose noise is modeled by a white Gaussian process. Shannon's law,

C = W log2(1 + S/N)

where
C = channel capacity in bits per second,
W = bandwidth in hertz,
S/N = signal-to-noise ratio,

shows that there is a theoretical maximum amount of information that can be transmitted over a bandwidth-limited channel in the presence of background noise.
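The formula is straightforward to evaluate numerically. The sketch below, using illustrative figures for a telephone-grade voice channel (roughly 3 kHz of bandwidth and a 30 dB signal-to-noise ratio, i.e. a linear ratio of 1000), is an assumption for the sake of example, not part of the original article:

```python
import math

def shannon_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Channel capacity C = W * log2(1 + S/N), in bits per second."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Illustrative telephone-line figures: 3 kHz bandwidth, linear S/N of 1000 (30 dB).
c = shannon_capacity(3000, 1000)
```

With these figures the capacity comes out just under 30 kbit/s, in line with the practical limits of voice-band modems.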

The signal-to-noise ratio above is in terms of the signal power per Hz as compared to the noise power per Hz. If the bandwidth (W) is increased while the physical power available for data transmission is held constant, then the signal power per Hz must decrease. It turns out that as the bandwidth goes to infinity, the signal-to-noise ratio goes to zero while the total capacity in the model above approaches a finite limit: with total signal power P and noise power N0 per Hz, the capacity W log2(1 + P/(N0 W)) tends to (P/N0) log2 e as W grows without bound.
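This finite limit can be checked numerically. In the sketch below the power figures P and N0 are arbitrary illustrative values (not from the article); the point is that capacity keeps rising with bandwidth but never exceeds (P/N0) log2 e:

```python
import math

P = 1.0    # total signal power in watts (illustrative value)
N0 = 1e-3  # noise power per Hz, in watts (illustrative value)

def capacity(w_hz: float) -> float:
    # Total power P is fixed, so signal power per Hz is P / W:
    # the per-Hz S/N falls as the bandwidth grows.
    return w_hz * math.log2(1 + P / (N0 * w_hz))

limit = (P / N0) * math.log2(math.e)  # finite limit as W -> infinity
caps = [capacity(w) for w in (1e3, 1e4, 1e5, 1e6)]
```

For these values the capacities increase monotonically toward the limit of about 1443 bits per second and are already within a fraction of a percent of it at W = 1 MHz.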
