From Wikipedia, the free encyclopedia.
In 1949, Claude Elwood Shannon and Warren Weaver published The Mathematical Theory of Communication (ISBN 0252725484), which founded the modern discipline of information theory. This discipline provides quantitative tools for measuring the amount of information in a message, as well as a way to quantify the fundamental information-carrying capacity of a noisy channel.
The most famous formula to come from Shannon's work gives the capacity of an analog channel whose noise is modeled by a white Gaussian process (Shannon's law):

    C = W log2(1 + S/N)

where C is the channel capacity in bits per second, W is the bandwidth in Hz, and S/N is the signal-to-noise power ratio.
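As a quick sanity check, the formula is easy to evaluate directly. The sketch below uses illustrative numbers (a 3 kHz voice-grade channel at 30 dB SNR) that are not from this article:

```python
import math

def awgn_capacity(bandwidth_hz, snr):
    """Shannon-Hartley capacity in bits/s of an AWGN channel.

    bandwidth_hz: channel bandwidth W in Hz
    snr: linear signal-to-noise power ratio S/N (not in dB)
    """
    return bandwidth_hz * math.log2(1 + snr)

# Illustrative example: a 3 kHz channel at 30 dB SNR (linear SNR = 1000)
# supports roughly 30 kbit/s.
print(awgn_capacity(3000, 1000))
```

Note that the SNR must be converted from decibels to a linear ratio (30 dB = 10^3 = 1000) before applying the formula.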
The signal-to-noise ratio above compares the signal power to the noise power within the bandwidth W; because white noise contributes a fixed power per Hz, the total noise power grows with W. If the bandwidth W is increased while the physical power available for data transmission is held constant, the signal power per Hz must decrease. It turns out that as the bandwidth goes to infinity, the signal-to-noise ratio goes to zero, and the total capacity in the model above approaches a finite limit: (S/N0) log2 e, where S is the total signal power and N0 is the noise power per Hz.