Information theory is a branch of the mathematical theory of probability and mathematical statistics that deals with the concepts of information and information entropy, communication systems, data transmission, rate distortion theory, cryptography, signal-to-noise ratios, data compression, error correction, and related topics. It is not to be confused with library and information science or information technology.
Claude E. Shannon (1916–2001) has been called "the father of information theory". His theory was the first to treat communication as a rigorously stated mathematical problem in statistics, and it gave communications engineers a way to determine the capacity of a communication channel in terms of the common currency of bits. The transmission part of the theory is not concerned with the meaning of the message conveyed, though the complementary wing of information theory concerns itself with content through lossy compression of messages subject to a fidelity criterion.
These two wings of information theory are joined together and mutually justified by the information transmission theorems, or source-channel separation theorems that justify the use of bits as the universal currency for information in many contexts.
It is generally accepted that the modern discipline of information theory began with the publication of Shannon's article "A Mathematical Theory of Communication" in the Bell System Technical Journal in July and October 1948. This work drew on earlier publications by Harry Nyquist and Ralph Hartley. In the process of working out a theory of communications that could be applied by electrical engineers to design better telecommunications systems, Shannon defined a measure of entropy:
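Shannon's measure is the entropy H = −Σᵢ pᵢ log₂ pᵢ of a discrete source, measured in bits when the logarithm is taken to base 2. A minimal sketch of this definition (the function name and example distribution here are illustrative, not from the original article):

```python
import math

def shannon_entropy(probs):
    """Entropy in bits of a discrete probability distribution:
    H = -sum(p * log2(p)), with the convention 0 * log2(0) = 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: exactly 1 bit of entropy.
print(shannon_entropy([0.5, 0.5]))   # → 1.0

# A certain outcome carries no information.
print(shannon_entropy([1.0]))        # → 0.0

# A biased coin carries less than 1 bit.
print(shannon_entropy([0.9, 0.1]))
```

The `p > 0` guard implements the standard convention that terms with zero probability contribute nothing to the sum.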
Entropy as defined by Shannon is closely related to entropy as defined by physicists. Boltzmann and Gibbs did considerable work on statistical thermodynamics, and that work was the inspiration for adopting the term entropy in information theory. There are deep relationships between entropy in the thermodynamic and informational senses. For instance, Maxwell's demon needs information to reverse thermodynamic entropy, and acquiring that information exactly balances the thermodynamic gain the demon would otherwise achieve.
See also: James Tenney.
- Claude E. Shannon, Warren Weaver. The Mathematical Theory of Communication. Univ of Illinois Press, 1963. ISBN 0252725484
- Claude E. Shannon's original paper
- On-line textbook: Information Theory, Inference, and Learning Algorithms, by David MacKay, which gives an entertaining and thorough introduction to Shannon theory, including state-of-the-art methods from communication theory such as arithmetic coding, low-density parity-check codes, and turbo codes.