Information Theory

L. Martignon, in International Encyclopedia of the Social & Behavioral Sciences, 2001

Information theory is the mathematical treatment of the concepts, parameters, and rules governing the transmission of messages through communication systems. It was founded by Claude Shannon toward the middle of the twentieth century and has since evolved into a vigorous branch of mathematics fostering the development of other scientific fields, such as statistics, biology, behavioral science, neuroscience, and statistical mechanics. The techniques used in information theory are probabilistic in nature, and some view information theory as a branch of probability theory. In a given set of possible events, the information of a message describing one of these events quantifies the number of symbols needed to encode the event in an optimal way. ‘Optimal’ means that the obtained code word will determine the event unambiguously, isolating it from all others in the set, and will have minimal length, that is, it will consist of a minimal number of symbols. Information theory also provides methodologies to separate real information from noise and to determine the channel capacity required for reliable transmission at a given rate.
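As an illustrative sketch of the quantity involved: in Shannon's framework, an event occurring with probability \(p\) carries \(-\log_2 p\) bits of information, and for a source emitting events with probabilities \(p_1, \ldots, p_n\) the average information per event is the entropy

\[ H = -\sum_{i=1}^{n} p_i \log_2 p_i , \]

which gives the lower bound on the average code-word length, in binary symbols, achievable by any uniquely decodable code for that source.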