Timeline of information theory
A timeline of events related to information theory, data compression, error-correcting codes and related subjects.
- 1872 – Ludwig Boltzmann presents his H-theorem, and with it the formula Σ pᵢ log pᵢ for the entropy of a single gas particle
- 1878 – J. Willard Gibbs defines the Gibbs entropy: the probabilities in the entropy formula are now taken as probabilities of the state of the whole system
- 1924 – Harry Nyquist discusses quantifying "intelligence" and the speed at which it can be transmitted by a communication system
- 1927 – John von Neumann defines the von Neumann entropy, extending the Gibbs entropy to quantum mechanics
- 1928 – Ralph Hartley introduces Hartley information as the logarithm of the number of possible messages, with information being communicated when the receiver can distinguish one sequence of symbols from any other (regardless of any associated meaning)
- 1929 – Leó Szilárd analyses Maxwell's demon, showing how a Szilard engine can sometimes transform information into the extraction of useful work
- 1940 – Alan Turing introduces the deciban as a measure of information inferred about the German Enigma machine cypher settings by the Banburismus process
- 1944 – Claude Shannon's theory of information is substantially complete
- 1947 – Richard W. Hamming invents Hamming codes for error detection and correction (to protect patent rights, the result is not published until 1950)
- 1948 – Claude E. Shannon publishes A Mathematical Theory of Communication
- 1949 – Claude E. Shannon publishes Communication in the Presence of Noise – Nyquist–Shannon sampling theorem and Shannon–Hartley law
- 1949 – Claude E. Shannon's Communication Theory of Secrecy Systems is declassified
- 1949 – Robert M. Fano publishes Transmission of Information (M.I.T. Press, Cambridge, Massachusetts) – Shannon–Fano coding
- 1949 – Leon G. Kraft discovers Kraft's inequality, which shows the limits of prefix codes
- 1949 – Marcel J. E. Golay introduces Golay codes for forward error correction
- 1951 – Solomon Kullback and Richard Leibler introduce the Kullback–Leibler divergence
- 1951 – David A. Huffman invents Huffman encoding, a method of finding optimal prefix codes for lossless data compression
- 1953 – August Albert Sardinas and George W. Patterson devise the Sardinas–Patterson algorithm, a procedure to decide whether a given variable-length code is uniquely decodable
- 1954 – Irving S. Reed and David E. Muller propose Reed–Muller codes
- 1955 – Peter Elias introduces convolutional codes
- 1957 – Eugene Prange first discusses cyclic codes
- 1959 – Alexis Hocquenghem, and independently the next year Raj Chandra Bose and Dwijendra Kumar Ray-Chaudhuri, discover BCH codes
- 1960 – Irving S. Reed and Gustave Solomon propose Reed–Solomon codes
- 1962 – Robert G. Gallager proposes low-density parity-check codes; they are unused for 30 years due to technical limitations
- 1965 – Dave Forney discusses concatenated codes
- 1966 – Fumitada Itakura (Nagoya University) and Shuzo Saito (Nippon Telegraph and Telephone) develop linear predictive coding (LPC), a form of speech coding[1]
- 1967 – Andrew Viterbi reveals the Viterbi algorithm, making decoding of convolutional codes practicable
- 1968 – Elwyn Berlekamp invents the Berlekamp–Massey algorithm; its application to decoding BCH and Reed–Solomon codes is pointed out by James L. Massey the following year
- 1968 – Chris Wallace and David M. Boulton publish the first of many papers on minimum message length (MML) statistical and inductive inference
- 1970 – Valerii Denisovich Goppa introduces Goppa codes
- 1972 – Jørn Justesen proposes Justesen codes, an improvement of Reed–Solomon codes
- 1972 – Nasir Ahmed proposes the discrete cosine transform (DCT), which he develops with T. Natarajan and K. R. Rao in 1973[2]; the DCT later becomes the most widely used lossy compression algorithm, the basis of multimedia formats such as JPEG, MPEG and MP3
- 1973 – David Slepian and Jack Wolf discover and prove the Slepian–Wolf coding limits for distributed source coding[3]
- 1976 – Gottfried Ungerboeck gives the first paper on trellis modulation; a more detailed exposition in 1982 leads to a raising of analogue modem POTS speeds from 9.6 kbit/s to 33.6 kbit/s
- 1976 – Richard Pasco and Jorma J. Rissanen develop effective arithmetic coding techniques
- 1977 – Abraham Lempel and Jacob Ziv develop Lempel–Ziv compression (LZ77)
- 1982 – Valerii Denisovich Goppa introduces algebraic geometry codes
- 1989 – Phil Katz publishes the .zip format, including DEFLATE (LZ77 + Huffman coding); it later becomes the most widely used archive container
- 1993 – Claude Berrou, Alain Glavieux and Punya Thitimajshima introduce Turbo codes
- 1994 – Michael Burrows and David Wheeler publish the Burrows–Wheeler transform, later to find use in bzip2
- 1995 – Benjamin Schumacher coins the term qubit and proves the quantum noiseless coding theorem
- 2003 – David J. C. MacKay shows the connection between information theory, inference and machine learning in his book Information Theory, Inference, and Learning Algorithms
- 2006 – Jarosław Duda introduces asymmetric numeral systems (ANS) entropy coding, later used in compressors such as Zstandard and LZFSE
- 2008 – Erdal Arıkan introduces polar codes, the first practical construction of codes that achieves capacity for a wide array of channels
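The entropy formula that threads through the early entries (Boltzmann, Gibbs, Shannon) can be illustrated with a minimal Python sketch; the function name is ours, chosen for this example:

```python
from math import log2

def shannon_entropy(probabilities):
    # H = -sum(p_i * log2(p_i)), measured in bits; terms with p = 0
    # contribute nothing, by the convention 0 * log 0 = 0.
    return -sum(p * log2(p) for p in probabilities if p > 0)

# A fair coin carries 1 bit per toss; a certain outcome carries 0 bits.
print(shannon_entropy([0.5, 0.5]))        # 1 bit
print(shannon_entropy([1.0]))             # 0 bits
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2 bits
```

The same quantity, with base-e logarithms and Boltzmann's constant as a prefactor, is the Gibbs entropy of 1878.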
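Hamming's 1947 construction, the earliest code in the timeline that corrects (not merely detects) errors, is small enough to sketch in full. Below is a Hamming(7,4) encoder/decoder in Python; the function names and bit ordering (parity bits at positions 1, 2 and 4) follow the textbook presentation, not any particular source:

```python
def hamming74_encode(d):
    """Encode 4 data bits [d1, d2, d3, d4] into a 7-bit codeword."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4          # parity over positions 3, 5, 7
    p2 = d1 ^ d3 ^ d4          # parity over positions 3, 6, 7
    p3 = d2 ^ d3 ^ d4          # parity over positions 5, 6, 7
    # codeword layout, positions 1..7: p1 p2 d1 p3 d2 d3 d4
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(c):
    """Correct up to one flipped bit, then return the 4 data bits."""
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]   # check positions 1, 3, 5, 7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]   # check positions 2, 3, 6, 7
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]   # check positions 4, 5, 6, 7
    pos = s1 + 2 * s2 + 4 * s3       # syndrome = 1-based error position
    c = list(c)
    if pos:
        c[pos - 1] ^= 1              # flip the erroneous bit back
    return [c[2], c[4], c[5], c[6]]

codeword = hamming74_encode([1, 0, 1, 1])
corrupted = list(codeword)
corrupted[2] ^= 1                    # flip one bit in transit
print(hamming74_decode(corrupted))   # [1, 0, 1, 1] — recovered
```

The syndrome trick (the three parity checks spelling out the error position in binary) is what makes the decoder constant-time; the same idea, scaled up, underlies the cyclic and BCH codes of the late 1950s.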
References
1. ISSN 1932-8346.
2. Nasir Ahmed. "How I Came Up With the Discrete Cosine Transform". Digital Signal Processing, vol. 1, no. 1, 1991, pp. 4–5.
3. ISSN 0018-9448.