__NOTOC__ Claude Shannon, an American electrical engineer and mathematician, is considered the founder of information theory.

==Shannon's contribution to the design of digital circuits==

Shannon graduated from the University of Michigan in 1936 with two bachelor's degrees, one in electrical engineering and one in mathematics. He began graduate study at the Massachusetts Institute of Technology, where he worked on [[Vannevar Bush]]'s differential analyzer, an analog computer.

While studying the circuits of the differential analyzer, Shannon realized that Boolean algebra and binary arithmetic could be used to simplify the arrangement of its electromechanical relays. He then turned the concept around and proved that arrangements of relays could be used to solve Boolean algebra problems. This use of relays, set out in Shannon's 1937 master's thesis, A Symbolic Analysis of Relay and Switching Circuits, is the basic concept underlying all electronic digital computers; it has been called the most important master's thesis of all time. A paper drawn from the thesis was published in 1938 in the Transactions of the American Institute of Electrical Engineers. Shannon's work became the foundation of practical digital circuit design once it became widely known in the electrical engineering community during and after World War II, and his theoretical rigor replaced the ad hoc methods that had previously prevailed.

[[Vannevar Bush]] suggested that Shannon carry out the work for his doctoral dissertation at Cold Spring Harbor Laboratory, developing similar mathematical relationships for Mendelian genetics. This resulted in Shannon's 1940 PhD thesis at MIT, An Algebra for Theoretical Genetics.

==Shannon's invention of information theory==

During World War II Shannon joined Bell Labs to work on fire-control systems and cryptography under a contract with the National Defense Research Committee (NDRC).
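Shannon's correspondence between relay arrangements and Boolean algebra can be illustrated with a short sketch (hypothetical modern code, not Shannon's own notation): contacts wired in series behave as AND, contacts wired in parallel as OR, and an algebraic identity shows where a relay can be eliminated.

```python
# Shannon's insight: a series connection of relay contacts conducts
# only when both contacts are closed (Boolean AND), while a parallel
# connection conducts when either contact is closed (Boolean OR).

def series(a, b):
    """Two contacts in series: the path conducts iff both are closed."""
    return a and b

def parallel(a, b):
    """Two contacts in parallel: the path conducts iff either is closed."""
    return a or b

# A redundant circuit: contact x in series with (x in parallel with y).
def redundant(x, y):
    return series(x, parallel(x, y))

# The Boolean identity x·(x + y) = x shows a single contact suffices,
# so one relay can be removed without changing the circuit's behavior.
for x in (False, True):
    for y in (False, True):
        assert redundant(x, y) == x
```

Exhaustively checking the truth table, as the loop does, is exactly how equivalence of two switching circuits can be verified.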
In 1945 the NDRC published a volume on fire control containing an essay titled Data Smoothing and Prediction in Fire-Control Systems, coauthored by Ralph Beebe Blackman, Hendrik Wade Bode, and Claude Shannon. This article formally treated the problem of fire control as a special case of the transmission, manipulation, and utilization of intelligence, formulating it in terms of signal processing and thus heralding the coming of the information age.

In 1948 Shannon published the two-part article A Mathematical Theory of Communication in the Bell System Technical Journal. This work focuses on the problem of how best to encode the information a sender wants to transmit, using probability theory developed by [[Norbert Wiener]]. Shannon used information entropy as a measure of the uncertainty in a message, essentially inventing the field of information theory. The 1948 article was reprinted in the book The Mathematical Theory of Communication, together with a popularization of the article by co-author Warren Weaver. Shannon's concepts were also popularized in John Robinson Pierce's Symbols, Signals, and Noise, which Shannon proofread.

==Shannon's subsequent activities==

In 1949 Shannon published Communication Theory of Secrecy Systems, a major contribution to the mathematical theory of cryptography, in which he proved that every theoretically unbreakable cipher must satisfy the same requirements as the one-time pad. He is also credited with introducing sampling theory, by which an analog signal is represented by a discrete set of samples; this theory enabled telecommunications to move from analog to digital transmission systems in the 1960s and later.

In 1950 Shannon created Theseus, a magnetic mouse controlled by a relay circuit that enabled it to move around a maze of 25 squares. The mouse was designed to search through the corridors until it found the target.
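The entropy measure at the heart of A Mathematical Theory of Communication has a compact form, H = -Σ pᵢ log₂ pᵢ, measured in bits per symbol. A minimal sketch (the function name is illustrative, not Shannon's):

```python
from math import log2

def entropy(probs):
    """Shannon entropy H = -sum(p_i * log2(p_i)), in bits per symbol.

    Terms with zero probability contribute nothing, by convention."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: one full bit per toss.
fair = entropy([0.5, 0.5])      # 1.0
# A biased coin is more predictable, so it carries less information.
biased = entropy([0.9, 0.1])    # about 0.469
```

The more predictable a source is, the fewer bits per symbol are needed to encode it, which is why entropy sets the limit for data compression.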
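The one-time pad requirements that Shannon analysed in Communication Theory of Secrecy Systems, namely a truly random key at least as long as the message and never reused, can be sketched with the usual XOR construction (a hypothetical minimal demo, not Shannon's own formulation):

```python
from secrets import token_bytes

def one_time_pad(message: bytes, key: bytes) -> bytes:
    """XOR each message byte with a key byte.

    Because XOR is its own inverse, the same operation decrypts."""
    assert len(key) >= len(message), "key must be at least message-length"
    return bytes(m ^ k for m, k in zip(message, key))

plaintext = b"ATTACK AT DAWN"
key = token_bytes(len(plaintext))        # random, message-length, used once
ciphertext = one_time_pad(plaintext, key)
assert one_time_pad(ciphertext, key) == plaintext
```

If any of the three requirements is violated, for example by reusing the key, the perfect secrecy Shannon proved for this scheme no longer holds.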
Having travelled through the maze, the mouse could then be placed anywhere it had been before, and because of its prior experience it would go directly to the target. Shannon's mouse appears to have been the first learning device of its kind.

Also in 1950, Shannon published a groundbreaking paper on computer chess, Programming a Computer for Playing Chess, which describes how a machine or computer could be made to play a reasonable game of chess using a minimax procedure.

Shannon extended information theory to natural language processing and computational linguistics in his 1951 article Prediction and Entropy of Printed English, which demonstrates that treating white space as the 27th letter of the alphabet actually lowers the uncertainty of written language.

Shannon and his wife Betty also went on weekend forays to Las Vegas with M.I.T. mathematician Edward Thorp, making a fortune at roulette and blackjack using game-theoretic methods co-developed with fellow Bell Labs associate, the physicist John L. Kelly Jr. Shannon and Thorp invented what is considered the first wearable computer, which they used when playing roulette, and later applied the same theory, now known as the Kelly criterion, to the stock market with even better results.

==Links==

*http://en.wikipedia.org/wiki/Claude_Shannon

[[Category:Post-War Cybernetics]]
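The minimax procedure Shannon proposed for chess can be sketched in the abstract: the machine assumes each side plays its best move, so it maximizes its own score at its turns and minimizes the opponent's best reply at theirs. The toy game tree and helper names below are invented for illustration; they are not from Shannon's paper.

```python
def minimax(state, depth, maximizing, evaluate, moves, apply_move):
    """Return the minimax value of `state`, searching `depth` plies ahead."""
    legal = moves(state)
    if depth == 0 or not legal:
        return evaluate(state)          # score the position at the horizon
    values = (minimax(apply_move(state, m), depth - 1, not maximizing,
                      evaluate, moves, apply_move) for m in legal)
    return max(values) if maximizing else min(values)

# A tiny hypothetical game tree; leaf scores favor the maximizing player.
tree = {"root": ["a", "b"], "a": ["a1", "a2"], "b": ["b1", "b2"]}
scores = {"a1": 3, "a2": 5, "b1": 2, "b2": 9}

def moves(s): return tree.get(s, [])
def evaluate(s): return scores.get(s, 0)
def apply_move(s, m): return m

# The maximizer picks branch "a": the opponent can hold it to 3 there,
# but to only 2 in branch "b".
assert minimax("root", 2, True, evaluate, moves, apply_move) == 3
```

Shannon's actual proposal applied this idea to chess positions with a handcrafted evaluation function over material and mobility; real programs add pruning, but the recursion is the same.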
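The Kelly criterion mentioned above prescribes the fraction of bankroll to stake on a favourable repeated bet so as to maximize the long-run growth rate. A minimal sketch using the standard formula f* = (p(b+1) − 1)/b, with illustrative numbers:

```python
def kelly_fraction(p, b):
    """Kelly fraction f* = (p*(b+1) - 1) / b for win probability p
    and net fractional odds b (a win returns b units per unit staked)."""
    return (p * (b + 1) - 1) / b

# An even-money bet (b = 1) won 60% of the time: stake 20% of bankroll.
f = kelly_fraction(0.6, 1.0)    # 0.2
# A fair bet with no edge (p = 0.5 at even money) gets a zero stake,
# and an unfavourable bet yields a negative fraction: do not bet.
```

Staking more than f* increases volatility without increasing long-run growth, which is why the criterion is framed as an optimum rather than a maximum tolerable bet.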