What do the following have in common: digital communication, juggling machines, a mechanical maze-solving mouse, motorized pogo sticks, and a machine that solves the Rubik's Cube puzzle? The answer is the great American mathematician and engineer Claude Shannon, who invented all of them.
Before introducing the subject to the readers, let me portray two common scenes from our daily lives. Scene 1: Pick up your favorite music or video CD and scratch it against a rough surface. Now slide it into the slot of a CD player and listen to your favorite music or watch your favorite movie. Contrary to your expectations, you will notice to your astonishment that the music and the film come out almost as crystal clear as they were before the disc was scratched.
Scene 2: The man in the street (the aam aadmi, the chaiwala, the rickshawala, the doodhwala) using a mobile phone in an innovatively created advertisement for a mobile service provider, which exemplifies the ever-increasing number of mobile users even among the lower strata of society. The above two scenes are not hypothetical; they are now a reality. They have resulted from the growth of information and communication technology, which has made an immense impact on human society.
Before moving on with the rest of your day, spare a moment to think about the man whose revolutionary ideas made both the above scenes, once seemingly improbable, a reality: Claude Elwood Shannon. A CD player is able to reproduce music or a movie from a scratched CD without distortion largely because of the unique error-correcting codes incorporated in the disc. Shannon was one of the all-time great scientist-mathematicians of the mid-twentieth century. The revolutionary developments in modern communication technology also owe a debt to Shannon, for it was he who introduced the concept of digital communication.
Foundation for Digital Technology
Claude Elwood Shannon, a distant relative of the legendary Thomas Alva Edison, is regarded as the father of modern digital communication and information theory, and was one of the most outstanding scientists of the 20th century. His work in the 1940s served as the foundation for digital communication technology. Combining mathematical theory with engineering principles, he set the stage for the development of the digital computer and the modern digital communication revolution. He was the first person to realize that any message, be it voice, text, movies, pictures or data, could be transmitted as a series of 0's and 1's over a communication channel. The information content of a message, he theorized, consists simply of the number of 0's and 1's it takes to transmit it. In his Master's thesis, Shannon showed that these binary digits could be represented by the two basic states of an electrical switch: a switch turned ON represented the digit 1, and a switch turned OFF represented the digit 0. Using Boolean algebra, he showed that complex operations could be performed automatically by such electrical circuits, thus manipulating the data they were storing.
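Shannon's insight can be illustrated with a minimal sketch (in Python, purely for illustration; Shannon, of course, reasoned about relay circuits and Boolean algebra, not software). Two switches in series behave like AND, two in parallel like OR, and from such gates one can build circuits that do arithmetic, such as a half-adder that adds two binary digits:

```python
# Model each "switch" as a Boolean: True = ON (digit 1), False = OFF (digit 0).

def AND(a, b):
    # Two switches in series: current flows only if both are closed.
    return a and b

def OR(a, b):
    # Two switches in parallel: current flows if either is closed.
    return a or b

def NOT(a):
    # A relay contact that opens when the coil is energized.
    return not a

def XOR(a, b):
    # Exclusive OR, composed from the basic gates above.
    return OR(AND(a, NOT(b)), AND(NOT(a), b))

def half_adder(a, b):
    """Add two binary digits; returns (sum_bit, carry_bit)."""
    return XOR(a, b), AND(a, b)

# 1 + 1 = binary 10: sum bit is 0, carry bit is 1.
print(half_adder(True, True))   # (False, True)
```

The gate names and the half-adder are standard textbook constructions used here as an example, not circuits taken from Shannon's thesis.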
Although today we use this mathematics all the time in designing the digital circuits that form the basis of modern computers and telecommunication systems, the adoption of Shannon's concept in digital communication was not easily forthcoming. To most communications engineers of his time, signals were quite obviously 'analog', and the concept of digital signals had yet to sink in. With the invention of the transistor at Bell Labs in the late 1940s, things began to change. The transistor, coupled with Shannon's remarkable theorems telling communications engineers what ultimate goals to strive for, and integrated circuits providing ever-improving hardware to realize those goals, allowed the incredible digital communications revolution to take shape. Electronics and communications engineers gradually adopted Shannon's revolutionary ideas, stimulating the digital communication technology that has led to today's information age.
Adopting the bit
Shannon liberated the "entropy" of thermodynamics from physics and redefined it as a measure of the uncertainty of a probability distribution. He also gave us the term 'bit' (binary digit), now one of the most common words in computing, using it for the first time in his paper "A Mathematical Theory of Communication", published in 1948. Shannon defined a 'bit' as the amount of information gained (or entropy removed) upon learning the answer to a question whose two possible answers were equally likely. In fact, the framework and terminology that he introduced for information theory remain standard even today. All communication lines today are measured in 'bps' (bits per second), a notion that Shannon made precise with his concept of channel capacity. His theory also made it possible to measure in bits the computer storage needed for pictures, voice streams and other data. "Nobody had come close to this idea before," said Massachusetts Institute of Technology (MIT) Professor Emeritus Robert G. Gallager, who worked with Shannon. "This was not something somebody else would have done for a very long time," he added.
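Shannon's definition can be made concrete. For a source whose symbols occur with probabilities p₁, p₂, …, the entropy is H = −Σ pᵢ log₂ pᵢ, measured in bits; a fair coin toss carries exactly one bit, matching his definition of the bit as the answer to an equally likely two-way question. A short sketch (Python used only for illustration):

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over nonzero probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))    # fair coin: 1.0 bit
print(entropy([0.9, 0.1]))    # biased coin: roughly 0.47 bits -- less uncertainty
print(entropy([0.25] * 4))    # fair four-sided die: 2.0 bits
```

Note how the biased coin carries less than one bit per toss: the less uncertain the outcome, the less information its revelation conveys, which is exactly the intuition Shannon's formula captures.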
Starting in 1936, Shannon worked at MIT with Vannevar Bush's "differential analyzer", the ancestral analog computer. After another summer (1940) at Bell Labs, he spent the academic year working under the famous mathematician Hermann Weyl at the Institute for Advanced Study in Princeton, where he also began thinking about recasting communications on a proper mathematical foundation. Shannon was also a noted cryptographer. He was affiliated with Bell Laboratories from 1941 to 1956, and during this 15-year stay he initially worked on projects related to the war effort, including secrecy systems. His team's work on anti-aircraft directors, devices that observe enemy planes or missiles and calculate the trajectory of counter-missiles, became crucial when German rockets were used in the bombardment of England. His masterly contribution to the field of cryptography can best be seen in the classified report he wrote in 1945, "A Mathematical Theory of Cryptography", which was finally declassified and published in 1949 in the Bell System Technical Journal as "Communication Theory of Secrecy Systems".
Error correcting codes
To appreciate the importance of Shannon's work, one has only to look at the technology used in the transmission and reception of data from interplanetary probes. Modern communications technology allows satellites and space probes to transmit pictures and data millions of kilometers back to Earth without any significant loss, using antennas that consume very little power. The principal method employed in transmitting signals, data and pictures through the noisy vacuum of space relies on some intriguing codes known as 'error-correcting codes'. These codes contain extra bits of binary data that allow the receiver to determine whether the data received is exactly the same as that transmitted, and, if it is not, ideally to reconstruct the correct version at the receiving end. The concept of such error-correcting codes goes back to 1948, the year Shannon published the paper that established the maximum theoretical rate at which information can be transmitted without error. Ever since that path-breaking 1948 paper, coding theorists have struggled to find practical codes that perform close to Shannon's theoretical limit. The information stored on computers, CDs and DVDs is protected with the same types of error-correcting codes used in the transmission of information. Shannon also pioneered the study of 'source coding', or 'data compression', another important method of modern communications. Shannon left Bell Labs to join MIT in 1958 and served as Donner Professor of Science until his retirement in 1978. For decades, MIT continued to be the leading university for information and communication theory.
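The simplest error-correcting code, far weaker than the codes actually used on CDs and space probes but enough to show the principle described above, is the triple-repetition code: the sender transmits each bit three times, and the receiver takes a majority vote over each group, so any single flipped bit per group is corrected. A sketch in Python (an illustrative textbook example, not a code taken from Shannon's paper):

```python
def encode(bits):
    """Triple-repetition code: transmit every bit three times."""
    return [b for b in bits for _ in range(3)]

def decode(received):
    """Majority vote over each group of three received bits.

    Corrects any single flipped bit within a group.
    """
    return [1 if sum(received[i:i + 3]) >= 2 else 0
            for i in range(0, len(received), 3)]

message = [1, 0, 1, 1]
sent = encode(message)           # [1,1,1, 0,0,0, 1,1,1, 1,1,1]
sent[4] = 1                      # channel noise flips a bit in the second group
print(decode(sent) == message)   # True: the error has been corrected
```

The price of this protection is a threefold drop in transmission rate; Shannon's 1948 result showed that far more efficient codes must exist, and the decades-long search for them produced the codes that protect CDs and deep-space links today.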
In recognition of his pioneering contributions to the field of information and communication theory, the information theory group of the Institute of Radio Engineers (IRE), which later became the Information Theory Society of the Institute of Electrical and Electronics Engineers (IEEE), established the Shannon Award as its highest honor.
Shannon was also a great inventor, a trait he inherited from his grandfather. He designed and built several intriguing machines, including a chess-playing machine, a juggling machine, motorized pogo sticks, a mind-reading machine, a mechanical mouse that could navigate a maze, and a device that could solve the famous Rubik's Cube puzzle. He was also an avid chess player.
Medals and Honours
Despite Shannon's immense contributions to society, he did not receive the coveted Nobel Prize, as most of his contributions were in fields that do not qualify for it. But he received many other awards: the Morris Liebmann Memorial Award in 1949, the Stuart Ballantine Medal in 1955, the Mervin J. Kelly Award of the American Institute of Electrical Engineers in 1962, the U.S. National Medal of Science in 1966, the IEEE Medal of Honour in 1966, Israel's Harvey Prize in 1972, the Jacquard Award in 1978, the John Fritz Medal in 1983 and Japan's Kyoto Prize in 1985. He was a member of the American Philosophical Society, the National Academy of Sciences, the Royal Society of London and the Leopoldina Academy. During the last few years of his life, Shannon was afflicted with Alzheimer's disease; he died on 24 February 2001 at Medford, Massachusetts.

The author works at IGNOU, Regional Centre, Srinagar as an Information Technology Consultant.