The origins of modern information technology – On the 100th birthday of Claude Shannon
One trait that very few people appreciate is conceitedness. Those who think too highly of themselves are considered arrogant and detached, and quite often arrogance comes paired with ignorance, which makes for a particularly unpleasant combination. The German language has a word for this trait: “Einbildung”. Its Latin counterpart (from “informare”, to form or shape within), by contrast, is today regarded as the measure of objective knowledge: “information”. We obtain it, process it, transfer it, and store it. Information forms the core of the digital world, and information technologies shape an essential part of our modern lives. Our era is therefore also referred to as the “age of information”. Some physicists even consider information a fundamental property of nature, alongside energy and matter. The famous physicist John Wheeler coined the catchy phrase “it from bit” (every “it” arises from information). In a subtle way, information is linked to physical energy: deleting or transferring information necessarily requires energy (or increases entropy), which is why even a microscopic intelligence that can process information (a so-called “Maxwell’s demon”) cannot violate the second law of thermodynamics.
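This link between information and energy can be made quantitative. A well-known result along these lines (a later development, not part of Shannon’s own work) is Landauer’s principle: erasing a single bit of information in an environment at temperature T dissipates at least

    E_{\min} = k_B T \ln 2

where k_B is Boltzmann’s constant. It is precisely this unavoidable energy cost that prevents a Maxwell demon, which must eventually erase its memory, from beating the second law.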
Without much public attention, we are these days celebrating the 100th birthday of the father of modern information theory, to whom we owe the insight that information can be described mathematically and physically, which ultimately creates the basis of all information technology. Claude Shannon was born on 30 April 1916 in Petoskey, a town on the eastern shore of Lake Michigan (he died on 24 February 2001). In 1948 he published the roughly 50-page essay “A Mathematical Theory of Communication”, which is today regarded as the Bible of the information age and constitutes one of the most important and influential scientific works of the 20th century. By quantifying the content of information, Shannon’s work gave information theory a profound mathematical foundation, detaching it from any specific content or semantic aspect. Rarely has a mathematical work attained such rapid and significant practical relevance.
His essay dealt specifically with the question of the conditions under which information, encoded by a transmitter and sent over a real (i.e. noisy) communication channel, can be restored, i.e. decoded, without loss. In more concrete terms: how can information be encoded such that it can be safely transported by radio waves over long distances? Information is necessarily tied to a material carrier (which here includes electromagnetic waves), and Shannon realized that it can generally not be transferred without some loss (or, at best, while staying the same). The greater the choice available to the sender, the greater the uncertainty on the part of the recipient, the greater the information content of the transmitted message, and the greater the potential for loss in transmission. Shannon’s formulas made it possible to calculate the maximum amount of information that can be transmitted over a given channel. They also showed that the transmission can be protected against interference by adding redundancy (check codes). Shannon succeeded in capturing the information content of a message in a closed mathematical formula. Information here refers to the probability of occurrence of certain sequences of elements (e.g. a sequence of letters) drawn from a defined set (e.g. the alphabet). In all this, Shannon realized that the information so defined bears a surprising structural resemblance to the well-known concept of entropy in physics. And since entropy always increases in thermodynamically closed systems, information in such systems is eventually destroyed as well. Information is thus a kind of physical quantity.
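The closed formula in question is Shannon’s entropy. For a source that emits symbols with probabilities p_1, …, p_n, the average information content per symbol is

    H = -\sum_{i=1}^{n} p_i \log_2 p_i

measured in shannon (bits) per symbol. Likewise, the maximum error-free rate C of a channel of bandwidth B perturbed by Gaussian noise with signal-to-noise ratio S/N is given by the Shannon–Hartley theorem:

    C = B \log_2\!\left(1 + \frac{S}{N}\right)

As a minimal sketch of what the entropy formula measures (the function name here is ours, chosen for illustration), consider:

    from math import log2

    def shannon_entropy(probabilities):
        """Average information content of a source, in shannon (bits) per symbol."""
        return -sum(p * log2(p) for p in probabilities if p > 0)

    print(shannon_entropy([0.5, 0.5]))    # fair coin: 1.0 shannon per toss
    print(shannon_entropy([0.9, 0.1]))    # biased coin: ~0.47, far less surprise
    print(shannon_entropy([1/26] * 26))   # uniform 26-letter alphabet: ~4.7 shannon per letter

A rare, surprising message carries much information; a predictable one carries little, and that is exactly what the formula expresses.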
Shannon provided the tools to systematically assess, process, and transmit data and information. And for what other purpose were computers built? It was Shannon’s work that made it possible to build the first modems for data transmission over telephone lines in the late 1960s, precursors of the computer networks developed by the US military research agency DARPA, which ultimately spawned the Internet. Shannon also laid the formal foundations of cryptography and lifted it to the rank of an independent science. Its importance for today’s Internet communication can hardly be overestimated. Who else but Claude Shannon could therefore be called the “grandfather of the Internet”?
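One concrete result of this foundational work: in his 1949 paper “Communication Theory of Secrecy Systems”, Shannon proved that the one-time pad achieves perfect secrecy, meaning the ciphertext reveals nothing at all about the message, provided the key is truly random, as long as the message, and never reused. A minimal sketch of the scheme (variable and function names are ours, for illustration only):

    import secrets

    def xor_bytes(a, b):
        """Combine message and key byte by byte; XOR is its own inverse."""
        return bytes(x ^ y for x, y in zip(a, b))

    message = b"ATTACK AT DAWN"
    key = secrets.token_bytes(len(message))  # random, as long as the message, used once

    ciphertext = xor_bytes(message, key)     # encrypt
    recovered = xor_bytes(ciphertext, key)   # decrypting applies the same operation
    assert recovered == message

The elegance of Shannon’s result lies not in the mechanism, which is trivial, but in the proof that no attacker, however powerful, can extract information from the ciphertext alone.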
Shannon’s work was quickly understood and appreciated not only by mathematicians but also by radio engineers, biologists, psychologists, physicians, linguists, and other scientists. His information theory was seen as unifying linguistics, the life sciences, and the physical sciences, and eventually even attracted broader public attention (which earned Shannon appearances on TV shows). In his honor, the information content of a message is today measured in units of “shannon”.
Shannon was interested in many things and displayed a great deal of creativity. Alongside his mathematical work he built all kinds of devices, such as juggling machines, rocket-powered Frisbees, unicycles with eccentric axles, one of the very first chess computers, and finally an “Ultimate Machine”: a small box with a single switch which, when turned on, opens a lid from which a hand emerges that turns the switch back to “off”. Shannon is said to have ridden a unicycle through the office corridors while juggling. With all his creativity, his never-ending curiosity, his sense of humor, and his mathematical brilliance, the father of modern information theory was likely anything but “conceited” – or subject to “Einbildung”.