Issues in Science and Technology Librarianship
Spring 2011
DOI: 10.5062/F4D798BF |
James Gleick. The Information: A History, a Theory, a Flood. New York: Pantheon, 2011.
What do electrical circuits have in common with African talking drums and the human genome? All of these -- plus the alphabet, logarithms, vacuum tubes, quantum particles, Morse code, and more -- are carriers of information. James Gleick's latest book explores this idea thoroughly through a history of the development of information theory; the Prologue includes the observation that "information is what our world runs on: the blood and the fuel, the vital principle. It pervades the sciences from top to bottom, transforming every branch of knowledge. Information theory began as a bridge from mathematics to electrical engineering and from there to computing. …Now even biology has become an information science, a subject of messages, instructions, and code."
The 400-plus-page tome begins with a chapter on African "talking drums," a complex and poetic long-distance communication method that, like many of the other subjects addressed in this work, 19th-century Europeans did not immediately recognize as an information carrier. The narrative continues with chapters about early dictionaries and moves into what the casual reader would most likely think of as the beginnings of "information technology": the telegraph and Morse code, the "difference engine" of Charles Babbage, the "differential analyzer" of Vannevar Bush, Alan Turing's Universal Machine, and the invention of binary programming. The "bit," and later the "byte," enter the scene, allowing what was previously unquantifiable to be measured. As forecast by Gleick's opening pronouncement, the subject of genetics gets a chapter in the context of information's physical nature, as do quantum physics, memes, and finally, the engine of information overload that is the World Wide Web.
Although not a biography, The Information focuses on a protagonist critical to information theory: Claude Shannon, mathematician, engineer, WWII codebreaker, Bell Labs employee, and author of the first publication with "information theory" in the title. Working during a period of rapid advances in the field, Shannon serves as a human representation of the information theory he helped to develop. Other key figures, such as Babbage and Turing as well as Ada Lovelace, Richard Dawkins, and Stephen Hawking, flesh out the narrative until the final chapters, in which the actors must be represented by Wikipedia editor code-names or trademarked web URLs.
Like such recent works as Bill Bryson's A Short History of Nearly Everything or any of Simon Winchester's books, The Information is written in a clear, uncomplicated style that neither dumbs down its subject matter nor flinches at its complexity. My only complaint about the book is the lack of endnote markers in the text. Presumably meant to allow a smoother reading experience, this style choice makes it difficult to match passages to the intriguing source information provided at the end. However, this is a small quibble. The book would particularly appeal to someone working in a field related to information technology, but any reader with an interest in history who lives in our information-saturated world will find it worthwhile. I'd recommend it for any academic library and for most public or special libraries; it might also be appropriate for high school libraries, especially those supporting advanced classes in science and math.