By Josef Eisinger©
After musing in these pages about the evolution of life on Earth (WestView News, Feb. 2019) and about the rapid pace of technological innovations (WestView News, Jan. 2018), I propose in this article to select five technological landmarks that arguably had the greatest influence on human behavior in their wake. This is, admittedly, a somewhat foolhardy and speculative endeavor, since technological innovations are rarely solitary events but are themselves the culmination of skills acquired previously by many individuals, and because shifts in human behavior may well have more than one cause. Undeterred, I hope that few will quarrel with the selections I have made.
Changes in our ancestors’ genetic make-up occurred, of course, over many generations, often over millions of years, whereas the consequences of new technologies can manifest themselves very rapidly, particularly in the case of the most recent innovations. They can alter human behavior and perturb a society’s equilibrium very quickly, as witnessed by the role social media has played in recent politics.
We know from the genetic analysis of different species that our ancestors diverged from our closest relatives, the chimpanzees, approximately six million years ago. We also know from hominid fossil finds that this divergence took place over a few million years, in the course of which our forebears underwent a number of anatomical modifications that facilitated their walking upright and endowed them with lithe hands and large braincases. These modifications may well be related to the earliest technological advances made by hominids: their controlled use of fire and tool-making. We have learned from the sparse archeological evidence that hominids were tending campfires and fashioning stone tools about one million years ago. We do not know when speech, another important skill, first emerged, but one may speculate that it had its beginnings at about that time.
The control over fire can rightly be considered humanity’s first technological milestone, since it affected Stone Age hominid societies profoundly: Fire allowed them to settle in regions with colder climates; it afforded them light at night, as well as heat; and it bestowed on them the social, nutritional, and culinary benefits of shared cooked food. Fire-hardened wooden weapons made their hunting more efficient, while their cave paintings and sculptured figurines from 30,000 to 40,000 years ago bear witness to their artistic instinct. Archeologists also tell us that about 8,000 years ago, following the agricultural revolution that made larger populations possible, the mud-brick settlement of Çatalhöyük in Anatolia was home to several thousand inhabitants, as were similar settlements elsewhere in the Near East.
Not much later (about 5000 BCE), humanity reached its second technological milestone: the bellows-driven, forced-air furnace, which attained temperatures high enough to smelt metal ores. Lead was probably the first metal to be extracted from its mineral, galena (PbS), since lead melts at a mere 327 °C, and there is good evidence that at about the same time copper was smelted in eastern Serbia and elsewhere. The metal genie was now out of the bottle; the discovery that bronze, a copper-tin alloy, was much harder than copper ushered in the Bronze Age, which was eventually followed by the Iron Age. Although the extraction of iron requires higher temperatures (about 1,250 °C) and more complicated procedures than those used to extract lead or copper, wrought-iron tools and weapons were in widespread use all over the Mediterranean region and the Near East by 1000 BCE.
We have now reached the Classical Age, in which literature and written historical records had their beginnings. Technologically, however, it was a fairly static era, as can be seen from the tools, weapons, and machinery of the Middle Ages, which differed little from those used in ancient Rome. Burgeoning human society, on the other hand, continued to evolve, and it spawned the skills and the conditions that produced the third technological milestone: Johannes Gutenberg’s movable metal-type printing press. The press he designed (about 1450 CE) reduced the cost of printed books so dramatically that books became widely available for the first time. Within 50 years of Gutenberg’s invention, there were printing enterprises in hundreds of cities, and the books they churned out profoundly affected public attitudes towards religion and politics. Books were equally important in spreading music, the law, and literature among the ever more literate population. It is, indeed, generally accepted that printed books and periodicals played a leading role in initiating the Age of Enlightenment and, with the advent of steam power and railroads, the Industrial Revolution.
Medicine, which had wallowed in ignorance and superstition for millennia, was finally put on a more rational basis thanks to the proliferation of medical books and journals beginning in the 17th century. The same was true of science: the first scientific journals, Le Journal des Sçavans and the Philosophical Transactions of the Royal Society, were published in 1665, and they were soon joined by many other journals that became the conduits of scientific dialogue. Modern, experiment-based science flourished in the 18th and 19th centuries, and since it is this article’s mission to identify events that had sweeping societal consequences, I have chosen Maxwell’s equations of electromagnetism as the fourth milestone.
Magnetic minerals (lodestones) and static electricity had been known since antiquity and had puzzled curious investigators ever since. It was not until the 19th century, however, that it was discovered that electricity and magnetism are intimately connected, and that light is an electromagnetic wave. James Clerk Maxwell used the insights gained from experiments conducted by Michael Faraday, Hermann von Helmholtz, and others to formulate the laws of electromagnetism mathematically (in 1862). The electromagnetic force described by his equations was only the second natural force to be given such a mathematical description; it followed Newton’s description of gravitation by some 175 years. (Albert Einstein devoted the second half of his life to searching for a connection between these two force fields.)
Maxwell’s equations predicted the existence of electromagnetic waves that travel at the speed of light, and in the late 1880s Heinrich Hertz confirmed their existence experimentally. Although he was of the opinion that his discovery had no practical uses, radio broadcasting flowered just 30 years later, as did telephony, electric motors, and the electric grid, which we have come to take for granted. The Age of Electricity had arrived, and it flourishes to this day.
Semiconductors, halfway between conductors and insulators, were a somewhat neglected area of research, yet they turned out to hold the key to the digital world we now live in. The electrical current flowing in a semiconductor such as a silicon crystal is small compared to the currents in metals, but it can be manipulated (for example, switched on or off) by applying a second electrical signal to the crystal. Beginning in late 1947, experiments that demonstrated this trick in crystalline semiconductor devices were taking place at Bell Labs in New Jersey, where the term transistor was coined for such devices, and where three researchers were awarded the Nobel Prize in Physics for creating the first one.
At that time, AT&T, the company that owned Bell Labs, was using millions of mechanical relays and vacuum tubes in the switching offices of its vast and growing telephone network, and it was eager to replace them all with transistors, which were smaller, faster, and cheaper, and which consumed less power. The same applied to the fledgling computing community. The semiconductor industry mushroomed, and before long hundreds, then thousands, and eventually even billions of tiny transistors were being bundled into specialized circuits known as integrated circuits, or simply, chips. Now, 70 years after Walter Brattain painstakingly fabricated the first transistor under a microscope, we are awash in transistor-based contrivances of every kind, and they have become integrated into the very sinews of our society, reason enough to select the transistor as the fifth landmark of technology.
Looking back on these five technological milestones, it is evident that the intervals between successive markers are getting shorter and that people have less and less time to adjust to new technologies. The smartphone is only about a decade old, yet it has affected human behavior profoundly and has become, for many people, an almost integral part of the self. Other imponderable innovations, such as artificial intelligence, are looming on the horizon, and, for better or worse, we will have to learn to live with them, too.
Josef Eisinger, a physicist and molecular biologist, is professor emeritus at the Mount Sinai School of Medicine in New York. Born in Vienna, he is the author of over 150 scholarly articles on subjects ranging from nuclear physics to the history of science. A long-time Village resident, he is also the author of Einstein on the Road, which is based on Einstein’s candid travel diaries, and Einstein at Home, which draws on the recollections of the Einstein family’s housekeeper in Berlin (Prometheus Books, 2011 and 2016), as well as his own memoir, Flight and Refuge: Reminiscences of a Motley Youth (Amazon, 2016).