Opinion

Two cheers for Alexa and Siri. May the revolution begin.

Toyota Motors’ communication robot Kirobo Mini is equipped with artificial intelligence and a built-in camera. The robot is capable of recognizing the face of the person speaking to it and responding in unscripted conversation, or even starting a chat. AFP/Getty Images

There are many signs that in 2018 artificial intelligence—AI—came of age. To be sure, seasoned techies might yawn and say, “We’ve heard that tune before.” But between all the talk of autonomous vehicles, of 3D printing, and of robots taking our jobs, things may in fact be different this time.

One “soft” indicator that this may be true: according to a University of Maryland sociologist, the popularity of the name Alexa has crashed since the Amazon virtual assistant was introduced. After all, who wants to name their kid after a natural-language interface? In case anyone is wondering, the name Siri, never very popular to begin with, has also plummeted.

Moreover, it is likely that the digital versions of Alexa and Siri are about to become much more prominent. According to a piece in the November 2018 issue of The Atlantic, the British tech consultancy Ovum predicts that by 2021 “there will be almost as many voice-activated assistants on the planet as people.”

Closer to home, another indicator that this might in fact be the year of AI can be seen in the robust response to a free community program on AI recently hosted in Chapel Hill by the UNC-Chapel Hill General Alumni Association. The speakers—a computer scientist, a legal expert, a philosopher, and an economic historian—spoke in a packed auditorium to an enthusiastic audience made up of a broad mix of people. As the economic historian in the house, I was brought in, I think, to make the audience aware that we have not just survived earlier “disruptive technologies” but thrived as a result of them.

To be sure, AI is kind of scary, especially when considered part of the so-called 4th Industrial Revolution (4th IR). This revolution—itself an outgrowth of the “3rd Industrial Revolution” (computers, semiconductors, digitalization, etc.)—includes a variety of converging technologies in a number of fields, most notably robotics, machine learning, quantum computing, 3D printing, nanotechnology, biotech, autonomous vehicles, and, of course, AI. Indeed, the profound and potentially deleterious economic and ethical implications of AI and the 4th IR have led some very distinguished people—the late Stephen Hawking, and the very much alive Bill Gates and Elon Musk among them—to pessimistic, even dire conclusions about AI going forward.

Although I appreciate their concerns, human life has repeatedly improved as a result of “disruptive technologies” that evoked anxiety and/or social upheaval early on. Here, think of the systematic employment of fire, the “invention” of agriculture, the wheel, gunpowder, the steam engine, mechanized factories, railroads, electricity, the internal combustion engine, the jet engine, and computers. Some of these technologies were thought to be dangerous, others to be job killers, but in the end all proved safe, engendered productivity gains that bettered human living standards, and led to the creation of more jobs.

Interestingly, it is often only in hindsight that we realize which technologies are the most important, if not necessarily the most disruptive to human beings. Economist Robert Gordon has argued convincingly that public sanitation—clean piped water and sewage systems—may have been the most important source of rising American living standards in the first half of the twentieth century.

And the renowned historian of technology Vaclav Smil makes the provocative but credible claim that the Haber-Bosch process (1910) for synthesizing ammonia from atmospheric nitrogen was the most important technological innovation of the entire 20th century. Why? Because ammonia, when used to make synthetic fertilizer (rather than explosives), was largely responsible for the revolutionary gains in agricultural yields that allowed us to feed a world population that grew from 1.6 billion in 1900 to over 6 billion by 2000. Who knew?

So, I’m cautiously optimistic about AI and the 4th IR. In other words, two cheers for Alexa and Siri. And three cheers for the General Alumni Association, the other panelists on the recent program, and the assembled audience, who together showed what community education at its best is all about.

Peter A. Coclanis is Albert R. Newsome Distinguished Professor of History and Director of the Global Research Institute at UNC-Chapel Hill.