The UN is worried about killer robots. We should be, too.

Editor’s note: Paul Gilster is taking a sabbatical this month. His column will return in January.

The world of consumer electronics can quickly blind us to the dangerous possibilities emerging in fields like artificial intelligence and biotechnology. Obsessed with the latest iPhone or the next generation of gaming devices, we tend to overlook the fact that any technology can have unexpected consequences. We’re seeing that story play out right now at a meeting in Geneva of a United Nations committee called the Convention on Certain Conventional Weapons.

Here, negotiators from around the planet are talking about what can happen when a weapon becomes autonomous. The idea is startling on the face of it. Even with devices like drones flying over battlefields, the decision of when, and against what target, to use a weapon is always made by human beings. The fear is that artificial intelligence could empower similar weapons systems to be programmed against specific individuals and to operate beyond human guidance.

Some are calling autonomous weaponry the “third revolution in warfare,” the first being the introduction of gunpowder, the second the spread of nuclear arms. A 2015 open letter backed by the likes of physicist Stephen Hawking and Tesla founder Elon Musk puts the issue starkly. Noting that, unlike nuclear weapons, autonomous weaponry will not require hard-to-find materials and will piggyback on existing technology advances, the letter seeks to short-circuit an arms race in this dangerous new sector before it begins.

Just what would autonomous drones look like? A new video from the Future of Life Institute, which originated the open letter, gives us an idea. The video, screened recently at the Geneva meeting, portrays the consequences of incorporating facial recognition, miniaturization and automated targeting into today’s drone designs. Acting in swarms, tiny “quadcopters” are envisioned as they seek out individual targets for assassination.

Or imagine authoritarian regimes using such methods to identify and eliminate pro-democracy movements. It’s a horrifying future, and one that could quickly devolve into chaos in the hands of rogue states or terrorists, which is why a coalition of NGOs called the Campaign to Stop Killer Robots is working to stop it. Although the film is speculative, it does no more than extend existing technologies along an all-too-familiar path of rapid miniaturization and development.

Of course, an automated weapon doesn’t have to be a drone. It could be any weapons system designed to be mobile, a replacement for today’s tanks, for example, or even for the individual infantry soldier. The countries meeting in Geneva have some precedents to work with as they ponder such a disturbing evolution. Biological weapons, for example, were renounced by presidents Johnson and Nixon as a result of similar warnings.

For that matter, runaway technology can easily cross the line between the inanimate and the animate. Consider recent news out of Harvard University, which announced the development of a “kill switch” to shut down bioengineered microbes that could potentially morph in ways that cause unintended consequences. Even a microbe developed for benign purposes could, through unchecked mutation, spread the way rabbits spread in Australia after what appeared to be their harmless introduction to that country in the 19th century. A “kill switch” could prevent that from happening.

The development of such switches would appear to be good news, but it also reminds us that biotechnology, too, presents a vector for misapplication and weaponization. Never has the question of unintended consequences loomed so large. Artificial intelligence guides our travels and dishes up our music, but we have to be sure its powers of data manipulation aren’t applied for malevolent purposes.

Agreements banning nuclear weapons from space are likewise a precedent, as are those prohibiting the use of laser weaponry to blind people, enacted by the Convention on Certain Conventional Weapons in 1995. We need a similar ban on autonomous offensive weapons. As with nuclear weapons, how to enforce it will become one of the key questions as we move to confront a potential global arms race based on weapons that act on their own.

Paul A. Gilster is the author of several books on technology. Reach him at