When we run into trouble with technology, it’s usually because we’re using it at the wrong place and time. Texting while driving is an obvious example. Texting is a wonderful way to keep in touch, but thinking we can operate machinery while we do it is a deadly mistake. And yet the desire to multitask seems hard-wired in many of us, which is why we daydream while doing routine tasks. The trick is figuring out where our limits lie.
Statistical studies can give us a good read on the dangers we run. Virginia Tech researchers looked at the results of a government-sponsored study in which 3,500 people let researchers mount cameras and other sensors inside their vehicles to track their behavior. The study included more than 35 million miles of data and followed its drivers for three years.
Here’s what we learn: Texting while driving increases crash risk sixfold. Just talking on the phone more than doubles the risk of an accident. And dialing a cellphone while driving increases the odds by a factor of 12. Virginia Tech’s research indicates that 36 percent of traffic incidents wouldn’t happen if drivers could avoid being distracted, and while high tech is only one component in distraction (alcohol obviously plays a role as well), its risks are all too apparent.
We can make things better if the tech improves, and voice-activated car phones are a step in the right direction. But we’re also in the middle of shaking out new technologies, as the recent fatal crash involving a Tesla Model S with self-driving features reminds us. I think automated cars are an inevitable trend, a way of reducing the number of fatalities – 32,675 in crashes on roadways in 2014 – and unclogging our overburdened highways.
But driver behavior is challenged as new features evolve. Numerous auto companies offer adaptive cruise control, some with automatic braking and assisted steering. In Tesla’s case, the Autopilot system was in use when Joshua Brown was driving on a highway in Florida. The car evidently did not activate braking when a white truck pulled out in front of the vehicle. Tesla’s Autopilot requires drivers to put their hands on the wheel periodically and reminds them to do so.
The Autopilot system is designed to keep a car in its lane and adjust speed to manage traffic situations. It’s not a fully automated solution, but the danger is that drivers who become distracted act as if they can rely on it. We’ll learn more as the National Highway Traffic Safety Administration investigates the crash, homing in on forward-collision warning capabilities and automated braking. We are at the juncture between truly self-driving cars and cars with automated features meant merely to assist drivers.
This is a crucial distinction. Autopilot didn’t detect the truck turning in front of the car, but its driver should have. What we have to avoid is the perception of complete automation (and hence safety), which allows some drivers to fall victim to the myriad distractions of our digital age. Tesla’s Autopilot is off by default and the company warns motorists to monitor driving conditions when it is on, but as with texting, warnings are seldom enough to prevent accidents.
We have to decide what kind of regulatory restraints these technologies require as they develop, because a distracted driver is a menace to everyone. There are places where multitasking clearly doesn’t work, and the roads are just one of them. The problem is that our daily tools from Twitter to popular new games like Pokémon Go are feeding the distraction as we try to mesh virtual encounters with an external world that is hard-edged and physical. We can figure this out as we have with previous technologies, but expect tough lessons along the way.
Paul A. Gilster is the author of several books on technology. Reach him at firstname.lastname@example.org.