The Nissan Leaf is cruising at 35 mph when a pedestrian jumps into the roadway.
But there's no one at the controls. Instead, radar, lasers and cameras recognize the pedestrian (actually a dummy shoved into the road by an engineer). Computers order the car to slam on the brakes and swerve, avoiding a collision.
The recent demonstration, at a former military base in Irvine, Calif., underscored just how far automakers have come in developing cars that drive themselves. Car companies including Nissan, General Motors and Mercedes have logged thousands of miles of successful tests, with an eye toward selling autonomous vehicles by 2020.
Nissan's test provided a vivid display of what's already possible: The Leaf dropped an occupant at the store, then drove itself down a parking row, stopped for an SUV driven by a human, and backed into a space.
But the technology is just one of many challenges. Convincing consumers, regulators, insurers and lawyers that autonomous vehicles are safe, and determining who pays when they crash, could wrap their future in a Gordian knot.
"It is uncharted waters," said James Yukevich, a Los Angeles attorney who defends the auto industry from product liability lawsuits. "I don't think this is an area very many people have thought much about."
Coddled by robotic chauffeurs, would people retain the driving skills to take over in emergencies? Who would be liable if an autopiloted car ran through a crowd of pedestrians: the owner or the automaker? Would insurance premiums go up or down? Would cyberterrorists figure out how to make Fords blast through school zones at 100 mph?
Are human drivers really ready to give up control?
Such thorny questions cast doubt on automakers' ambitious timelines, said Bryan Reimer, a scientist and transportation expert at MIT.
"Humans can deal relatively well with humans making mistakes, but we don't deal as well with robots making mistakes," Reimer said. "How many of us are willing to get on an airplane with no pilot, even though half the time the pilots are just sitting around watching the automation?"
Must be fail-safe
It may seem inevitable that machines will one day pilot cars more safely than humans. But that will have to be proved beyond doubt before legislators and regulators give them free rein. "The engineering will have to be fail-safe," said David Strickland, administrator of the National Highway Traffic Safety Administration.
For now, just three states (California, Florida and Nevada) allow self-driving cars on the road, and only for testing. Six states have rejected testing, and seven others are considering regulations, according to the Center for Internet and Society at Stanford University.
California has directed its Department of Motor Vehicles to craft regulations by the start of 2015, said Bernard Soriano, deputy director of the agency. The state is working with the NHTSA, the highway patrol and the state's departments of insurance and transportation to figure out the regulations.
Automakers are moving in stages toward fully autonomous cars. They started more than a decade ago with features such as electronic stability control, which assists with braking to help drivers control the car.
Some 2014 Mercedes-Benz and Acura models combine adaptive cruise control, which keeps a vehicle at a safe distance from cars ahead, with lane keeping, which automatically adjusts steering. Another system slams on the brakes before an impending crash.
The next level of development: cars that assume full control under favorable traffic or weather conditions. These are the self-driving vehicles Nissan expects to sell by 2020, said Maarten Sierhuis, director of the automaker's Silicon Valley research center. The NHTSA is working on a four-year timetable to issue regulations for such cars.
All this leads to the final frontier: vehicles that can operate completely on autopilot, even without passengers, as in automated taxis or delivery vans.
Handling unexpected conditions
The challenge for automakers will be programming cars to navigate complicated and unexpected conditions, Sierhuis said.
"It is always the outliers, the very complex traffic situations that are hard to imagine and hard to test, that are difficult," he said.
Sierhuis plans to test the autonomous Leaf at a section of road near his Sunnyvale, Calif., office that he calls "the monster": a high-volume road with a highway crossing and three intersections in quick succession, where human drivers cause an accident a week.
As it formulates rules, California's DMV is looking for guidance from regulations governing robotic surgery and the potential use of commercial aerial drones, which are seeing limited use for photography in Canada.
Beyond regulatory hurdles, car companies could be taking on a huge liability in selling robotic cars.
"If the driver has no control of the vehicle at all, how is it possible for that person to be negligent?" Yukevich said. "You can sit there and read the newspaper. If there is an accident, you can't be at fault."