Safer with or without a steering wheel in autonomous vehicles?
Taking manual control in critical situations is dangerous. The car has to win passengers' confidence, say Norwegian researchers.
Recent Tesla crashes while in autopilot mode are putting consumer reactions to the test. This is happening just as Americans have started to become more comfortable with the idea of letting technology take over the wheel.
The accidents are raising an important discussion about self-driving cars: should the driver still be able to take over the controls or is that too dangerous?
“Imagine that you run into fog and poor traction. The idea that you suddenly take control of the vehicle when it’s about to spin - it's absurd,” Martin Steinert tells forskning.no.
Steinert is a professor at NTNU and directs the multi-disciplinary research group TrollLabs. Among other projects, the group is researching technology in autonomous vehicles and ships.
“Just think about it! How hard should you brake? You’d need a few seconds,” says Steinert.
Driver or passenger?
Two seconds in a critical situation at high speed is too long. One tragic example is the fatal accident in Florida where a Tesla on autopilot collided with a trailer.
Should drivers continue as drivers or become passengers instead?
Tesla has chosen the first option. Drivers still need to keep their hands on the wheel, or the car turns on the hazard lights, slows down and stops.
The control system is a tool - a kind of extended cruise control. The driver is still needed.
Google has chosen the opposite route, building its cars with no steering wheel. Steinert supports this option.
“Are you able to perform any better than your computer? Probably not,” he says.
Steinert gives examples from his time at Stanford. Chris Gerdes, among other researchers, worked with autonomous cars on the rally track there for many years.
Already back in 2008 the student car P1 had mastered drifting in tight circles on the loose, gravelly surface.
“P1 is able to keep the skid in check. It’s unbelievable. A human driver couldn’t do that,” says Steinert.
Later, Shelley, a converted Audi TT, showed that a computer is as good on the racetrack as the best rally driver.
Video from Stanford University. Chris Gerdes tells and shows footage of the student car P1 and Shelley on the racetrack.
Trusting Tesla owners
But public confidence is won in chaotic everyday traffic, not on the racetrack.
Videos on the Internet suggest that maybe that trust has already gone too far.
See Motorists Play, Read and Relax In Self-Driving Cars As Second Tesla Crashes
People read, ate or slept at the wheel of their Tesla on autopilot — and some even moved over to the passenger seat. With Tesla's latest updates this is no longer possible. Now drivers need to keep their hands on the wheel.
Will the accident in Florida make people more cautious? Maybe. But for every foolhardy Tesla owner there are many fearful passengers. What will get them into a car without a driver – or a steering wheel?
Steinert and his colleagues are working on the answer. In 2014 and 2015 they built a prototype of an autonomous car that shows what it sees – and what it plans to do.
TrollLabs at NTNU, Stanford’s Center for Design Research and Renault collaborated on the project.
LIDAR and lighting strip
The Renault vehicle was equipped with a wide lighting strip under and along the sides of the dashboard. When an obstacle appeared in front of or beside the car, the light strip glowed white on the same side.
“The light strip wasn’t providing exact information. It was only making you aware that the car detected something,” says Steinert.
Video of the project showing the system in operation.
The obstacles were detected with a LIDAR system, which works like radar but uses laser light instead of radio waves.
The Renault in the experiment had three LIDAR instruments that provided 3D information on the lane ahead and gave signals to the light strip. A pedestrian in the road, for example, became a diffuse lit spot on the light strip in the direction of the pedestrian.
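The idea of a "diffuse lit spot in the direction of the pedestrian" can be sketched in a few lines. The actual TrollLabs/Renault software is not public, so the segment count, field of view and brightness falloff below are all assumptions for illustration:

```python
# Illustrative sketch only: maps an obstacle's bearing (from LIDAR)
# to a segment of a dashboard light strip. Segment count, field of
# view and the diffuse "glow" falloff are assumptions, not the
# project's actual implementation.

NUM_SEGMENTS = 30          # assumed number of LEDs along the strip
FIELD_OF_VIEW = 180.0      # assumed coverage in degrees (-90 = far left)

def strip_brightness(obstacle_bearing_deg, spread=2.5):
    """Return a brightness value (0..1) for each strip segment.

    The lit spot is deliberately diffuse: the article notes the strip
    only signals *that* and roughly *where* something was detected,
    not exact information.
    """
    # Convert bearing (-90..+90 degrees) to a fractional strip position.
    pos = (obstacle_bearing_deg + FIELD_OF_VIEW / 2) / FIELD_OF_VIEW
    center = pos * (NUM_SEGMENTS - 1)
    levels = []
    for i in range(NUM_SEGMENTS):
        distance = abs(i - center)
        # Simple falloff: full white at the center, fading outward.
        levels.append(max(0.0, 1.0 - distance / spread))
    return levels

# A pedestrian detected 30 degrees to the right of center lights a
# diffuse spot on the right-hand part of the strip.
levels = strip_brightness(30.0)
brightest = levels.index(max(levels))
```

The falloff parameter keeps the spot vague on purpose, matching Steinert's point that the strip makes you aware of a detection rather than pinpointing it.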
Google's autonomous car has a LIDAR array with as many as 64 detectors. It gives a more detailed picture around the vehicle, but also costs more. Tesla uses ordinary cameras for tracking rather than LIDAR technology.
The experimental car also had a footplate at pedal height. When detecting an obstacle, the autopilot raised the footplate ever so slightly.
“A human driver would lift his foot and hold it over the brake pedal to be ready. The footplate gave a friendly little jerk, as if to say, ‘Hey, I'm ready,’” explains Steinert.
Stephanie Balters used the driving simulator at Stanford University in her research. Here she is behind the wheel of the simulator. (Photo: Stephanie Balters)
Steinert and his colleagues call this an example of affective engineering. Machines need to interpret people's feelings and take them into account.
That’s easier said than done. Emotions are chaotic. How can a car — or other equipment for that matter — know if you're feeling safe or scared?
Steinert has written an article with colleague Stephanie Balters about this in the Journal of Intelligent Manufacturing.
iMotions video shows how the software is used in Stanford’s driving simulator to obtain data on drivers’ reactions.
Distinguishing emotions with infrared
The article explains how to measure emotional reactions in the human body. Strong emotions increase the heart rate and dilate our pupils.
Sweating can also indicate strong emotions. Sweat conducts current, so it shows up as lower electrical resistance in the skin.
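In electrodermal measurement this is usually reported the other way around, as skin conductance, the reciprocal of resistance. A toy calculation (the values are made up for demonstration, not taken from the study):

```python
# Illustrative only: skin conductance is the reciprocal of skin
# resistance, so more sweat (lower resistance) means higher
# conductance. The resistance values below are invented examples.

def conductance_microsiemens(resistance_ohms):
    """Convert skin resistance (ohms) to conductance (microsiemens)."""
    return 1e6 / resistance_ohms

calm = conductance_microsiemens(500_000)     # 500 kOhm of dry skin
aroused = conductance_microsiemens(200_000)  # 200 kOhm of sweaty skin
```

A sensor therefore reads a jump in conductance when the subject starts sweating, which is why the article describes the measurement as "less electrical resistance in the skin".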
The problem is that strong feelings can be positive or negative. The measurements don’t distinguish between exultation and horror, or whether emotions are caused by traffic or something you hear on the news.
To determine this difference, scientists have until recently had to interview subjects afterward.
Now they can also make use of near-infrared spectroscopy, or NIRS, a non-invasive brain imaging method.
“The skull is transparent to the infrared rays. We can see the blood flow in the brain,” says Balters.
NIRS measurements can be combined with traditional EEG readings, which are fed into computer software that builds models for detecting emotions.
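As a rough sketch of how such a model might fuse physiological signals into one "strong emotion" score (the features, baselines, weights and threshold are all invented for illustration; the actual Stanford software is not described in the article):

```python
# Minimal illustration of combining physiological features into a
# single arousal score. Weights and baselines are placeholders;
# real models are trained from labelled sensor data.

def arousal_score(heart_rate_bpm, pupil_diameter_mm, skin_conductance_us):
    # Normalise each signal against an assumed resting baseline.
    hr = (heart_rate_bpm - 60) / 40        # resting ~60 bpm
    pupil = (pupil_diameter_mm - 3) / 3    # resting ~3 mm, dilated ~6 mm
    eda = skin_conductance_us / 10         # assumed 0-10 microsiemens range
    # Weighted sum; the weights are arbitrary placeholders.
    return 0.4 * hr + 0.3 * pupil + 0.3 * eda

def is_strong_emotion(score, threshold=0.5):
    # A high score only says "strong emotion", not whether it is
    # exultation or horror - exactly the ambiguity the article raises.
    return score > threshold

relaxed = arousal_score(65, 3.2, 2.0)
startled = arousal_score(95, 5.5, 8.0)
```

Note that the score captures arousal, not valence: distinguishing positive from negative feelings is the part that until recently required interviewing subjects afterward.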
Another computer program, called iMotions, analyzes eye movements, facial expressions and other body sensors to interpret emotions better.
The driving simulator lets researchers test how the driver can handle current and future cars.
However, the reality out in chaotic traffic is far different than the controlled conditions in a simulator.
If you’re driving in the hot sun, your sweating may be unrelated to your emotional state. Or your physical movement, not your emotions, may speed up your pulse.
Another problem is that the results of one test can’t be compared with another, because “people perceive situations differently. Not everyone responds the same way when they get into a self-driving car,” says Balters.
“Some people trusted the system and were passive in the simulator. Others were more engaged,” she says.
Still in driving school
Learning to trust autonomous cars is one aspect. But autonomous cars are in training, too, and still learning to recognize dangerous situations.
As Balters explains, the system for avoiding collisions is programmed to monitor the area low in front of the car. In the Florida accident, the trailer rode too high for the system to register it as an obstacle and alert the driver to take over.
Once autonomous cars actually manage to pass the driving test in all situations at some point in the future, how will they show passengers that they can relax - even if the steering wheel is gone?
Maybe it will be with light signals and tilting of the footplate, as in the Renault experiment. Or perhaps the car will talk to the passenger.
According to a paper by Steinert and colleagues in the Journal of Interactive Design and Manufacturing, people preferred it when the car explained why it did something, not just what it was doing.
“The car has to learn how each individual reacts. Each person responds differently. It’s not the sensors that are hard to understand, but interpreting them,” says Balters.
Human beings also have to be involved in building this trust.
“You need to learn the car’s behaviour and get used to it. People have to learn the car’s psychology and the car needs to learn the psychology and physiology — i.e. the body’s reactions — of human beings,” says Balters.