Do You Know How Well Your Car Drives Itself?


Neither your dog nor your car is capable of driving. Do not try to imitate this, unless your attempt involves as much Photoshop as ours.

Michael Simari | Car and Driver

From the September 2021 issue of Car and Driver.

You don’t have to go deep down an internet rabbit hole to find evidence that humans will push boundaries.

The relatively recent introduction of semi-autonomous technology in cars has led to all sorts of documented bad behavior, from folks putting water bottles on their steering wheel to drivers letting Jesus take the wheel as they climb into another seat. The former can trick a car into thinking a driver’s hands are where they should be; the latter is wildly dangerous.

When a Tesla Model S hit a tree 550 feet from its starting point in suburban Houston earlier this year, initial reports of the fiery fatal crash suggested no one was in the driver’s seat at the time. The National Transportation Safety Board has since said that security-camera footage shows the driver getting behind the wheel. But even if the ensuing (and somewhat chaotic) coverage of that incident hasn’t clarified exactly what happened, it did expose a hard truth about new automotive technologies: Many people have no idea what their cars can and can’t do. That confusion is clouding the debate about who is responsible when there’s a crash.

In Tesla’s case, the misconception that cars can drive themselves is partially egged on by the company’s CEO, Elon Musk, who has repeatedly overstated what the technology can do. But consumers are guilty of putting too much trust in even the most conservatively marketed systems, as evidenced by the number of Reddit threads and YouTube videos showing how you can outsmart the technology.

As the industry puts more semi-autonomous tech into the hands of the American public, there is a growing need for better driver education and marketing standards that push automakers to clearly explain systems without overpromising. Solving these problems will only become more urgent as more advanced vehicles that actually can drive themselves under certain circumstances begin sharing the road and the marketplace with cars that have much less capability.

“When you tell somebody that they don’t have to be responsible, that this part of the driving task is going to happen for you, you are giving them an indication that they don’t have to pay attention,” says Sam Anthony, chief technology officer and cofounder of Perceptive Automata, a company whose software helps automated-vehicle systems understand human behavior. Anthony, who has a PhD in psychology, says drivers assume computers can act like humans, processing information as quickly as and in the same way that people do. “Neither of those is really true,” he says.

Anthony points to a crash in 2018 in San Jose, California, where a Model S heading south on the 101 slammed into the back of a stopped fire truck. The car’s radar-based cruise control didn’t register the truck because it wasn’t moving. “In human terms, it’s like if you couldn’t see the car in front of you if it stopped,” Anthony says.
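To make that failure mode concrete, here is a minimal, purely illustrative sketch of the kind of target filtering that radar-based cruise-control systems have been described as using. The names, structure, and threshold are hypothetical, not any automaker's actual code; the point is only that a return with no measured motion can be discarded as roadside clutter, so a stopped vehicle never becomes a braking target.

```python
# Hypothetical illustration only: why a radar-based cruise control might
# "not see" a stopped fire truck. Not real vendor code.
from dataclasses import dataclass

@dataclass
class RadarReturn:
    distance_m: float        # range to the detected object
    ground_speed_mps: float  # object's speed over the ground

def select_targets(returns, min_speed_mps=1.0):
    """Keep only moving objects; stationary returns are treated as clutter
    (overpasses, signs, parked cars) and ignored."""
    return [r for r in returns if abs(r.ground_speed_mps) >= min_speed_mps]

scene = [
    RadarReturn(distance_m=80.0, ground_speed_mps=0.0),    # stopped truck ahead
    RadarReturn(distance_m=120.0, ground_speed_mps=30.0),  # moving traffic
]

# The stationary truck is filtered out, so the system never reacts to it.
print(select_targets(scene))
```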

“The artificial intelligence in cars isn’t actually that good,” says Gill Pratt, CEO of the Toyota Research Institute. “The reason human beings can do it so well is that we are smart, we can empathize, and we know what other people are most likely to do.” He says AI struggles to predict human behavior, which is the technology’s biggest limiting factor.

In an attempt to give drivers a clear understanding of Toyota’s advanced driver-assistance systems, the company named the suite Teammate to indicate that it is assisting the driver rather than taking over. While that may seem trivial, branding matters when it comes to public understanding.

AAA looked at the marketing terms automakers use for driver-assistance systems and found 40 different names for automated emergency braking, 20 for adaptive cruise control, and 19 for lane-keeping assist. The 2019 report claims this makes it “difficult for consumers to discern what features a vehicle has and how they actually work.” And previous research by AAA found that when a partially automated driving system’s name includes the word “pilot,” 40 percent of Americans expect the car will be able to drive itself. No one interviewed for this story wanted to comment on Tesla specifically, but in light of AAA’s findings, the company’s use of the terms “Autopilot” and “Full Self-Driving Capability” may lead people to overestimate what its cars can do.

We may be on the cusp of standardizing names. In April, the Alliance for Automotive Innovation, a trade and lobbying group for the auto industry, published guidelines for Level 2 driver-monitoring systems. The group acknowledged consumer confusion about what cars can do and the resulting complacency in and abuse of the technology. It recommended that automakers give their systems names that “reasonably reflect the functionality” and don’t “imply greater capability.”

“Some of the high-profile crashes we’ve seen where drivers weren’t appropriately engaged are eroding consumer acceptance and confidence in these systems,” says John Bozzella, president and CEO of the alliance. These measures aim to combat that.

But marketing and naming guidelines can do only so much. Automakers may eventually need to offer customers formal training. David Mindell, a professor of aeronautics and astronautics at the Massachusetts Institute of Technology and author of Our Robots, Ourselves: Robotics and the Myths of Autonomy, has watched industries like aviation and deep-sea exploration adapt to automation. Businesses in those fields understand the importance of training when new technologies are introduced. When operators don’t receive proper instruction, the results can be catastrophic. Consider the recent Boeing 737 Max crashes; a lack of pilot training contributed to those disasters.

Mindell puts it into perspective, noting that while pilots must undergo recurrent training every year, “I’ve had my driver’s license since age 16 and haven’t had a day of training since. Which is a remarkable thing when you think about how you operate complex deadly machinery, which is what cars are.”

But ultimately, people will continue doing stupid things for stupid prizes like adrenaline rushes and internet infamy. “Any safety feature sort of puts constraints on the driver or the vehicle,” says Mindell. “People will try to push those limits, even if it’s for no other reason than making YouTube videos.”

Katherine E. Ackerman
