
What’s in a name? When it comes to trusting self-driving technology, a lot

Drivers are far too trusting of self-driving technology, making them too comfortable with distractions behind the wheel — and a major reason why seems to be the names that car companies give to their rudimentary autonomous systems.

That’s one of the main takeaways from a report released Thursday by the Insurance Institute for Highway Safety, a U.S. watchdog agency dedicated to making driving safer.

The report is based on data from a telephone poll commissioned by the group last fall, in which 2,005 American drivers were asked whether they thought certain behaviours would be acceptable behind the wheel if the driver were in a car equipped with different types of self-driving technology.

Crucially, respondents weren’t told what the systems did or what company produced them; rather, they were just told the following driver-assist systems were in place: Autopilot, Traffic Jam Assist, Super Cruise, Driving Assistant Plus and ProPilot Assist.

All of those systems are currently in place on cars sold in the U.S. and all have similar capabilities in terms of adjusting speed, maintaining safe following distances, and some level of automatic steering.

They are also all considered Level 2 systems on the industry's six-level grading scale, which runs from 0 to 5: Level 0 is a car with no assistance at all, Level 1 covers routine features such as cruise control, and Level 5 is a fully autonomous vehicle.

Autopilot is Tesla’s system, Traffic Jam Assist is installed on some Audi and Acura vehicles, Super Cruise is from Cadillac, Driving Assistant Plus is for BMWs and ProPilot Assist is a Nissan product.

Kyla Jackson, a member of the Waymo early rider program, a self-driving car service, demonstrates the power button in an autonomous vehicle in Arizona. While many companies are touting self-driving technology, very few vehicles on the road today are truly capable of completely driving themselves. (Caitlin O’Hara/Bloomberg)

All of the systems require drivers to stay fully alert and engaged while behind the wheel, yet according to survey respondents, each gives drivers a licence to misbehave.

None more so than Tesla’s Autopilot. Almost half of the drivers polled by the IIHS said it would be OK for a driver to take their hands off the steering wheel if Autopilot is engaged.

More than one-third said they’d take their feet off the pedals, and about the same number of respondents would allow themselves to look out the window or talk on a cellphone with the system deployed. Almost one in 10 said it’s OK to read a book or watch a movie.

More than one in 20 said they would go as far as having a nap while in the driver’s seat of a Tesla on Autopilot.

All such behaviour is strictly forbidden while the system is in place, according to Tesla, as it is with all the systems tested in the survey. But Tesla’s Autopilot elicited the most lax behaviour, with the IIHS suggesting a big reason for that is the name of the system.

The term “autopilot” comes from aviation, where autopilot systems reduce workloads for pilots on long-haul trips by automating as many routine tasks as possible. Federal Aviation Administration (FAA) rules mandate that a pilot must be able to take over from an autopilot system at any moment, but the term has come to mean something quite different for civilians.

“Even though autopilots in no way replace human pilots, that is exactly the connotation the term autopilot typically brings to mind,” the IIHS report said.

“While a name alone cannot properly instruct drivers on how to use a system, it is a piece of information and must be considered so that drivers are not misled about the correct usage of these systems.”

While the IIHS report noted that all of the self-driving systems mentioned prompted a disturbing number of respondents to say certain behaviours were OK, Tesla's system scored the highest for every action, which is why the company features so prominently in the report.

Indeed, you don't have to look very hard to find a Tesla driver handing the Autopilot system ill-advised responsibility.

Social media is replete with videos of Tesla drivers who feel comfortable enough in their cars to take a nap, eat a meal or engage in many other dangerous behaviours.

In March, a Tesla driver in Florida was killed when his Model 3 crashed into the side of a tractor-trailer. A preliminary investigation from the National Transportation Safety Board found that the driver’s hands were not on the wheel when his vehicle went completely under the truck, and the Autopilot system was deployed.

The same was true in a 2018 crash in California in which a Tesla SUV hit a concrete lane divider, and a 2016 crash in Florida in which an Ohio man was killed.

“Tesla’s user manual says clearly that the Autopilot’s steering function is a ‘hands-on feature,’ but that message clearly hasn’t reached everybody,” said IIHS president David Harkey.

“Manufacturers should consider what message the names of their systems send to people.”

Article source: https://www.cbc.ca/news/business/iihs-self-driving-systems-1.5181302?cmp=rss
