As self-driving cars become more widely accepted, experts warn: They aren’t fail-safe


The idea of self-driving cars is a concept that more Americans are coming to accept. According to an annual survey by the American Automobile Association (AAA), 63 percent of American drivers would be afraid to ride in a self-driving car — down from around 78 percent last year. Despite this growing acceptance of autonomous vehicles, experts caution that we shouldn’t be totally comfortable with them just yet. In fact, becoming too comfortable with self-driving cars may be a huge problem.

Every batch of autonomous vehicles comes outfitted with new capabilities. From obstacle detection to lane centering, there’s little doubt that they’re growing in sophistication. However, self-driving cars have only come so far and still have a long way to go — a fact that not every driver keeps in mind.

“People are already inclined to be distracted. We’re on our phones, eating burgers, driving with our knees. Additional autonomy gives people a sense that something else is in control, and we have a tendency to overestimate the technology’s capabilities,” said Nidhi Kalra, a senior information scientist at the RAND Corporation. (Related: Safety not a major concern as House gives self driving car bill the fast lane)

Just look at the recent accident involving a Tesla Model S and a parked fire truck. As per FoxNews.com, the incident occurred in Culver City, Calif., on Interstate 405. The Tesla reportedly smashed into the fire truck, which was on the freeway in response to an earlier accident. Culver City Fire Battalion Chief Ken Powell reported that the car sustained significant damage but that the driver appeared to have no serious injuries. The driver of the Tesla claimed the car was operating in “autopilot” mode at the time of the crash. The National Transportation Safety Board (NTSB) is now looking into the incident.

When approached for comment on the investigation, Tesla remained mum on the matter. The company also refused to state whether or not the autopilot was indeed functional at the time of the crash. However, Tesla did add that drivers must still remain attentive and keep their hands on the steering wheel while the autopilot is engaged.

This is the newest in a spate of vehicular accidents involving people becoming overly reliant on self-driving cars, and it certainly won’t be the last. In response to these occurrences, automakers themselves have taken extra steps to ensure that distracted driving is kept to a minimum.

General Motors (GM), for instance, developed eye-tracking technology to make sure that drivers have their eyes on the road. Tesla itself equipped its cars with a feature that would lock out drivers from Autopilot if they continuously ignored the warnings to grip the steering wheel. Other suggested features include adaptive cruise control and automatic emergency braking, both of which have garnered immense public support.

Will this be enough? Only time will tell. If anything, these features show that human input is still necessary and that self-driving cars are far from fully autonomous. So don’t get used to sleeping in the car just yet, no matter what Elon Musk may tell you.

Go to Inventions.news for more news stories about up-and-coming technologies, self-driving cars included.


Sources include:

DailyMail.co.uk

FoxNews.com

HoustonChronicle.com
