Self-driving cars don’t have accidents, people do

A man in B.C. was caught sleeping behind the wheel of a Tesla on a busy highway while the car drove itself at 140 km/h. Many may criticize Tesla, blaming its advanced technology for causing accidents, but the real problem is drivers' misuse of the car's advanced features.

Tesla's cars offer two levels of driver assistance: Autopilot and Full Self-Driving.

Autopilot is an advanced driver-assistance system designed for safety and convenience. It lets the driver sit back, relax, and lean on the technology.

The problem is that too many drivers are sitting back, relaxing, and not paying attention at all. The feature is made to add comfort while driving, not to let the car drive entirely by itself.

People need to think before behaving this way.

According to an article in Forbes, most Tesla accidents happen because drivers fall asleep at the wheel. Not sleeping at the wheel is basic common sense, and it is being ignored here completely.

While Autopilot is engaged, Tesla requires the driver to apply light steering force to the wheel, or otherwise signal to the system that they are awake, and the car keeps running smoothly. But if the system senses the driver is not responsive, it brings the car to a stop.

If the car does not detect the driver's hands on the wheel for 30 seconds, it gives a visual warning.

The longer the driver goes without responding, the brighter the on-screen warning becomes. After about 45 seconds, the car adds an audible warning and begins slowly bringing itself to a stop in the middle of the road.
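The escalation described above can be pictured as a simple timer-driven state machine. The sketch below is purely illustrative, not Tesla's actual software; the 30-second and 45-second thresholds are the rough figures from this article.

```python
# Illustrative sketch of the hands-on-wheel warning escalation
# described in the article. Not Tesla's real implementation;
# the thresholds are the article's approximate figures.

VISUAL_WARNING_AT = 30   # seconds without detected steering input
AUDIBLE_WARNING_AT = 45  # seconds; car also begins slowing to a stop


def autopilot_response(seconds_hands_off: float) -> str:
    """Return the system's action for a given hands-off duration."""
    if seconds_hands_off < VISUAL_WARNING_AT:
        return "normal driving"
    if seconds_hands_off < AUDIBLE_WARNING_AT:
        return "visual warning"
    return "audible warning + gradual stop"
```

In other words, the system does not punish a momentary lapse; it escalates in stages, and only if the driver stays unresponsive does the car take itself out of traffic.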

Autopilot is used mainly on highways, where the roads and driving conditions are simple enough for the AI system to interpret. Tesla says Autopilot driving is nine times safer than average driving, and that it gives every buyer proper instructions on how the feature should be used. The feature is made to let the driver relax a bit, not to hand complete driving responsibility to the AI.

In a 2016 video from the Netherlands, a man shared footage of his Tesla stopping on its own just before two cars ahead collided. The car appeared to sense the SUV in front of it and braked at just the right moment.

In June this year, another such incident happened: a driver was saved from colliding with a deer.

The Tesla's dashcam footage showed the Model 3 making a quick stop while in Autopilot mode, sparing the deer. Even though the deer was an animal, the AI registered it as a pedestrian or obstacle and stopped immediately.

It is people who cause these accidents and not the technology.

Falling asleep at the wheel while the AI handles the driving is not what a self-driving car is made for.

Common sense is really missing for some people out there.
