Combating Risk on the Wheels

We, as individuals, love nothing more than perfection. In fact, some of us spend our entire lives trying to reach that point. Yet for all our admiration of perfection, this is a pursuit where failures outnumber successes by a country mile. Hence, those of us who fall short pivot to the classic trial-and-error method, allowing our experiences to strengthen our bid for the next step. The lack of a defined target might make our progress look scattered, but it also removes every restrictive boundary, keeping the possibility of something better alive at all times. If you think that is what powered us to a groundbreaking creation like technology, you are right! While the concept of technology seems too resolute to be fragmented in any way, the truth is we would not have reached the heights we did if it weren't for the lessons drawn from every failed attempt. This experience-driven, calculated approach is still part of the tech fabric, and at the moment, it is the only solace for a company like Tesla following some sticky recent days.

Tesla’s Full Self-Driving feature is back in hot water on the back of a recent accident involving a Tesla Model Y. The crash took place on November 3 in Brea, California, while the vehicle was driving with FSD mode active. No injuries were reported, but the event did leave the car “severely damaged”. To some extent, the crash vindicates the critics who questioned the safety of this feature when it was first announced. On top of that, Tesla’s decision to test its driver-assist feature by putting unqualified drivers on public roads makes the situation look even bleaker.

“The Vehicle was in FSD Beta mode and while taking a left turn the car went into the wrong lane and I was hit by another driver in the lane next to my lane. the car gave an alert 1/2 way through the turn so I tried to turn the wheel to avoid it from going into the wrong lane but the car by itself took control and forced itself into the incorrect lane,” reads the vehicle owner’s report on the incident.

Ever since Tesla started shipping models with FSD mode, there have been recurring accounts of the system malfunctioning. The company’s handling of it hasn’t helped matters either, with drivers complaining about a constant need to update the software. Prompted by the Brea accident, the U.S. National Highway Traffic Safety Administration has now ordered Tesla to provide more information about the FSD beta test, the safety score evaluation process, and the nondisclosure agreements the company was making every participant sign before moving any further.
