Combating Risk on the Wheels

We, as individuals, love nothing more than perfection. In fact, some of us spend our entire lives trying to reach that point. Yet even though we are such admirers of perfection, it is a pursuit where failures outnumber successes by a country mile. Hence, those of us who fall short pivot to the classic trial-and-error method, letting our experiences strengthen our bids for the next step. The lack of a defined target might make our progress look scattered, but it also removes every restrictive boundary in play, keeping the possibility of something better alive. If you think that's what powered us to a groundbreaking creation like technology, you are right. While the concept of technology may seem too resolute to be fragmented in any way, the truth is that we would never have reached the heights we did without the lessons drawn from every failed attempt. This experience-driven, calculated approach remains part of the tech fabric, and at the moment, it's the only solace for a company like Tesla following some rough recent days.

Tesla's Full Self-Driving feature is back in hot water following a recent accident involving a Tesla Model Y. The accident took place on November 3 in Brea, as the vehicle drove with FSD mode active. No injuries were reported, but the incident did leave the car "severely damaged." To some extent, the crash vindicates the critics who questioned the safety of this feature when it was first announced. On top of that, Tesla's decision to test its driver-assist feature by putting unqualified drivers on public roads makes the situation seem even bleaker.

"The Vehicle was in FSD Beta mode and while taking a left turn the car went into the wrong lane and I was hit by another driver in the lane next to my lane. the car gave an alert 1/2 way through the turn so I tried to turn the wheel to avoid it from going into the wrong lane but the car by itself took control and forced itself into the incorrect lane," reads the vehicle owner's report on the incident.

Ever since Tesla started shipping models with FSD mode, there have been recurring accounts of system malfunction. The company's handling of the situation hasn't helped either, with drivers complaining about the constant need to update the software. Prompted by the Brea accident, the U.S. National Highway Traffic Safety Administration has now ordered Tesla to provide more information about the FSD beta test, the safety score evaluation process, and the nondisclosure agreements the company was making every participant sign before moving any further.
