Combating Risk on the Wheels

We, as individuals, love nothing more than perfection. In fact, some of us spend our entire lives trying to reach that point. Yet even though we are such admirers of perfection, this is a pursuit where failures outnumber successes by a country mile. Hence, those of us who fall short pivot to the classic trial-and-error method, letting our experiences strengthen our bid for the next step. The lack of a defined target might make our progress look scattered, but it also removes every restrictive boundary in play, keeping the possibility of something better alive. If you think that's what powered us to a groundbreaking creation like technology, you are right. While the concept of technology seems too resolute to be fragmented in any way, the truth is we would not have reached the heights we did without the lessons drawn from every failed attempt. That experience-driven, calculated approach remains part of the tech fabric, and at the moment, it's the only solace for a company like Tesla following some sticky recent days.

Tesla's Full Self-Driving feature is back in hot water following a recent accident involving a Tesla Model Y. The crash took place on November 3 in Brea, California, while the vehicle was driving with FSD mode active. No injuries were reported, but the event did leave the car "severely damaged." To some extent, the crash vindicates the critics who questioned the safety of the feature when it was first announced. On top of that, Tesla's decision to test its driver-assist software by putting unqualified drivers on public roads makes the situation look even bleaker.

"The Vehicle was in FSD Beta mode and while taking a left turn the car went into the wrong lane and I was hit by another driver in the lane next to my lane. the car gave an alert 1/2 way through the turn so I tried to turn the wheel to avoid it from going into the wrong lane but the car by itself took control and forced itself into the incorrect lane," stated the vehicle owner's report on the incident.

Ever since Tesla started shipping models with FSD mode, there have been recurring accounts of system malfunction, and the company's handling of the issue hasn't helped, with drivers complaining about a constant need to update the software. Prompted by the Brea accident, the U.S. National Highway Traffic Safety Administration has now ordered Tesla to provide more information about the FSD beta test, its safety score evaluation process, and the nondisclosure agreements the company was making every participant sign before moving any further.
