Wed. Nov 6th, 2024
Alert – Elon Musk saved Tesla from ruin by promising self-driving cars. The problem? 700 crashes, 19 deaths and claims of a FAKED promo video. So TOM LEONARD investigates: Is his empire speeding towards disaster?

Back in March 2019, Elon Musk was far from being the world's richest person; in fact, he feared he was heading for financial ruin.

Tesla, the electric car company that had chiefly made his fortune, faced an uncertain future, having just issued its largest recall yet after critical problems with steering on its ‘Model S’ cars were discovered.

Meanwhile, Musk’s erratic behavior had drawn the attention of regulators, and Tesla’s credit rating had been downgraded following concerns over its ability to meet production targets.

As author Walter Isaacson revealed in his recent biography of the billionaire entrepreneur, Musk couldn’t sleep with worry.

Each night, he sat awake on the edge of the bed he shared with his then-girlfriend, singer Grimes, trying to conjure up a solution.

Then one morning, Isaacson writes, Musk announced to Grimes: ‘I solved it.’

Tesla, he said, would hold an ‘Autonomy Day’ in a month’s time to show investors that the company was building an entirely self-driving car. It would be revolutionary – and surely stave off bankruptcy.


However, there was an obvious problem: drivers were already dying in fatal accidents in which they appeared to have dangerously over-estimated Tesla’s existing self-driving functions.

What’s more, Tesla staff knew that a fully self-driving vehicle remained a long way off but, Isaacson says, Musk pushed them hard to find a solution.

‘“We have to show people this is real,” he said, even though it wasn’t yet,’ writes Isaacson.

In the end, grand plans to show a car driving itself unaided around the streets of Silicon Valley had to be scaled back to a simple film demonstrating a few modest turns around Tesla HQ.

Nonetheless, Musk told investors that he was only a year away from creating a fully autonomous vehicle.

Tesla, of course, survived and – with a market value of $760 billion – is now the world’s second biggest electric car maker.

However, that question of whether some Tesla customers have been misled about their cars’ self-driving abilities hasn’t gone away. In fact, four years on, it is suddenly more pertinent than ever.

For a new investigation by the Washington Post has shockingly claimed that, since it was introduced in 2014, Tesla’s self-driving software ‘Autopilot’ has been involved in more than 700 crashes, at least 19 of them fatal. (The Post obtained these statistics from an analysis of National Highway Traffic Safety Administration data.)

As a result, Tesla currently faces 10 known lawsuits, each alleging crucial flaws in Autopilot which they say the company failed to deal with while grossly exaggerating its capabilities to the public.

In similar cases that have already come to court, Tesla lawyers have argued that the driver is ultimately in control of the vehicle and must always be paying attention.

Last month, Tesla attorney Michael Carey told an on-going California trial over the death of 37-year-old Tesla-driver Micah Lee that Autopilot is ‘basically just fancy cruise control’.

Nonetheless, this blizzard of legal action not only risks denting Tesla’s reputation and bank balance but – after his recent troubles with Twitter – also adds to the mounting questions Musk faces about his stewardship of major global brands.

One legal case – brought by the family of a 2019 crash victim – could be particularly explosive, not least because it is supported by the depositions of two Tesla engineers who claim company leadership not only knew that their software had potentially critical limitations, but did nothing to fix them.

It was early on Friday March 1, 2019 – a little after 6am and still dark.

Jeremy Banner, a 50-year-old software engineer and father of three, was driving to work when he plowed straight into and under a tractor trailer that crossed in front of him. He was killed instantly.

Banner was driving a Tesla ‘Model 3’ and had switched on its ‘Enhanced Autopilot’ system just 10 seconds before the crash.

However, that technology allegedly failed – neither noticing the huge tractor-trailer pulling out ahead nor applying the brakes in time to prevent a crash.

Tesla’s user documentation states that ‘Autopilot’ tech isn’t designed to be used on a highway with cross-traffic such as the one Banner was on, the US 441.

However, drivers can still activate it as Banner did.


On the large computer screen in front of him, a standard message would have told him to keep his hands on the wheel and ‘be prepared to take over any time’. However, crash investigators later discovered that his hands weren’t detected for the last eight seconds before impact.

Ahead of Banner, the truck began to cross the highway. The driver later admitted that he had slowed but not fully stopped at a stop sign.

Six seconds before the crash, the Tesla’s forward-facing camera captured a snap of the truck crossing the lane ahead. Yet neither Banner nor Autopilot applied the brakes.

The Washington Post investigation calculated that braking just 1.6 seconds before the collision would have been enough to avert disaster.

As the roof of Banner’s car was ripped off, the rest of his vehicle careered under the truck and kept going for another 40 seconds, traveling almost a third of a mile before coming to a final stop on a grassy verge.

The official investigation by the National Transportation Safety Board (the independent government agency that reviews such crashes) was critical of Tesla.

While it did cite both Banner’s inattention and the truck driver’s failure to stop, it also referenced Banner’s potential ‘over-reliance on automation’ and said Tesla’s design ‘permitted disengagement by the driver’.

Banner’s family go further, claiming in their lawsuit that Tesla was aware their Autopilot tech was ‘not fully tested for safety and was not designed to be used on roadways with cross-traffic or intersecting vehicles’ and yet they intentionally programmed it so that it could still be used on such roads.

Their lawyers are supporting the case with another 2016 crash, in which ex-Navy SEAL Joshua Brown died in what they say was a markedly similar accident.

Also in the Sunshine State, Brown’s Tesla ‘Model S’ failed to spot an 18-wheeler tractor-trailer driving across a highway ahead of him.

Brown, a 40-year-old former bomb dismantler in the Iraq War, also died as the roof of his car was ripped off while passing under the trailer.

In the end, federal investigators decided Tesla hadn’t been at fault, concluding that Brown had his hands off the steering wheel at least 90 percent of the time during the drive, ignoring at least seven safety warnings.


The Banners – who are seeking punitive damages from Tesla – are also relying on written depositions from two Autopilot engineers, Nicklas Gustafsson and Chris Payne.

They have both testified that Tesla’s software wasn’t designed to respond to cross-traffic. According to Payne, Autopilot was only made to be used on highways with central dividers.

Gustafsson said he actually investigated Brown’s 2016 death as part of his job and says that, despite Tesla knowing about this cross-traffic issue, ‘no changes were made’.

In response, Tesla’s lawyers say that, if Banner had been paying attention to the road, it is ‘undisputed’ that he could have avoided the crash.

They say Autopilot is safe ‘when used properly by an attentive driver who is prepared to take control at all times’.

The company adds: ‘The record does not reveal anything that went awry with Mr Banner’s vehicle, except that it, like all other automotive vehicles, was susceptible to crashing into another vehicle when that other vehicle suddenly drives directly across its path.’

Many modern cars now come equipped with some form of driver assistance such as ‘adaptive’ cruise controls, which can help vehicles maintain safe following distances or stay within speed limits.

Tesla, however, currently offers something rather more advanced.

Musk’s company has three levels of driver assistance which are available on all its models (if built after September 2014).

‘Autopilot’ is the most basic option and comes free with all Teslas.

It is essentially a sophisticated form of cruise control with abilities such as lane-centering and adapting speed to match surrounding traffic.

The system relies principally on eight cameras positioned around the vehicle that provide 360 degrees of visibility at distances of up to 250 meters – or, as Tesla’s website grandly boasts, ‘a view of the world that a driver alone cannot access, seeing in every direction simultaneously, and on wavelengths that go far beyond the human senses’.


For an extra $6,000, ‘Enhanced Autopilot’ adds more features – including changing lanes as it sees fit, self-parking, and entering and exiting highways without the driver’s input.

Finally, for $15,000, Tesla offers ‘Full Self-Driving’, its most advanced technology yet.

Tesla claims the software can recognize stop signs and traffic lights, as well as having the ability to drive itself ‘almost anywhere with minimal driver intervention’. The technology, Tesla says, ‘will continuously improve’ itself.

An accompanying video, which has been on Tesla’s website since 2016, shows just such a hands-free journey around Californian roads.

Be that as it may, Tesla adds that all its cars – no matter their level of automation – require ‘active driver supervision’, including keeping your hands on the wheel and being ‘prepared to take over at any moment’.

These features ‘do not make the vehicle autonomous’, the website adds.

That sort of user warning surely undermines claims of ‘full self-driving’ capabilities, say critics.

It also adds to wider accusations – as per the on-going crash lawsuits – that Tesla marketing may have given customers a false impression that Musk’s cars have capabilities beyond what they can actually deliver.

As for Musk – who owns some 13 percent of Tesla, accounting for a large part of his estimated $255 billion fortune – his public pronouncements about the brand’s capabilities have also come under increased scrutiny.

Over the years, he has repeatedly boasted that ‘Tesla drives itself’. And in 2016, Musk even told reporters that Autopilot was already ‘probably better than a person right now’.

However, there has been pressure on Musk to clarify that position after a senior Tesla engineer sensationally testified during another on-going crash lawsuit in January this year that the aforementioned company video for the ‘Full Self-Driving’ feature – showing a car driving itself around Californian roads, without the driver’s hands being on the wheel – had been staged.

The engineer, Ashok Elluswamy, said the drive had been faked to demonstrate capabilities – such as stopping and starting at traffic lights – that Tesla technology doesn’t yet have. At the time of publication, the video remained on the company’s website.


Separately, Tesla is also being investigated by both the National Highway Traffic Safety Administration (NHTSA) and the US Department of Justice over its safety record and claims that customers may have been misled.

Aside from the case brought by the Banners, the company also faces nine other current civil lawsuits concerning the deaths of Tesla drivers, or passengers and bystanders hit by Teslas.

In recent days, it was announced Tesla’s profits have plunged 44 percent, amid growing competition from other manufacturers.

During a downbeat conference call about the poor performance figures last week, Musk was asked about self-driving cars and admitted that ‘obviously in the past, I’ve been overly optimistic’.

So, pressure builds on pressure – and more sleepless nights surely beckon.
