The time for Elon Musk’s reckoning has finally come



For almost as long as he’s been CEO of Tesla, Elon Musk has been bullshitting us about self-driving cars.

In 2016, he said Tesla self-driving cars were “two years away.” A year later, it was “six months, definitely,” and customers would be able to actually sleep in their Tesla in “two years.” In 2018, it was still a “year away” and would be “200 percent safer” than human driving. In 2019, he said there would be “feature complete full self-driving this year.” Not a year has gone by without Musk promising the imminent arrival of a fully driverless Tesla.

This week, it’s finally here. Or at least that’s what Musk says.

On October 10th, Tesla will unveil its long-awaited “robotaxi,” a supposedly fully autonomous vehicle that Musk has said will catapult the company into trillion-dollar status. It will be some combination of “Uber and Airbnb,” Musk said during a recent earnings call, allowing Tesla owners to serve as landlords for their driverless cars as they roam the cityscape, picking up and dropping off strangers. And it will be futuristic in its design, with Bloomberg reporting that it will be a two-seater with butterfly wing doors. Musk has been calling it the “Cybercab.”

The event, which will be held on the movie lot of Warner Bros. in Burbank, California, will be the culmination of almost a decade of blown deadlines and broken promises from Musk, a moment when the richest man in the world will finally be forced to stop hiding behind his own bluster and actually show us what he’s been working on.

It’s a vulnerable time for Tesla. The company’s sales slumped in the first half of the year, as rising competition in the US and China dimmed Tesla’s star. Musk is fighting to reclaim his enormous $56 billion pay package, all while spreading misinformation on his social media platform and stumping for former president Donald Trump. And now there’s this product event, Tesla’s first since the unveiling of the Cybertruck in 2019.

Almost a decade of blown deadlines and broken promises

Based on past Tesla events, don’t expect Musk to follow through on all his promises.

It seems likely that we’ll see a cool demo of a stylish-looking prototype, allowing Musk to claim a kind of triumph on first impressions, even as the rough outlines of what he promises barely hold up to scrutiny. The exaltations of bullish investors will give him enough cover to keep making misleading declarations about what is and isn’t autonomous. And the safety experts and competitors who try to warn about the dangers of his approach will likely be drowned out or dismissed by his most ardent fans.

But either it works or it doesn’t. Waymo and others have already shown the world what real driverless technology looks like. It’s imperfect and it’s limited, but it’s undeniable. If Musk fails to deliver or shows off some apparent vaporware, his reputation — and Tesla’s stock price — could take a real hit.

“He’s really grasping at straws,” said Mary “Missy” Cummings, a robotics expert and former senior safety official at the National Highway Traffic Safety Administration. “He’s so desperate to try to drive more money into this equation that he’s doing things like this [event].”

“The hardware needed”

I first started covering Tesla for The Verge in 2016, the same year that Musk made one of his first predictions about the imminent arrival of self-driving cars. “You’ll be able to summon your car from across the country,” he said, citing as an example a Tesla owner beckoning their vehicle to drive itself from New York to meet them in Los Angeles. The company went even further in a blog post, boasting that “all Tesla vehicles produced in our factory — including Model 3 — will have the hardware needed for full self-driving capability at a safety level substantially greater than a human driver.”

That post has since been deleted from Tesla’s site, along with the company’s first “Master Plan,” as Musk attempts to scrub Tesla’s past of all his overreaching pronouncements.

“Substantially greater than a human driver”

But more importantly, these kinds of statements fooled a lot of people into thinking the shiny new electric car in their driveway had everything it needed to become fully autonomous and that those futuristic capabilities were just around the corner. Elon Musk would flip the switch and — presto — millions of cars would suddenly transform into robots. The media bought into it, portraying Tesla as being on the cusp of a historic revolution. And soon enough, the company’s stock started reflecting this attitude, especially after Tesla defied expectations with the Model 3.

Of course, none of it was true. Nearly a decade later, no Tesla vehicle on the road today is autonomous. Sure, the company has rolled out a series of brashly branded driver-assist features — first Autopilot, then Navigate on Autopilot, then Full Self-Driving, and finally Full Self-Driving (Supervised) — but they do not enable the car to drive without constant human supervision.

You can’t sleep in your Tesla. You can’t summon it across town, let alone across the country. If you crash, you will be liable for what happens and who gets hurt. And if you try to fight the company on any of that, you will most likely lose.

You can’t sleep in your Tesla

Even those Tesla owners lured into thinking their vehicles were robots in disguise would soon realize the cost of the company’s obfuscations. In 2021, Tesla first started offering subscriptions to its long-awaited Full Self-Driving feature, including a $1,500 hardware upgrade for those early owners who were wrongly informed that their vehicle would have “the hardware needed” for full autonomy. (It was later lowered to $1,000 after customer outcry.)

There are plenty of people using Full Self-Driving (Supervised) today who will happily tell you how great it is and how they can’t imagine life without it. (Many also have YouTube channels they want to promote.) They will also argue over the semantics of autonomy. Shouldn’t something that controls the acceleration, braking, steering, and navigation also get to be called autonomous?

In the absence of data from Tesla, it’s impossible to say how good or bad FSD is with any certainty. Crowd-sourced projects like FSD Community Tracker are highly limited, only featuring data on a scant 200,000 miles of driving. Tesla says over 1 billion miles have been driven using FSD. But even the tracker’s tiny snapshot of data shows 119 miles between critical disengagements. Waymo drove 17,000 miles between disengagements in 2023, according to the California DMV.
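
To put those two figures side by side, here is a minimal back-of-the-envelope sketch in Python using only the numbers cited above; it is illustrative arithmetic, not a like-for-like safety comparison, since the crowd-sourced tracker and Waymo’s DMV filings count disengagements under very different conditions.

```python
# Back-of-the-envelope comparison of the figures cited above.
# Caveat: the FSD Community Tracker is a small, self-selected sample
# (~200,000 miles) and is not directly comparable to Waymo's DMV filings.

fsd_tracker_sample_miles = 200_000       # crowd-sourced miles in the tracker
fsd_miles_per_critical_diseng = 119      # per the tracker
waymo_miles_per_diseng_2023 = 17_000     # per Waymo's California DMV report

ratio = waymo_miles_per_diseng_2023 / fsd_miles_per_critical_diseng
print(f"Waymo goes roughly {ratio:.0f}x farther between disengagements")  # ~143x

# The tracker covers only a sliver of the 1+ billion FSD miles Tesla claims,
# so the true gap could be larger or smaller than this.
```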

Illustration by Cath Virginia / The Verge | photograph by Grzegorz Wajda, Getty Images

While Tesla chased a much broader vision, Waymo leapt forward by realizing something more workable: remove the driver entirely and restrict the geography in which the vehicle can operate. Google, from which Waymo spun out in 2016, has long argued that advanced driver-assistance systems like Autopilot and FSD were inherently problematic. After all, human supervisors get bored and eventually zone out. The handoff between the vehicle and the driver can be fraught. It’s better to just cut the human out of the equation altogether.

Tesla is now latching onto Waymo’s better vision by unveiling a fully autonomous vehicle, the robotaxi. This is the vehicle that could silence all of those doubters. After all, Waymo doesn’t sell cars; it sells a service. Tesla sells cars. And wouldn’t it be infinitely cooler to own your own self-driving vehicle?

“You’re killing people”

Tesla likes to say that Autopilot — and later FSD — is saving lives. In fact, Musk has gone even further, declaring that any criticism of its driver-assistance products amounts to murder. “You need to think carefully about this,” he said in 2016, “because if, in writing some article that’s negative, you effectively dissuade people from using an autonomous vehicle, you’re killing people.”

At the same time, he said that Tesla had no plans to assume legal liability for crashes or deaths that occurred when Autopilot was in use unless it was “something endemic to our design.”

Even in the annals of Musk quotes that have aged poorly, these rank up there. At the time, only one person had died while using Autopilot — a reflection, perhaps, of the small number of Tesla vehicles on the road. Now, there are over 2 million Teslas on roads around the globe and a substantially higher number of deaths.

“Something endemic to our design”

Currently, federal regulators are investigating at least 1,000 individual Tesla crashes involving Autopilot and FSD. In those crashes, at least 44 people died. Investigators found that Autopilot — and, in some cases, FSD — was not designed to keep the driver engaged in the task of driving. Drivers would become overly complacent and lose focus. And when it came time to react, it was too late.

Tesla has pushed out many updates to FSD over the years, so it can be tough to pin down what precisely is wrong with Tesla’s approach. Often, users flag a problem — the vehicle fails to recognize certain signage or a specific driving maneuver — and almost just as quickly, Tesla has an update available. That seems like a good thing — Tesla is responsive to problems and moves quickly to fix them — until you remember that real people’s lives are at stake. And the pedestrians and cyclists outside the vehicle never consented to participating in this experiment to teach cars to drive themselves.

Even the most recent version of the FSD software has its faults. An independent research firm recently tested versions 12.5.1 and 12.5.3 for over 1,000 miles and found the software to be “surprisingly capable, while simultaneously problematic (and occasionally dangerously inept).” When errors occur, “they are occasionally sudden, dramatic, and dangerous.” In one instance, the group’s Tesla Model 3 ran a red light in the city at night even though the cameras clearly detected the lights.

FSD is the foundation for the robotaxi. Everything has been leading up to this moment. But the system struggles with basic perception issues, like wet roads and sun glare. FSD struggles to recognize motorcyclists: a 28-year-old motorcyclist was killed outside of Seattle earlier this year by a Model S driver who was using the driver-assist feature.

The system struggles with basic perception issues

Tesla used to publish quarterly safety reports that it claimed proved Autopilot was safer than regular human driving — but it stopped abruptly in 2022. It started up again this year with a new report that says there is only one crash for every 6.88 million miles of Autopilot-assisted driving, versus one for every 1.45 million miles of non-Autopilot driving. That’s over four times safer than average human driving, according to Tesla.

This is the only safety data we have for Tesla’s driver-assist technology, which is supposed to be a precursor to the fully autonomous robotaxi. But according to Noah Goodall, a civil engineer who has published several peer-reviewed studies of Tesla Autopilot, the company’s safety reports fail to take into account basic facts about traffic statistics, such as that crashes are more common on city streets and undivided roads than on highways, where Autopilot is most often used. That led him to conclude that Tesla may be miscounting crashes in order to make Autopilot seem safer than it actually is.

“They fell apart pretty quickly, once you dove in just a little bit,” Goodall told me. “I have trouble publishing on this sometimes, just because the reviewers are like, ‘Everyone knows these are fake, why are you pointing this out?’”
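
To illustrate the kind of confound Goodall is describing, here is a minimal sketch in Python; every per-road-type number in it is hypothetical, chosen only to show how a fleet that mostly drives on highways can look several times safer overall even when it is barely different on any given road type.

```python
# Hypothetical illustration of the road-mix confound Goodall describes.
# Every per-road-type figure below is invented for illustration only.

crash_rate = {  # crashes per million miles (hypothetical)
    "highway": {"autopilot": 0.10, "human": 0.12},
    "city":    {"autopilot": 0.80, "human": 0.90},
}

mileage_mix = {  # share of each group's total miles (hypothetical)
    "autopilot": {"highway": 0.90, "city": 0.10},  # Autopilot skews to highways
    "human":     {"highway": 0.40, "city": 0.60},  # overall driving does not
}

def overall_rate(driver: str) -> float:
    """Mileage-weighted crashes per million miles for one group."""
    return sum(mileage_mix[driver][road] * crash_rate[road][driver]
               for road in crash_rate)

human, autopilot = overall_rate("human"), overall_rate("autopilot")
print(f"unadjusted ratio: {human / autopilot:.1f}x")  # ~3.5x, looks much safer

for road in crash_rate:  # compare like with like and the gap nearly vanishes
    print(f"{road}: {crash_rate[road]['human'] / crash_rate[road]['autopilot']:.2f}x")
```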

Image: Cath Virginia / The Verge, Turbosquid

“A monumental effort”

If there’s one thing everyone can agree on, it’s that Tesla has a lot of data. With nearly 5 million drivers on the road globally, each vehicle is sending immense amounts of information back to the mothership for processing and labeling. Other companies, with only a fraction of the real-world miles, have to use simulated driving to fill in the gaps.

But the sheer volume of data that Tesla is processing is overwhelming. The company relies on a small army of data annotators who review thousands of hours of footage from Tesla owners and the company’s in-house test drivers. And according to Business Insider, those workers are pushed to move quickly through as many images and videos as they can or face disciplinary action. Accuracy is secondary to speed.

“It’s a monumental effort,” Cummings, the robotics expert, said. “People think Teslas are learning on the fly. They have no idea how wrong they are, and just how much human training it takes to actually learn anything from the terabytes of data that are being gathered.”

Tesla’s approach to driverless vehicle hardware also diverges from the rest of the industry. Musk infamously relies on a camera-only approach, in contrast to the widely used practice of relying on a “fusion” of different sensors, including radar, ultrasonic sensors, and lidar, to power autonomous driving. Musk calls lidar, in particular, a “crutch” and claims any company that relies on the laser sensor is “doomed.” Waymo’s robotaxis are adorned with large, obvious sensors, a style expressly at odds with the sleekness of Musk’s vehicles.

Of course, Tesla does use lidar on its test vehicles, but just to validate FSD. It won’t be going on any customer cars, since lidar is still too expensive. With its tens of thousands of laser points projected per second, lidar provides a critical layer of redundancy for the vehicle as well as a way to visualize the world in three dimensions.

The idea that you can introduce a fully autonomous vehicle without the full suite of sensors that powers every other AV on earth strains credulity for most experts on the technology.

“Why on earth would you want to tie one hand behind your back when you’re solving an almost impossible problem?” said Phil Koopman, an AV expert at Carnegie Mellon University. “And we know it’s going to be big bucks, so don’t skimp on the hardware.”

Image: Cath Virginia / The Verge, Turbosquid

High five

What is an autonomous car? It sounds like a simple question, but the answer is trickier than it seems. To help clear things up, SAE International, a US organization that represents automotive engineers, created a six-step guide to automation. Intended for engineers rather than the general public, it ranges from Level 0, meaning no automation whatsoever, to Level 5, meaning the vehicle can drive itself anywhere at any time without any human intervention.
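
For reference, here is a minimal Python sketch paraphrasing those six levels; the one-line summaries are simplified from the SAE J3016 definitions, and the parenthetical placements of Tesla’s and Waymo’s systems follow this article’s own characterization rather than any official designation.

```python
# Simplified paraphrase of the six SAE automation levels referenced above.
SAE_LEVELS = {
    0: "No automation: the human does all the driving.",
    1: "Driver assistance: steering or speed support, one at a time.",
    2: "Partial automation: steering and speed together, but a human must "
       "supervise at all times (where Autopilot and FSD sit today).",
    3: "Conditional automation: the system drives in limited conditions and "
       "hands control back when it asks.",
    4: "High automation: no human needed, but only within a defined area and "
       "set of conditions (Waymo's robotaxis).",
    5: "Full automation: drives anywhere, at any time, with no human at all.",
}

for level, summary in SAE_LEVELS.items():
    print(f"Level {level}: {summary}")
```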

And there’s plenty of room for error and misunderstanding. A problem we’ve seen is what researcher Liza Dixon calls “autonowashing,” or any effort to overhype something as autonomous when it’s not.

Most experts dismiss Level 5 as pure science fiction. Waymo and others operate Level 4 vehicles, but very few people truly believe that Level 5 is attainable. Level 5 would require “an astronomical amount of technological development, maintenance, and testing,” says Torc Robotics, a company developing self-driving trucks. Others call it a pipe dream.

Except Musk. At a conference in Shanghai, Musk said with supreme confidence that the company “will have the basic functionality for Level 5 autonomy complete this year.” That was in July 2020.

He’ll likely try to pass off the Tesla robotaxi as the endpoint of this achievement, the vehicle that will help usher in this wildly implausible goal. And it’s crucial to see through the bluster and bullshit and measure it against what he’s promised in the past, as well as what other players have already achieved.

Tesla’s past is littered with fanciful ideas that never panned out — like a solar-powered Supercharger network, battery swapping, or robotic snake-style chargers. But Musk never bet his whole company, his reputation, and, most importantly, his net worth on those projects. This one is different. And soon enough, we’ll know whether the Tesla robotaxi is the exception to the rule or just another guy dancing in a robot costume.




