In the piece — titled “Can You Fool a Self Driving Car?” — Rober found that a Tesla car on Autopilot was fooled by a Wile E. Coyote-style wall painted to look like the road ahead of it, with the electric vehicle plowing right through it instead of stopping.
The footage was damning enough, with slow-motion clips showing the car not only crashing through the styrofoam wall but also plowing into a child-sized mannequin. The Tesla was also fooled by simulated rain and fog.
I hope some of you actually skimmed the article and got to the “disengaging” part.
As Electrek points out, Autopilot has a well-documented tendency to disengage right before a crash. Regulators have previously found that the advanced driver assistance software shuts off a fraction of a second before making impact.
It’s a highly questionable approach that has raised concerns over Tesla trying to evade guilt by automatically turning off any possibly incriminating driver assistance features before a crash.
Don’t get me wrong, Autopilot turning itself off right before a crash is sus, and I wouldn’t put it past Tesla to do something like that (I mean, come on, why don’t they use lidar?), but maybe it’s so the car doesn’t try to power the wheels or something after impact, which could potentially worsen the event.
On the other hand, they’re POS cars and the autopilot probably just shuts off cause of poor assembly, standards, and design resulting from cutting corners.
Rober seems to think so, since he says in the video that it’s likely disengaging because the parking sensors detect that it’s parked (because of the object in front) and it shuts off the cruise control.
I see your point, and it makes sense, but I would be very surprised if Tesla did this. I think the best option would be to turn off the features once an impact is detected. It shutting off beforehand feels like a cheap ploy to avoid guilt.
Wouldn’t it make more sense for autopilot to brake and try to stop the car instead of just turning off and letting the car roll? If it’s certain enough that there will be an accident, just applying the brakes until there’s user override would make much more sense…
Normal cars do whatever is in their power to cease movement while facing upright. In a wreck, the safest state for a car is to cease moving.
Yeah but that’s milliseconds. Ergo, the crash was already going to happen.
In any case, the problem with Tesla autopilot is that it doesn’t have radar. It can’t see objects and there have been many instances where a Tesla crashed into a large visible object.
That’s what’s confusing me. Rober’s hypothesis is without lidar the Tesla couldn’t detect the wall. But to claim that autopilot shut itself off before impact means that the Tesla detected the wall and decided impact was imminent, which disproves his point.
If you watch the in-car footage, Autopilot is on for all of three seconds, and by the time it’s on, impact was already going to happen. That said, Teslas should have lidar and probably do something other than disengage before hitting the wall, but I suspect their cameras were good enough to detect the wall through lack of parallax or something like that.
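The parallax point can be made concrete. As a camera drives forward, each image feature slides outward from the center at a rate inversely proportional to its true depth, so a flat wall painted to look distant moves wrong. Here's a toy pinhole-camera sketch of that idea (all numbers illustrative, function name mine):

```python
# Toy model: for a camera translating along its optical axis,
# a feature r pixels from the focus of expansion flows outward
# at dr/dt = r * v / Z (v = speed, Z = true depth).
def radial_flow(offset_px: float, depth_m: float, speed_m_s: float) -> float:
    """Outward image motion in px/s for a feature at the given depth."""
    return offset_px * speed_m_s / depth_m

# Real road: a feature that looks 50 m away really is 50 m away.
print(radial_flow(100, 50, 20))   # slow outward drift (~40 px/s)
# Painted wall 10 m ahead pretending to be 50 m away:
print(radial_flow(100, 10, 20))   # five times faster (~200 px/s)
```

If the cameras track optical flow at all, the painted “horizon” rushing outward at wall-distance speed is exactly the cue meant by “lack of parallax.”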
I’ve heard that too, and I don’t doubt it, but watching Mark Rober’s video, it seems like he’s deathgripping the wheel pretty hard before the impact, which seems more likely to be what’s disengaging it. Each time, you can see the wheel tug slightly to the left, but his deathgrip pulls it back to the right.
E. Lon Musk. Supah. Geenius.
MEEP MEEP
My $500 robot vacuum has LiDAR; meanwhile these $50k pieces of shit don’t 😂
Holy shit, I knew I’d heard this word before. My Chinese robot vacuum cleaner has more technology than a tesla hahahahaha
Vacuum doesn’t run outdoors and accidentally running into a wall doesn’t generate lawsuits.
But, yes, any self-driving car should absolutely be required to have lidar. I don’t think you could find any professional in the field who would argue that lidar isn’t the proper tool for this.
…what is your point here, exactly? The stakes might be lower for a vacuum cleaner, sure, but lidar - or a similar time-of-flight system - is the only consistent way of mapping environmental geometry. It doesn’t matter if that’s a dining room full of tables and chairs, or a pedestrian crossing full of children.
I think you’re suffering from not knowing what you don’t know.
Let me make it a bit clearer so you can give a fair answer.
Take a 0.25 mW lidar sensor off a vacuum, take it outdoors, and scan an intersection.
Will that laser be visible to the sensor?
Is it spinning fast enough to track a kid moving into an intersection when you’re traveling at 73 feet per second?
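To put rough numbers on that question (the scan rates below are my assumptions, not measured specs):

```python
# How far does the car travel between two full lidar sweeps?
def distance_per_scan(speed_ft_s: float, scan_hz: float) -> float:
    """Feet traveled per complete lidar revolution."""
    return speed_ft_s / scan_hz

# Robot-vacuum lidars spin around 5 Hz; automotive units closer to 20 Hz.
print(distance_per_scan(73, 5))   # ~14.6 ft of blind travel per sweep
print(distance_per_scan(73, 20))  # ~3.7 ft per sweep
```

Nearly 15 feet of travel between updates at 73 ft/s is a long time to not see a kid; that, plus range and sunlight rejection, is the gap between the two sensor classes.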
You’re mischaracterizing their point. Nobody is saying take the exact piece of equipment, put it in the vehicle and PRESTO. That’d be like asking why the vacuum battery can’t power the car. Because duh.
The point is if such a novelty, inconsequential item that doesn’t have any kind of life safety requirements can employ a class of technology that would prevent adverse effects, why the fuck doesn’t the vehicle? This is a design flaw of Teslas, pure and simple.
But they do, there are literally cars out there with lidar sensors.
The question was why can’t I have a lidar sensor on my car if my $150 vacuum has one. The lidar sensor for a car is more than $150.
You don’t have one because they’re expensive at that size and update frequency. Sensors capable of outdoor mapping at high speed cost the price of a small car.
The manufacturers suspect, probably rightly, that people don’t want to pay an extra 10-30 grand for an array of sensors.
The technology readily exists; Rober had one in his video that he used to scan a roller coaster. It’s not some conspiracy that you don’t have it on cars, and it’s not like it can’t be done, because Waymo does it all the time.
There’s a reason Waymo doesn’t use smaller sensors: they use the minimum of what works well. Which is expensive, and people looking at a mid-range car don’t want to take on the extra cost, hence it’s not available.
Only Tesla does not use radar with their control systems. Every single other manufacturer uses radar control mixed with the camera system. The Tesla system is garbage.
The self driving system uber was working on also went downhill after they went full visual only.
yeah, you’d think they’d at least use radar. That’s cheap AF. It’s like someone there said I have this hill to die on, I bet we can do it all with cameras.
10 - 30 grand
Decent LIDAR sensors have gotten a lot cheaper in the last 5 years or so, here’s one that is used in commercial self-driving taxis: https://www.alibaba.com/product-detail/X01-36020021-Nev-Auto-Parts-for_1601252480285.html
Shit, that’s pretty decent. That looks like a ready-fit car part; I wonder what vehicle it’s for. Kind of sucks that it only faces one direction, but at that price four of them would not be a big deal.
I think you’re suffering from not knowing what you don’t know.
and I think you’re suffering from being an arrogant sack of dicks who doesn’t like being called out on their poor communication skills and, through either a lack of self-awareness or an unwarranted overabundance of self-confidence, projects their own flaws on others. But for the more receptive types who want to learn more, here’s Syed Saad ul Hassan’s very well-written 2022 paper on practical applications, titled Lidar Sensor in Autonomous Vehicles, which I found also serves as a neat primer on lidar in general.
Well look at you, being an adult and using big words instead of just insulting people. Not even going to waste time on people like you; I’m going to block you and move on and hope that everyone else does the same, so you can sit in your own quiet little world wondering why no one likes you.
You’re an idiot.
jesus man, how many alts do you have?
Notice how they’re mad at the video and not the car, manufacturer, or the CEO. It’s a huge safety issue yet they’d rather defend a brand that obviously doesn’t even care about their safety. Like, nobody is gonna give you a medal for being loyal to a brand.
These people haven’t found any individual self identity.
An attack on the brand is an attack on them. Reminds me of the people who made Star Wars their meaning and crumbled when a certain trilogy didn’t hold up.
An attack on the brand is an attack on them.
Thus it ever is with Conservatives. They make $whatever their whole identity, and so take any critique of $whatever as a personal attack against themselves.
I blame evangelical religions’ need for martyrdom for this.
“Mark my word, if and when these preachers get control of the [Republican] party, and they’re sure trying to do so, it’s going to be a terrible damn problem. Frankly, these people frighten me. Politics and governing demand compromise. But these Christians believe they are acting in the name of God, so they can’t and won’t compromise. I know, I’ve tried to deal with them.” ― Barry Goldwater
You pretty much hit the nail on the head. These people have no identity or ability to think for themselves because they never needed either one. The church will do all your thinking for you, and anything it doesn’t cover will be handled by Fox News. Be like everyone else and fit in, otherwise… you have to start thinking for yourself. THE HORROR.
Nice variable.
Always be wary of people who are angered by facts.
Kinda depends on the fact, right? Plenty of factual things piss me off, but I’d argue I’m correct to be pissed off about them.
The styrofoam wall had a pre-cut hole to weaken it, and some people are using it as a gotcha proving the video was faked. It would be funny if it wasn’t so pathetic.
Yeah, but it’s styrofoam. You could literally run through it. And I’m sure they did that more as a safety measure so that it was guaranteed to collapse so nobody would be injured.
But at the same time it still drove through a fucking wall. The integrity doesn’t mean shit because it drove through a literal fucking wall.
To be fair, if you were to construct a wall and paint it exactly like the road, people will run into it as well. That being said, Tesla shouldn’t rely on cameras.
Edit: having just watched the video, that was a very obvious fake wall. You can see the outlines of it pretty well. I’m also surprised it failed other tests when not on autopilot, seems pretty fucking dangerous.
To be fair, if you were to construct a wall and paint it exactly like the road, people will run into it as well.
this isn’t being fair. It’s being compared to the other, better, autopilot systems that use both LIDAR and radar in addition to daylight and infrared optical to sense the world around them.
Teslas only use daylight and infrared. LIDAR and radar systems both would not have been deceived.
The video does bring up human ability too with the fog test (“Optically, with my own eyes, I can no longer see there’s a kid through this fog. The lidar has no issue.”) But, as they show, this wall is extremely obvious to the driver.
The tesla would lose its shit if it sees this
They already have enough trouble with trucks carrying traffic lights, or with speed limit signs on them.
I’d take that bet. I imagine at least some drivers would notice something sus (due to depth perception, which should be striking as you get close, or the lack of ANY movement, or some kind of reflection) and either
- slow down
- use a trick, e.g. flicking lights or driving a bit to the sides and back, to try to see what’s off
or probably both; but anyway, as others already said, it’s being compared to other autopilot systems, not human drivers.
It’s a highly questionable approach that has raised concerns over Tesla trying to evade guilt by automatically turning off any possibly incriminating driver assistance features before a crash.
So, who’s the YouTuber that’s gonna test this out? Since Elmo has pushed his way into the government in order to quash any investigation into it.
It basically already happened in the Mark Rober video, it turns off by itself less than a second before hitting
I wondered how the hell it managed to fool LIDAR, well…
The stunt was meant to demonstrate the shortcomings of relying entirely on cameras — rather than the LIDAR and radar systems used by brands and autonomous vehicle makers other than Tesla.
They used to have it but Elmo removed it years ago as a cost cutting move.
Now they’re the only self driving car that drives into immovable objects.
You might remember a few years ago a guy got decapitated when his Model S drove straight into the side of a semi trailer.
The tl;dr here is that Elon said that humans have eyes and they work, and eyes are like cameras, so use cameras instead of expensive LIDAR. Dick fully inside car door for the slam.
If I could pass one law, requiring multiple redundant scanning tech on anything autonomous large enough to hurt me might be it.
I occasionally go to our warehouses, which have robotic arms, autonomous forklifts, etc. All of those have far more safety features than a self-driving Tesla, and they aren’t in public.
I bet the reason he does not want LiDAR in the car is really because it looks ugly aesthetically.
It costs too much. It’s also why you have to worry about panels falling off the swastitruck if you park next to them. They also apparently lack any sort of rollover frame.
He doesn’t want to pay for anything, including NHTSB crash tests.
It’s literally what Drumpf would have created if he owned a car company. Cut all costs, disregard all regulations, and make the public the alpha testers.
It did cost too much at the time, but currently he doesn’t want to do it because he would have to admit he’s wrong.
The panels are glued on. The glue fails when the temperature changes.
I can’t believe that this car is legal to drive in public.
Right? It’s also got a cast aluminum frame that breaks if you load the trailer hitch with around 10,000 lbs of downward force. Which means that the back of your Cybertruck could just straight up break off if you’ve frontloaded your trailer and hit a pothole wrong.
The guy bankrupted a casino, not by playing against it and being super lucky, but by owning it. Virtually everything he has ever touched in business has turned to shit. How in the living fuck do you ever screw up steaks at Costco? My cousin with one good eye and a working elbow could do it.
And now it’s the country’s second try. This time unhinged, with all the training wheels off. The guy is stepping on the pedal while stripping the car for parts and giving away the fuel. The guy doesn’t even drive; he just fired the chauffeur and is dismantling the car from the inside with a shotgun… full steam ahead into a nice brick wall and an infinity cliff, ready to take us all with him. And Canada and Mexico and Gina. Three and three quarters of a year more of daily atrocities and law-breaking. At least Hitler boy brought back the astronauts.
What would definitely help with the discussion is if Mark Rober, the scientist, had left a fucking crumb of scientific approach in his video. He didn’t really explain how he was testing it; he just slammed the car into things for views. This, plus a collaboration with a company that makes lidar, made the video open to every possible criticism, and it’s a shame.
Discovery channel level of dumbed down „science”.
Found the Tesla owner!
😋
I fucking hate tesla and elon musk. Also I fucking hate people calling unverifiable shit science
As Electrek points out, Autopilot has a well-documented tendency to disengage right before a crash. Regulators have previously found that the advanced driver assistance software shuts off a fraction of a second before making impact.
This has been known.
They do it so they can evade liability for the crash.
Any crash within 10s of a disengagement counts as it being on, so you can’t just do this.
Edit: added the time unit.
Edit2: it’s actually 30s not 10s. See below.
10 what?
Oops haha, 10 seconds.
Where are you seeing that?
There’s nothing I’m seeing as a matter of law or regulation.
In any case, liability (especially civil liability) is an absolute bitch. It’s incredibly messy and likely will not ever be so cut and dry.
Well it’s not that it was a crash caused by a level 2 system, but that they’ll investigate it.
So you can’t hide the crash by disengaging it just before.
Looks like it’s actually 30 seconds, not 10, or maybe it was 10s once upon a time and they changed it to 30?
The General Order requires that reporting entities file incident reports for crashes involving ADS-equipped vehicles that occur on publicly accessible roads in the United States and its territories. Crashes involving an ADS-equipped vehicle are reportable if the ADS was in use at any time within 30 seconds of the crash and the crash resulted in property damage or injury
https://www.nhtsa.gov/sites/nhtsa.gov/files/2022-06/ADAS-L2-SGO-Report-June-2022.pdf
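The quoted rule reduces to a simple time-window test. A minimal sketch (function and parameter names are mine, and it ignores the separate property-damage/injury condition):

```python
# NHTSA Standing General Order: a crash involving an ADS/Level 2 vehicle
# is reportable if the system was in use at any time within 30 seconds
# of the crash.
REPORTING_WINDOW_S = 30.0

def is_reportable(disengaged_at_s: float, crashed_at_s: float) -> bool:
    """True if the system was active within the 30 s window before impact."""
    return (crashed_at_s - disengaged_at_s) <= REPORTING_WINDOW_S

# Autopilot dropping out a fraction of a second before impact still counts:
print(is_reportable(disengaged_at_s=99.5, crashed_at_s=100.0))  # True
# Only a disengagement more than 30 s earlier falls outside the rule:
print(is_reportable(disengaged_at_s=60.0, crashed_at_s=100.0))  # False
```

So the shut-off-before-impact behavior doesn’t hide anything from the reporting requirement; at most it muddies the liability argument about who was “in control.”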
Thanks for that.
The thing is, though, the NHTSA generally doesn’t make a determination on criminal or civil liability. They’ll make the report about what happened, keep it to the facts, and let the courts sort out who’s at fault. They might not even actually investigate a crash unless it comes to it. It’s just saying “when your car crashes, you need to tell us about it,” and they kinda assume they comply.
Which, Tesla doesn’t want to comply, and is one of the reasons Musk/DOGE is going after them.
I knew they wouldn’t necessarily investigate it, that’s always their discretion, but I had no idea there was no actual bite to the rule if they didn’t comply. That’s stupid.
Not sure how that helps in evading liability.
Every Tesla driver would need superhuman reaction speeds to respond in 17 frames, 680 ms (I didn’t check the recording framerate, but 25 fps is the slowest reasonable), less than a second.
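The frame math checks out under that 25 fps assumption:

```python
# Convert a frame count into reaction time under an assumed framerate.
def frames_to_ms(frames: int, fps: float) -> float:
    return frames / fps * 1000.0

print(frames_to_ms(17, 25))  # ≈ 680 ms at the assumed 25 fps
print(frames_to_ms(17, 30))  # ≈ 567 ms if the clip was actually 30 fps
```

Either way the window is well under typical driver perception-reaction times, so “the human was in control” is doing a lot of work in that argument.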
They’re talking about avoiding legal liability, not about actually doing the right thing. And of course you can see how it would help them avoid legal liability. The lawyers will walk into court and honestly say that at the time of the accident the human driver was in control of the vehicle.
And then that creates a discussion about how much time the human driver has to have in order to actually solve the problem, or gray areas about who exactly controls what when, and it complicates the situation enough where maybe Tesla can pay less money for the deaths that they are obviously responsible for.
They’re talking about avoiding legal liability, not about actually doing the right thing. And of course you can see how it would help them avoid legal liability. The lawyers will walk into court and honestly say that at the time of the accident the human driver was in control of the vehicle.
The plaintiff’s lawyers would say the autopilot was engaged, made the decision to run into the wall, and turned off 0.1 seconds before impact. Liability is not going to disappear when there were 4.9 seconds of making dangerous decisions and peacing out in the last 0.1.
It’s not likely to work, but them swapping to human control after it determined a crash is going to happen isn’t accidental.
Anything they can do to mire the proceedings they will do. It’s like how corporations file stupid junk motions to force plaintiffs to give up.
If it knows it’s about to crash, then why not just brake?
AEB was originally designed not to prevent a crash, but to slow the car when an unavoidable crash was detected.
It’s since gotten better and can also prevent crashes now, but slowing the speed of the crash was the original important piece. It’s a lot easier to predict an unavoidable crash, than to detect a potential crash and stop in time.
Insurance companies offer a discount for having any type of AEB as even just slowing will reduce damages and their cost out of pocket.
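The physics behind that discount: crash energy scales with the square of speed, so even partial braking pays off disproportionately. A quick illustration (numbers are arbitrary):

```python
# Kinetic energy goes as v^2, so shaving speed before impact
# removes energy faster than it removes speed.
def remaining_energy_fraction(impact_speed: float, initial_speed: float) -> float:
    """Kinetic energy at impact as a fraction of the pre-braking energy."""
    return (impact_speed / initial_speed) ** 2

# Braking from 50 mph down to 40 mph before the hit:
print(remaining_energy_fraction(40, 50))  # ≈ 0.64, i.e. ~36% less crash energy
```

A 20% speed reduction cutting crash energy by about a third is why insurers credit even the crude early AEB systems.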
Not all AEB systems are created equal though.
Maybe disengaging AP when an unavoidable crash is detected triggers the AEB system? Like maybe for AEB, which should always be running, to take over, AP has to be off?
That makes so little sense… It detects it’s about to crash then gives up and lets you sort it?
That’s the opposite of my Audi, which does detect I’m about to hit something and gives me either a warning or just actively hits the brakes if I don’t have time to handle it.
If this is true, this is so fucking evil it’s kinda amazing it could have reached anywhere near prod. The self-driving equivalent of “Jesus, take the wheel!”
Looney Tunes shit.
And the president is driving one of these?
Maybe we should be purchasing lots of paint and cement blockades…
When he was in the Tesla asking if he should go for a ride I was screaming “Yes! Yes Mr. President! Please! Elon, show him full self driving on the interstate! Show him full self driving mode!”
The president can’t drive by law unless on the grounds of the White House and maybe Camp David. At least while in office. They might be allowed to drive after leaving office…
I don’t think Trump can drive. As in, he doesn’t even know what the pedals do.
The real question is, in a truly self-driving car, (not a tesla) are you actually driving?
Why would a car that expensive not have a LiDAR sensor?
Cameras are cheaper…that’s it
Read about this somewhere. Iirc, Elon felt cameras were better than LiDAR at a time when that was kinda true, but the technology improved considerably in the interim and he pridefully refuses to admit he needs to adapt. [Edit: I had hastily read the referenced article and am incorrect here; link to accurate statements is linked in a reply below.]
He didn’t think they were better. He thought Tesla could get away without the more expensive lidar: “Humans can drive with just vision; that should be enough for an autonomous vehicle also.” Basically he did it because lidar is more expensive.
I didn’t think it was about the cost. I think he just likes to be contrarian because he thinks it makes him seem smart. He then needs to stick by his stupid decisions.
I’m assuming it’s cost, because it makes sense to me. His goal was to build full self-driving (FSD) into every car and sell the service as a subscription.
If you add another $500 in components, then that’s a lot of cost (probably a lot cheaper today, but this was 10 years ago). Cameras are cheap and can be spread around the car with additional non-FSD benefits, whereas lidar has much fewer uses when the cost is not covered. I think he used his “first-principles” argument as a justification to the engineers, as another way for him to say “I don’t want to pay for lidar; make it work with the cheap cameras.”
Why else would management take off the table an obviously extremely useful safety tool?
Why else would management take off the table an obviously extremely useful safety tool?
What makes you think people make rational decisions? Especially sociopaths like Musk?
Because Musk insists that cameras are better and that LiDAR is flawed
That’s not really true.
He uses lidar in SpaceX because he knows it’s the right tool for that specific job.
His stance isn’t that cameras are better, but that cameras have to be so good for a true AV that putting effort into both means you’re not going to make your cameras good enough, and you’ll end up relying on lidar instead. That and cost.
If the car can’t process and understand the world via cameras, it’s doomed to fail at a mass scale anyway.
It might be a wrong stance, but it’s not that lidar is flawed.
Tesla even uses lidar to ground truth their cameras
Edit: just adding a late example - Waymo, Cruise, and probably everyone out there still use humans to tell the car what to do if it gets stuck. I even bet Tesla will if they ever launch a robotaxi, as they need a way to somehow help the car if it gets stuck. When we see these failures with Waymo and Cruise, it’s less “is something there” and more “I don’t understand this situation”. The understanding comes from vision. Lidar just gives the “something is there,” but it isn’t solving their problem.
I think the bigger issue is that he is saying redundancy is not important. He thinks cameras could be good enough; well, fine, but failure results in loss of life, so build in redundancy: lidar, radar, anything to fail over to. Cutting costs, or believing that one system is good enough, is despicable.
Because Tesla makes money, with the byproduct of cars.
There was a comedy channel on Youtube aeons ago that would do “if x were honest” videos. Their slogan for Valve was “We used to make games. Now we make money.”
Honest Ads is still around, they’ve just moved off the Cracked channel like how PitchMeetings moved off the ScreenRant channel.
It wasn’t Cracked, it was a channel called Gaming Wildlife, last video on the channel was posted 6 years ago; I think they’re defunct. here’s the video in question.
Because commonly they use radar instead, the modern sensors that are also used for adaptive cruise control even have heaters to defrost the sensor housing in winter
Painted wall? That’s high tech shit.
I got a Tesla from my work before Elon went full Reich 3, and it would:
- brake on bridge shadows on the highway
- start wipers on shadows, but not on rain
- brake on cars parked on the roadside if there’s a bend in the road
- disengage Autopilot and brake when driving towards the sun
- change set speed at highway crossings, because fuck the guy behind me, right?
- engage emergency braking if a bike waits to cross at the side of the road
To which I’ll add:
- moldy frunk (short for fucking trunk, I guess?), no ventilation whatsoever, water comes in, water stays in
- pay attention noises for fuck-all reasons masking my podcasts and forcing me to rewind
- the fucking cabin camera nanny - which I admittedly disabled with some chewing gum
- the worst mp3 player known to man, the original Winamp was light years ahead - won’t index, won’t search, will reload USB and lose its place with almost every car start
- bonkers UI with no integration with Android or Apple - I’m playing podcasts via low rate Bluetooth codecs, at least it doesn’t matter much for voice
- unusable airco in auto mode, insists on blowing cold air in your face
Say what you want about European cars, at least they got usability and integration right. As did most of the auto industry. Fuck Tesla, never again. Bunch of Steve Jobs wannabes.
“Dipshit Nazis mad at facts bursting their bubble of unreality” is another way of reading this headline.
I believe the outrage is that the video showed that autopilot was off when they crashed into the wall. That’s what the red circle in the thumbnail is highlighting. The whole thing is apparently a setup for views, like Top Gear faking the Model S breaking down.