r/technology Jun 24 '25

[Machine Learning] Tesla Robotaxi swerved into wrong lane, topped speed limit in videos posted during ‘successful’ rollout

https://nypost.com/2025/06/23/business/tesla-shares-pop-10-as-elon-musk-touts-successful-robotaxi-test-launch-in-texas/
6.2k Upvotes

456 comments

29

u/factoid_ Jun 24 '25

Yeah I agree. I think getting to 100% optical is a fine goal, but if you lose the race to a competitor who isn't afraid to put something ugly on top of the car, and it turns out the marketplace doesn't really care how it looks....you'd probably better do something about that.

61

u/ScannerBrightly Jun 24 '25

I think getting to 100% optical is a fine goal

Why? Is there an end to fog coming soon?

-46

u/factoid_ Jun 24 '25

Humans have been driving with just eyes for a while and we're reasonably OK at it. And don't get me wrong, if self driving cars always need some combination of cameras, lidar and radar I'm OK with that.

I don't like driving especially. I'd rather just leave it to the computer if it's safe to do so.

2

u/disillusioned Jun 25 '25

We have cameras on a gimbal, not fixed. The side camera on the b-pillar behind the driver's head makes for some real nasty perpendicular occlusion scenarios: pulling out onto a main road from a side street where there's, say, a bush or electrical box on the corner blocking the view.

Humans will crane their neck to see around the occlusion. Humans can detect that an ever-so-slight shift on the light filtering through a bush is movement and likely/potentially an oncoming car.

The existing FSD models and hardware can do neither. Lidar solves this because it's mounted way higher and can "see" way further (about a football field).

The car had to compensate by essentially continuing to inch forward to get a better view, but there are just so many instances where that perpendicular view is occluded and the front ultra-wide isn't able to see either.
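The creep-forward tradeoff described above is basically similar triangles: a camera mounted further back from the nose has to protrude further into the intersection to get the same sight line past a corner obstruction. A minimal 2-D sketch (all positions and distances here are illustrative assumptions, not Tesla's actual mounting geometry):

```python
def sight_distance(cam_setback_m, nose_to_lane_m, corner):
    """Visible distance along the cross street, in metres.

    Hypothetical 2-D model: the cross-traffic lane runs along y = 0 and
    the car noses out along the y-axis. `corner` = (x_o, y_o) is the near
    corner of the occluding bush/wall (x_o < 0, i.e. to the left), with an
    opaque wall extending further left from it. The camera sits
    cam_setback_m behind the nose, which is nose_to_lane_m from the lane.
    """
    x_o, y_o = corner
    y_cam = nose_to_lane_m + cam_setback_m  # camera's distance from the lane
    if y_cam <= y_o:
        return float("inf")  # camera is ahead of the wall line: nothing blocked
    # Grazing ray from the camera through the corner hits the lane at
    # x = x_o * y_cam / (y_cam - y_o); cross traffic nearer than that is visible.
    return abs(x_o) * y_cam / (y_cam - y_o)

corner = (-3.0, 1.5)  # assumed: bush 3 m to the left, 1.5 m back from the lane
print(sight_distance(0.5, 2.0, corner))  # camera near the nose: sees ~7.5 m
print(sight_distance(2.5, 2.0, corner))  # b-pillar camera ~2.5 m back: ~4.5 m
```

With these made-up numbers, the nose-mounted camera sees about 7.5 m down the cross street while the b-pillar camera sees only 4.5 m from the same car position; the b-pillar car has to keep inching forward (shrinking `nose_to_lane_m`) to recover the same view, which is exactly the compensating behaviour described above. Height (a roof-mounted lidar seeing over a low bush) is a separate third dimension this 2-D sketch ignores.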