The video starts with a rant against The Dawn Project and its founder, Dan O’Dowd, who ran a full-page advertisement in the New York Times arguing “Tesla Full-Self Driving must be removed from our roads until it has 1000 times fewer critical malfunctions” and “…offering [US]$10,000 to the first person who can name another commercial product from a Fortune 500 company that has a critical malfunction every 8 minutes.”
The actual driving video starts at the 2:34 mark.
I only watched about half the drive. Based on that, here are a few observations:
The autonomous system is overly cautious. So much so that it would have interfered with the normal flow of traffic. It even earned a honk from a human driver. A bunch of these cars might cause (even more) massive traffic jams.
It makes plenty of errors — mostly on the side of caution but errors nevertheless. It has the driving skills of a teenager with a learner’s permit.
There is a tension between safety and efficiency. It may be true that the autonomous system would be as safe as, or possibly safer than, a good human driver but there is a tradeoff: it’s slower. An equivalent way to improve auto safety would be to lower the speed limit everywhere to 25 mph. Then we could save even more lives! This is not a tradeoff anyone is interested in making.
One thing that has not been tested in self-driving cars is how a whole bunch of them together would perform, while still in the presence of pedestrians, cyclists, and other hazards. I could easily imagine the small errors of each combining to produce gridlock or accidents.
The main thing I learned from the video is that the system is not able to adapt to unexpected situations. They still have a long way to go before they can be truly autonomous.
Particularly when the government gets involved.
Tesla’s “assertive mode” busted by NHTSA. “Sheep mode” and “lemming mode” still under review.
I’d wondered about the “rolling stops” I saw it doing in the videos. That was flat illegal everywhere I took a driver’s test (despite seeing everybody do it all the time on the road), but I didn’t know if the rules had been changed.
Of course, once all cars have Tesla-type vision systems and links back to headquarters, they’ll report such behaviour to the Man so offenders can be fined and have their Social Credit scores decremented.
Rolling stops remain illegal. In fact, a cop pulled me over for a rolling stop on my bicycle. Got off with a warning.
Unlike cars, bikes fall over when they come to a complete stop. The time required to bring a bike to a complete stop includes unclipping, dismounting, remounting, clipping back in, then finally accelerating out of the way of impatient car drivers. In the cycling community it is well understood that the Idaho Stop is by far the most efficient way, and often the safest way, to navigate rural intersections on a bicycle.
Massachusetts requires open access telematics, causing manufacturers to delete the feature:
Just a thought, it seems AI already beats intoxicated drivers.
I would not be surprised to see a breathalyzer built into future tech that overrides manual control, and this might just be the wedge to get humans to give control of driving to tech.
But there will always be excellent drivers who do not drink and drive – and who are willing and able to rewire/reprogram their devices to evade Big Brother.
In the not too distant AI future, it may be that the best and brightest are outlaws.
That’s not going to buff out: