The whole problem is (as others have noted below this article) that 99.99999% of drivers in the world have no experience with car testing or with beta software, let alone the combination. It's ridiculous that Tesla would let the average driver do that.
Other manufacturers roll out new functions cautiously, climbing slowly from beeps and warning lights, to a little assistance with steering, acceleration and/or braking, to something like partially automated traffic-jam assist. Tesla jumps straight to the theoretical goal of getting from A to B without the driver having to do anything, and then complains when it gets held back.
They simply can't do what Tesla wants safely enough yet. Just look on YouTube at how many videos there are of Tesla's Autopilot missing (temporary) road signs or failing to see that a car ahead is stationary. Not to mention other drivers doing unexpected things the Autopilot hasn't anticipated: if the car in front suddenly brakes hard or abruptly changes lanes, the Autopilot goes haywire. It's great that its reaction in that moment is faster than a human's, but an average driver wouldn't even end up in those situations in the first place.
So the tricky part is that Tesla's Autopilot promises a lot but isn't reliable enough in exactly the exceptional situations where reliability matters most. It makes perfect sense to me that Tesla isn't allowed to do what it wants here in Europe, because it simply isn't safe.
All other manufacturers are subject to the same rules, but they demonstrate that the limited functions they provide work correctly and don't suddenly break from one update to the next. If Tesla can prove the same, there's nothing to worry about.
[Comment edited by Oon on 21 February 2022 11:32]