A widely published dig by a Volvo engineer against Tesla's
Autopilot self-driving technology is not a swipe at the company
itself, but at our entire understanding of what autonomous driving
should be.
Trent Victor, Volvo's senior technical leader of crash avoidance, said in an interview with The Verge
that what Tesla claims is a game-changer is actually a simple
semi-autonomous technology that looks far more advanced than it really
is. Victor said he fears Autopilot "gives you the impression that it's
doing more than it is...[Autopilot] is more of an unsupervised wannabe."
But Victor has only singled out
Autopilot because it is the most talked-about, the most impressive and,
seemingly, the most advanced system on the road today. Broadly speaking,
Tesla is using the same technology (provided by Israeli company
Mobileye) as many other car companies, including BMW and Volkswagen
Group.
Tesla might be pushing the technology much
harder than its rivals and achieving more impressive results, but
Autopilot still falls into the same vague category of 'level three
autonomy', where it isn't clear whether the driver needs to be fully
alert, and where the system can stop working at any moment.
First, a little explanation of how autonomous driving is categorised. The widely used SAE scale runs from level zero to level five:
Level 0: no automation; the driver does everything.
Level 1: driver assistance, such as adaptive cruise control.
Level 2: partial automation; the car can steer and manage its speed, but the driver must supervise at all times.
Level 3: conditional automation; the car drives itself in certain conditions but may hand control back to the driver at short notice.
Level 4: high automation; within a defined area or road type the car handles everything, including bringing itself to a safe stop if the driver does not respond.
Level 5: full automation, anywhere and in all conditions.
Back to that Volvo/Tesla spat:
Volvo's Drive Me autonomous car experiments
will kick off next year at level four. Its cars will be capable of
driving themselves all of the time (on motorways) and of handling any
situation presented to them without human intervention; even if
something goes wrong and the car does not know what to do next, it is
programmed to come safely to a stop at the side of the road.
Victor added: "In our concept, if you don't
take over, if you have fallen asleep or are watching a film, then we
will take responsibility still. We won't just turn [the self-driving
system] off [as Autopilot does]. We take responsibility and we'll be
stopping the vehicle if you don't take over."
Volvo had previously said it would take full responsibility for all accidents caused by its autonomous vehicles.
Victor added that Volvo is "also programming it for extreme events like
people walking in the road even where they're not supposed to be.
There's a massive amount of work put into making it handle a crash or
conflict situations."
Blame the game, not the players
Although Victor has singled out Tesla and
Autopilot here, his argument seems to be against level three autonomy in
general, and the way it blurs the line between the driver needing to pay
attention and the car doing everything for them.
Level two and three systems, like Autopilot,
appear to be more in control (and therefore more autonomous and
trustworthy) than they actually are. IBTimes UK recently compared Autopilot with BMW's own system
in a new 7 Series and, despite the two using broadly the same
technology, the BMW system felt more conservative, while Autopilot
pushed the technology much harder to deliver a seemingly more automated
experience.
Tesla says Autopilot is still in the 'beta'
development stage and is constantly learning as it is used. The system
will no doubt improve and offer a very different experience once the
Model 3 goes on sale at the end of 2017. By then, we also expect to see
other carmakers pushing into level four territory. But for the next 12
months, carmakers need to push through the ambiguous level three as quickly
(and safely) as possible.