Ford to take driver out of loop on path to full autonomy
Ford is to skip partially autonomous Level 3 cars due to concerns over driver safety.
As reported by Bloomberg, the carmaker’s tests have found that engineers testing the semi-autonomous cars – and required to step in if needed – have actually fallen asleep at the wheel. This is despite the tests deploying stimuli such as buzzers and vibrating seats.
As such, Ford has decided not to introduce Level 3 cars and will only produce fully autonomous Level 5 cars, which are expected to debut in 2021.
Speaking to Bloomberg, Raj Nair, Ford’s product development chief, said: “These are trained engineers who are there to observe what’s happening. But it’s human nature that you start trusting the vehicle more and more and that you feel you don’t need to be paying attention.”
The carmaker’s strategy differs from that of rivals including BMW, Mercedes-Benz and Volkswagen, which plan to introduce Level 3 ‘conditional’ driving vehicles.
Stan Boland, CEO of UK-based autonomous vehicle start-up, FiveAI, commented: “An intrinsic issue with driver assistance technologies such as Ford’s that are only Level 2 or Level 3 autonomy, and that automate the driving task to a large degree, is that drivers quickly build up overconfidence in the systems’ abilities and begin to disengage from providing continual oversight. This can be very dangerous for all road users, or even fatal in some cases.
“At FiveAI, we believe the safest approach is to avoid sharing any element of the driving task with humans: our system will be capable of handling any situation that it comes across without any human intervention. This is described as Level 5 autonomy. It’s much harder than any other level, but we believe it’s ultimately safer and is the key to enabling the shift to consumption of Mobility-as-a-Service (MaaS), with all the cost, convenience and environmental benefits that will bring.”
Tesla’s Autopilot semi-autonomous driving system was recently cleared of defects by the US Department of Transportation’s National Highway Traffic Safety Administration (NHTSA) after a seven-month investigation prompted by a crash.
The crash saw a Tesla Model S hit by an articulated lorry while using the Autopilot semi-autonomous driving feature in beta mode.
At the time of the crash, the Model S was travelling on a divided highway when the articulated lorry drove across the highway perpendicular to it. In a statement, the carmaker said that neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brakes were not applied. As a result, the Model S passed under the trailer.
In its report, the NHTSA said such systems “require the continual and full attention of the driver to monitor the traffic environment and be prepared to take action to avoid crashes”.