Self-driving startup TuSimple is reportedly under investigation after attributing an autonomous-truck crash to “human error”. Autonomous-vehicle researchers at Carnegie Mellon University say that blaming humans is misleading, and that common safeguards would have prevented the accident.
In April, an autonomously driven semi-trailer truck equipped with TuSimple technology was traveling on a highway in Tucson, Arizona, when it suddenly veered left and hit a concrete barrier, according to dashcam footage leaked to YouTube.
TuSimple blamed the crash on “human error”, but an internal report reviewed by The Wall Street Journal suggests that pinning the crash on a human is an oversimplification.
According to the internal report, the accident occurred because “someone in the cabin did not restart the autonomous driving system properly before it was engaged, causing an old command to be executed,” The Wall Street Journal reports.
Essentially, the left-turn command was 2.5 minutes old; it should have been cleared from the system, but it wasn’t.
But self-driving-vehicle researchers at Carnegie Mellon say that blaming humans is misleading, and that common safeguards would have prevented the accident.
A truck should not respond to commands that are even a few hundredths of a second old, and the system should never allow a self-driving truck to turn so sharply while traveling at 65 mph, Phil Koopman, an associate professor at Carnegie Mellon University, told the newspaper.
On Tuesday, TuSimple said in a blog post that “we take our responsibility to find and resolve all safety issues very seriously,” adding that it responded to the April incident by “immediately ground[ing] our entire autonomous fleet” and launching an independent review to determine the cause of the accident.
“With what we learned from this review in hand, we have upgraded all of our systems with new automated system checks to prevent this type of human error from happening again and have reported the incident to NHTSA and the Arizona Department of Transportation,” the company added.
However, the National Highway Traffic Safety Administration (NHTSA) is joining the Federal Motor Carrier Safety Administration (FMCSA) in investigating the San Diego-based company.
The FMCSA said in a letter that it had begun a “safety compliance investigation” of TuSimple, citing the April incident.
TuSimple isn’t the only self-driving vehicle company under NHTSA investigation.
The federal agency has also launched an investigation into a fatal car crash involving Tesla’s Autopilot driver-assistance system. The latest Tesla crash under federal investigation resulted in three deaths.
In June, the federal investigation of Tesla’s Autopilot function escalated, with NHTSA now examining whether the feature is defective. The agency is studying data on 200 Tesla crashes, noting that, on average, Autopilot “aborted vehicle control less than one second prior to the first impact.”