
Alexander Law


The recent fatal crash involving a Tesla Model S and a tractor-trailer in Williston, Florida has consumer advocates worried. The company has aggressively deployed and marketed its so-called Autopilot driving-assist system in what it describes as a “beta test.” Tesla’s Autopilot technology involves multiple systems, including automatic steering, cameras, radar, electronic sensors and other data sources. While this self-driving technology has many consumers excited over the prospect of advanced autonomous driving, industry leaders, public safety agencies and consumer advocates still have questions.

Aviation Industry: Attentive Pilots and Autopilot

When we think of autopilot systems, the commercial aviation industry immediately comes to mind. Aviation autopilot systems are designed to signal a pilot in the event of a potential emergency. The pilot can then take over and skillfully intervene in the situation. Even when using the autopilot, pilots remain attentive to the task at hand. In 2009, Chesley Sullenberger famously intervened on a US Airways flight, landing tail first on the Hudson River and saving a plane full of passengers. Autopilots are of no use when the pilot is totally removed from hands-on control. We saw that clearly in the Asiana crash in 2013, where multiple pilots operating outside the parameters of the safety system ignored gross errors that caused the plane to hit the seawall at San Francisco International Airport.

Tesla’s “Autopilot” Marketing Message

Perhaps most perilously, Tesla’s marketing of its so-called Autopilot feature has given consumers a false sense of security. Although Tesla believes that a self-driving car on autopilot is achievable, that does not mean the technology is fully reliable. The concept of an autopilot decreasing the workload on a pilot makes sense, but it must be combined with a warning system that assures the pilot’s full attention and optimal reaction time. Under no circumstances do we want a pilot to become a backseat driver. This is new technology in its infancy, and releasing it to the public without fully testing it is foolish. It will take decades before this technology is reliable in real-life situations, such as a tractor-trailer turning left in front of a car.

Guidance Isn’t Good Enough

At this time, consumer groups are pressing the Obama administration not to issue guidance on autonomous vehicle operations next week when federal transportation officials address an industry conference in San Francisco. With autonomous vehicle technology still evolving and emerging, guidance simply isn’t going to cut it. We need real legislation and rulemaking that puts enforceable standards in place. While other companies are researching autonomous driving technology, Tesla is the only automaker that allows drivers to have their hands off of the steering wheel for minutes at a time. Driving requires skillful hand-eye coordination and attention to the changing circumstances of road surface, traffic and the universal perils that automation cannot yet address.

“User Error” Vs. “Use Error”

Tesla’s technology is welcome, provided we keep in mind that the field of human factors engineering has devoted itself to addressing the automation of complex safety systems. William Hyman, professor emeritus of engineering at Texas A&M University, coined the distinction between “user error” and “use error” in a system. We know that to assure safety, a system must contain embedded warnings that prompt intuitive action, consistent with the real world and with an operator’s skill. Tesla’s claim that the Florida crash and the driver’s death were due to user error begs the question. It is the common defense raised by manufacturers in defective product lawsuits. The defense of “user error” must not be confused with “use error” built into a system that creates the conditions for avoidable errors, events, injuries and deaths to occur.

The Complexities of Motor Vehicle Operation

Years of representing survivors with traumatic brain injury, and learning from neurosurgeons, clinical psychologists, and occupational, speech and physical therapists, have taught me to appreciate the sophistication of the human brain and the ability to use technology to increase performance. Driving is incredibly complex, although we do it every day without much thought about the actual skills required for reliable perception and reaction. The tremendous increase in collisions, injuries and deaths caused by making calls and texting on mobile devices proves the necessity of full attention while driving.