
Waldorf, MD Auto Accidents: Can I Sue if I am Injured By a Self-Driving Vehicle?

Yes, absolutely. If you have been the innocent victim of a Waldorf, Maryland accident caused by a self-driving or autonomous vehicle, you can sue for compensation and/or make an insurance claim. If this happens to you, call us here at the Law Office of Robert Castro. Our number is (301) 705-5137. You will need deeply experienced Maryland personal injury attorneys because these types of cases are legally complex. The complexity lies in the fact that, depending on the facts, a driverless vehicle accident might be a negligence case, like a typical Maryland auto accident, or a products liability case arising from a defect in the vehicle, in the automated piloting software, or in some other aspect of the system.

Self-driving vehicles seem inevitable on the roads and highways of Maryland. Despite the manufacturers’ claims of perfect safety, there have been plenty of accidents involving driverless cars and trucks. As an example, one occurred recently in Arizona involving an autonomous truck. See the media report here.

In that case, the driverless semi-tractor trailer executed a left turn from the middle lane of an expressway and crashed into the concrete median separating its lanes from oncoming traffic. As the video shows, the truck narrowly avoided hitting a pickup truck that was passing on the left. Fortunately, no one was injured in the accident, but the example shows that driverless vehicles are not perfectly safe.

The example also shows how a case might involve either negligence or products liability. The company that owns the truck blamed human error by one of the human safety monitors riding in the truck, while an investigation has implicated faulty software. Either way, under Maryland law, a victim injured, or the family of a victim killed, in an accident with a driverless vehicle will be entitled to sue or make an insurance claim.

If the claim is based on human actions, then the victim must prove the four legal elements of negligence: duty, breach, causation, and injury. Human safety monitors inside a driverless vehicle have the same duties as others on the road, including the duty to keep a careful watch for other road users and to act in a manner that prevents injuries to them. The safety monitors may also have additional duties based on their tasks and responsibilities for the vehicle and its software. In the Arizona truck case, according to the report, one of the monitors allegedly failed to reboot the computer system before engaging it. As a result, the software executed a “left-turn” command that was 2.5 minutes old. That stale command would have been erased, and not executed, had the system been properly and timely rebooted. Had there been injuries, the failure to reboot the system would likely have been deemed negligence, and the truck’s owner would have been held legally liable under a theory of negligence.

If the claim is based on design or manufacturing defects, then the victim must prove those defects under Maryland law. With respect to the Arizona self-driving truck accident, the media report suggests an obvious design defect in the software: old, outdated commands are not automatically erased or overwritten.

Contact Waldorf, MD Personal Injury Attorney Robert Castro Today

This article has been provided by the Law Office of Robert Castro. For more information or questions, contact our office to speak to an experienced Maryland personal injury lawyer at (301) 870-1200. We are Waldorf, MD personal injury lawyers. Our address is 2670 Crain Highway, Waldorf, MD 20601.
