
Self-driving cars rely on artificial intelligence (AI) to make rapid driving decisions. This is a considerable change from conventional cars, where a person makes every decision. When these smart cars crash and hurt someone, figuring out who is truly to blame becomes a brand-new puzzle for the law.
Traditional injury law was written only for mistakes made by human drivers. As of August 26, 2025, the California DMV had officially recorded over 875 collision reports involving autonomous vehicles being tested. This growing number shows why new laws must be enacted to address this high-tech issue fairly.

In California, liability and enforcement depend on who is in control of the vehicle. When a safety driver is present, standard traffic law applies to the human. For driverless operation, AB 1777 (2024) establishes a process for citing the manufacturer via a notice of autonomous-vehicle noncompliance, effective July 1, 2026.
When a safety driver is present, this approach treats the AI car like any other machine a person must supervise. For testing permits, manufacturers must hold insurance, a surety bond, or proof of self-insurance in the amount of $5 million. California law requires this high amount to cover the serious potential risks associated with the new technology.

Courts find it challenging to apply the traditional concept of negligence, which asks whether someone was careless, to an AI system.
The law must determine whether a computer’s choice, made through complex mathematical calculations, can be deemed “reasonable” in court the same way a human’s choice would be.
California’s SB 572 (2025), a pending bill, proposes fining companies that fail to report crashes involving cars equipped with Level 2 driver-assist systems. The proposed penalty is $26,315 per day for each crash a manufacturer fails to report.

Product liability offers another theory: the AI system can be treated as a defective product when poor programming or a design flaw causes a crash. If the AI is defective, the law can shift the blame from the car owner to the technology companies that built the system.
In August 2025, a Florida jury decided that the car company Tesla was partly to blame for a deadly car crash that happened in 2019. The jury ruled that both Tesla and the driver were at fault.
The jury ordered Tesla to pay approximately $243 million to the victims’ families, a sum that included punitive damages meant to penalize the company. Tesla said it will appeal the decision.

It is very challenging to prove that an AI system caused a crash because it requires thorough checks of the car’s software systems and sensor data. Lawyers must hire technical experts who can understand and explain the computer’s internal decisions, often referred to as a “black box.”
California’s SB 480 (2025) requires autonomous vehicles to retain sensor data for at least 30 seconds before a collision. This data must be stored in a read-only format so companies cannot alter it. This crucial rule helps injured people prove exactly what the AI was doing right before the accident.
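A retention requirement like this is typically met with a rolling pre-event buffer that always holds the most recent seconds of sensor readings. The sketch below is purely illustrative and not based on any manufacturer’s actual implementation; the 10 Hz frame rate, the field names, and the `PreCollisionBuffer` class are all assumptions made for the example.

```python
from collections import deque

# Illustrative only: a rolling buffer that always holds the most recent
# 30 seconds of sensor frames, assuming a hypothetical 10 Hz sensor feed.
FRAME_RATE_HZ = 10          # assumed sampling rate
RETENTION_SECONDS = 30      # the SB 480 minimum described above

class PreCollisionBuffer:
    def __init__(self):
        # A deque with maxlen discards the oldest frame automatically,
        # so memory use stays fixed no matter how long the car drives.
        self.frames = deque(maxlen=FRAME_RATE_HZ * RETENTION_SECONDS)

    def record(self, frame):
        """Append one sensor reading (called continuously while driving)."""
        self.frames.append(frame)

    def freeze(self):
        """On a collision trigger, snapshot the buffer as an immutable
        tuple -- a stand-in for writing it to read-only storage."""
        return tuple(self.frames)

buf = PreCollisionBuffer()
for i in range(1000):                      # simulate 100 s of driving
    buf.record({"t": i / FRAME_RATE_HZ, "speed_mps": 12.0})

snapshot = buf.freeze()
print(len(snapshot))                       # 300 frames = 30 s at 10 Hz
print(snapshot[0]["t"])                    # oldest retained frame: t = 70.0
```

The fixed-size buffer means older data is continuously overwritten, while the frozen snapshot preserves exactly the pre-collision window the law cares about.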

Accidents involving self-driving cars often involve multiple parties who could all be responsible for the crash. This can include the car maker for the brakes, the software designer for the AI brain, and the sensor companies for bad sensors. The total responsibility is often shared among these different parties.
According to data collected by the National Highway Traffic Safety Administration (NHTSA) from July 2021 to mid-2025, vehicles with Level 2 ADAS (requiring a human driver) were involved in a much higher number of crashes and fatalities than fully autonomous vehicles.

When the blame for an AI-caused crash is shared among multiple parties, receiving compensation for injuries is challenging because regular car insurance may not cover the error of a technology company. A meticulous investigation into the exact cause of the crash must always be conducted.
For fully self-driving cars, a study by Waymo and Swiss Re found that Waymo had 92% fewer bodily injury claims and 88% fewer property damage claims than human-driven cars. This data suggests that liability risk is shifting away from drivers and toward vehicle manufacturers and their technology.

California lawmakers and courts are working diligently to establish new legal guidelines for AI car liability. They aim to establish clear standards for how computer brains must perform and what car manufacturers are responsible for, ensuring that injured people receive fair compensation.
California’s new autonomous vehicle regulations now require manufacturers to submit clear steps for how their cars interact with law enforcement to the DMV. The DMV can now issue fines and temporary permit suspensions to companies that fail to follow these rules.

The legal system must be careful to support new technological innovation while ensuring that people hurt in AI-caused accidents get fair compensation. This requires setting special, strict safety requirements for autonomous vehicles.
New California laws require self-driving vehicles to be designed to safely recognize and respond to signals from first responders, such as police and fire trucks. Manufacturers must also install special fail-safe mechanisms that enable emergency workers to disable the car if it becomes stuck or experiences a software problem.

California SB 1398, which took effect in 2023, is a significant law that prohibits car manufacturers from using confusing terminology for their driving-assist features. The law bans words like “self-driving” for cars that still need a human to pay attention, because those words often trick drivers into trusting the system too much.
This law requires car companies to provide clear warnings about the car’s actual performance limits. A 2023 NHTSA study, as reported in a legal analysis, found that Tesla’s Autopilot was a factor in nearly 49% of the 956 ADAS crashes reported nationwide in the study.

California AB 1777 (2024) is a significant new law that alters the responsibility for issuing traffic tickets when a driverless car violates the law. If a fully autonomous vehicle commits a traffic violation without a person in the driver’s seat, the manufacturer is the one who must be cited, not the human operator.
This law requires a fully autonomous vehicle manufacturer to maintain a dedicated emergency response telephone line for police to use in the event of an emergency. Additionally, by July 1, 2026, manufacturers must equip each autonomous car with a two-way voice communication device for emergency communication.

California requires manufacturers testing self-driving cars to report any crash that causes property damage, injury, or death to the DMV. This rule helps the state track the safety of the new technology on public roads.
The California DMV’s official safety dashboard reports that as of August 26, 2025, the state has recorded over 875 Autonomous Vehicle Collision Reports. These reports cover crashes involving autonomous vehicles with testing permits from October 2014 to 2025.

Waymo, a company that operates self-driving ride services, has been involved in numerous reported incidents during its operations. The accident numbers offer a real-world perspective on how the technology performs on the road.
Between 2021 and 2024, 696 Waymo accidents were reported across three primary states: California, Arizona, and Texas. Most of these accidents, 454 incidents, occurred in California alone, indicating that many AI vehicles are operating there.

Tesla Autopilot is a Level 2 Advanced Driver Assistance System (ADAS), meaning the human driver must always remain engaged and supervise the car. It is not fully self-driving, and a driver must be ready to take back control at any moment.
NHTSA’s Standing General Order database has recorded thousands of Level-2 ADAS crashes since mid-2021; external analyses indicate Tesla accounts for the majority of those reports, though totals vary by snapshot and filtering.

Setareh Law is well-equipped to handle the complex technical and legal issues associated with AI-related car accidents in California. The firm combines its knowledge of traditional injury law with the high-tech understanding these new cases demand, working closely with technical experts.
The firm has recovered over $250 million in compensation for its clients in personal injury cases. They also boast a strong history of client satisfaction, having received over 400 five-star reviews on Google.
Curious how new insurance limits could impact your wallet? Read more in what higher state minimum car insurance limits mean for drivers in 2025.

Setareh Law collaborates with specialized professionals, including accident reconstruction specialists and industry experts, to gather detailed evidence for AI cases. They build strong legal arguments that hold all responsible parties accountable, from software developers to car manufacturers.
The firm’s legal team has over 60 years of combined legal experience in personal injury law throughout California. They operate on a contingency fee basis, which means clients do not have to pay any attorney fees until the firm successfully wins the case and recovers compensation for their injuries and damages.
Want to see how Tesla handled past Autopilot crash claims? Check out Tesla settles lawsuits tied to 2019 California Autopilot crashes.
This slideshow was made with AI assistance and human editing.