
Autonomous driving expert Sebastian Thrun predicts that the pure vision approach to self-driving will succeed by 2026. This method uses only cameras and deep neural networks to drive the car.
It contrasts with the multi-sensor fusion method, which pairs cameras with more expensive hardware such as LiDAR. Thrun, who co-founded the Google Self-Driving Car project in 2010, believes the lower-cost vision approach will create a massive advantage over the more expensive systems.

The most significant advantage of the pure vision solution comes from a massive difference in hardware costs. Recent pricing places automotive LiDAR in the $200–$500 range at volume, while automotive-grade camera modules typically cost tens to hundreds of dollars, depending on specs and qualification.
For highly automated cars, a single vehicle may need several LiDAR units, potentially adding $2,000 or more to the final price. This cost gap makes the camera-only approach far better suited to mass production.
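The arithmetic behind that gap can be sketched in a few lines. The unit prices below are assumed mid-range values drawn from the ranges quoted above, not actual supplier pricing:

```python
# Illustrative bill-of-materials comparison; the unit costs are
# assumed mid-range figures, and real pricing varies by supplier.
LIDAR_UNIT_COST = 400      # assumed automotive LiDAR price, USD
CAMERA_MODULE_COST = 60    # assumed automotive camera module price, USD

def sensor_suite_cost(num_lidar: int, num_cameras: int) -> int:
    """Total sensor hardware cost for one vehicle."""
    return num_lidar * LIDAR_UNIT_COST + num_cameras * CAMERA_MODULE_COST

fusion_cost = sensor_suite_cost(num_lidar=5, num_cameras=8)  # multi-sensor stack
vision_cost = sensor_suite_cost(num_lidar=0, num_cameras=8)  # camera-only stack
print(fusion_cost - vision_cost)  # LiDAR adds $2,000 in this scenario
```

With five mid-priced LiDAR units, the sensor bill grows by exactly the $2,000 figure cited above, before counting integration and compute costs.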

Pure vision systems, such as Tesla’s Full Self-Driving, capture the scene with eight cameras and convert that imagery into a Bird’s-Eye View (BEV), a top-down representation of the environment in three dimensions.
At Tesla’s 2019 Autonomy Day, the company claimed its FSD computer could process 2,100–2,300 frames per second of camera imagery. The open challenge is whether software alone can handle complex situations, such as poor lighting and extreme weather, without additional sensors.
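One classical building block behind camera-to-BEV conversion is flat-ground inverse perspective mapping, sketched below. The intrinsics and camera height are invented illustrative values, not Tesla’s calibration, and production systems learn BEV features with neural networks rather than this simple geometry:

```python
import numpy as np

# Minimal flat-ground inverse perspective mapping sketch. All numbers
# (intrinsics, camera height) are made-up illustrative values.
K = np.array([[1000.0,    0.0, 640.0],   # fx, skew, cx
              [   0.0, 1000.0, 360.0],   # fy, cy
              [   0.0,    0.0,   1.0]])
CAMERA_HEIGHT = 1.5  # metres above the road surface (assumed)

def pixel_to_ground(u: float, v: float) -> tuple[float, float]:
    """Intersect a pixel's viewing ray with the flat ground plane.

    Camera axes: x-right, y-down, z-forward; the ground sits at
    y = CAMERA_HEIGHT. Returns (lateral_x, forward_z) in metres.
    """
    ray = np.linalg.inv(K) @ np.array([u, v, 1.0])
    if ray[1] <= 0:
        raise ValueError("pixel is at or above the horizon")
    t = CAMERA_HEIGHT / ray[1]   # scale factor to reach the ground
    point = t * ray
    return float(point[0]), float(point[2])

x, z = pixel_to_ground(640.0, 560.0)  # a pixel below the image centre
# This point lies straight ahead: x = 0, z = 1.5 * fy / (v - cy) = 7.5 m
```

Binning such ground points into a regular grid yields a crude BEV map; the appeal of the learned approach is that it handles non-flat roads and occlusions that break this geometric shortcut.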

For comparison, LiDAR uses laser light to create precise, three-dimensional maps of the environment. Its detection range is typically limited to between 200 and 300 meters.
Radar, which uses radio waves, has a much longer detection range, often spanning several kilometers. A key advantage of radar is its ability to see in all weather conditions. However, LiDAR provides superior resolution for detailed environmental mapping.
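Both LiDAR and radar measure distance the same basic way: a pulse travels to the target and back at the speed of light, so range is half the round trip. A back-of-envelope sketch:

```python
# Time-of-flight range calculation: range = c * t / 2, since the
# pulse covers the distance to the target twice (out and back).
C = 299_792_458  # speed of light in a vacuum, m/s

def range_from_echo(round_trip_seconds: float) -> float:
    """Distance to a target given the pulse's round-trip time."""
    return C * round_trip_seconds / 2

# An echo arriving after 2 microseconds puts the target just under
# 300 m away, at the upper end of typical automotive LiDAR range.
d = range_from_echo(2e-6)
```

The microsecond-scale timing this implies is one reason LiDAR receivers are expensive: resolving centimetres requires timing precision on the order of a hundred picoseconds.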

This sensor technology debate affects a large and growing industry. The global market for autonomous driving software is projected to be valued at $2.30 billion by 2025.
It is expected to reach $7.24 billion by 2034, representing a compound annual growth rate (CAGR) of 13.62% during that period. North America was the leading region in 2024, accounting for the highest market share of 39%. The Level 3 automation segment is expected to see the fastest growth rate in the near future.
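The quoted growth rate can be sanity-checked from the endpoints: CAGR over n years is (end / start) raised to 1/n, minus one. Applying that to the 2025 and 2034 software-market figures reproduces roughly the stated 13.6%:

```python
# Compound annual growth rate from start/end values and a year span.
def cagr(start: float, end: float, years: int) -> float:
    return (end / start) ** (1 / years) - 1

rate = cagr(start=2.30, end=7.24, years=9)  # 2025 -> 2034
print(f"{rate:.1%}")  # ~13.6%, matching the figure quoted above
```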

Thrun helped pioneer the underlying technology, beginning with the DARPA Grand Challenge. In 2005, his Stanford Racing Team won the Challenge with a robotic vehicle named Stanley.
Thrun later served as a Vice President at Google and co-founded Google X, the company’s research and development facility. His early work contributed to the Google Self-Driving Car project, which eventually evolved into Waymo.

Despite the prediction, pure vision systems face serious problems. Adverse weather conditions create difficulties for camera-only systems. Heavy rain, snow, and dense fog drastically reduce a camera’s ability to see and identify objects.
This problem is similar to how a human driver struggles to see in poor weather. LiDAR sensors are also affected by adverse weather conditions, with their detection performance decreasing in environments such as heavy rain or thick fog.

To overcome these challenges, most leading autonomous vehicle companies use sensor fusion. This approach combines multiple sensor types, including cameras, LiDAR, and radar. Radar is mostly weather-independent, meaning it can detect objects through heavy rain, snow, and fog.
However, radar struggles to see smaller objects or identify the exact shapes of pedestrians. The combined strength of all three sensors creates a significantly safer and more reliable system for all driving conditions.
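A toy version of this fusion idea is inverse-variance weighting: each sensor reports a range estimate with an uncertainty, and less certain sensors are automatically down-weighted. The variances below are invented to reflect the qualitative trade-offs described above, not real sensor specs:

```python
# Toy inverse-variance fusion of range estimates from several sensors.
# Low-variance (more certain) sensors dominate the fused result.
def fuse(estimates: list[tuple[float, float]]) -> float:
    """Fuse (range_m, variance) pairs into one weighted estimate."""
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    return sum(w * r for (r, _), w in zip(estimates, weights)) / total

# (range, variance) for camera, LiDAR, radar -- invented numbers.
clear_day = [(50.2, 0.5), (50.0, 0.1), (49.5, 2.0)]
foggy_day = [(48.0, 25.0), (50.1, 4.0), (49.5, 2.0)]  # camera degraded

print(fuse(clear_day))  # dominated by LiDAR's precise reading
print(fuse(foggy_day))  # leans on radar, largely ignores the camera
```

Real systems use far richer machinery (Kalman filters, learned fusion networks), but the principle is the same: the system degrades gracefully when any one sensor does.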

Companies like Waymo exemplify this rival strategy to the camera-only approach. Waymo’s self-driving cars use a combination of LiDAR, cameras, and radar sensors, giving the vehicle a 360-degree view of its environment.
Waymo is significantly expanding its operational footprint across the US. Waymo has begun rider-only operations in Miami and plans to expand to Dallas, Houston, San Antonio, and Orlando, with paid service expected to be introduced in 2026.

The safety level of these systems is key. Autonomous driving systems are defined by the SAE’s six levels of automation. Level 2 is Partial Driving Automation, where the human driver must stay in control and be ready to take over at any moment.
The crucial difference is at Level 4, High Driving Automation. In a Level 4 car, the system is fully responsible for all driving tasks within a specific area, and the human is not required to intervene.
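The six SAE levels and the supervision boundary described above can be summarized as a small enum. The names and helper here are illustrative shorthand, not an official API:

```python
from enum import IntEnum

# SAE J3016 levels as an enum; the helper encodes the key boundary:
# at Levels 0-2 the human must monitor the road at all times.
class SAELevel(IntEnum):
    NO_AUTOMATION = 0
    DRIVER_ASSISTANCE = 1
    PARTIAL_AUTOMATION = 2      # e.g. lane keeping + adaptive cruise
    CONDITIONAL_AUTOMATION = 3  # system drives; driver on standby
    HIGH_AUTOMATION = 4         # no driver needed within a geofence
    FULL_AUTOMATION = 5         # no driver needed anywhere

def human_must_supervise(level: SAELevel) -> bool:
    """True for Levels 0-2, where the human remains responsible."""
    return level <= SAELevel.PARTIAL_AUTOMATION

print(human_must_supervise(SAELevel.PARTIAL_AUTOMATION))  # True
print(human_must_supervise(SAELevel.HIGH_AUTOMATION))     # False
```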

Supporting this development is the rapid growth of the autonomous vehicle market. The global market size was valued at $2 trillion in 2023. This market is projected to reach $4.76 trillion by 2030, exhibiting a compound annual growth rate (CAGR) of 13.2% during that period.
Asia-Pacific currently holds the largest market share, accounting for 50.44% of the industry in 2022. Ongoing technological advancements in areas like sensor efficiency drive this expansion.
Want to see whether drivers and cities are genuinely prepared for self-driving electric cars? Learn more in “Are We Ready for Autonomous EVs?”

Finally, all autonomous systems rely on advanced software that is continuously improved through Over-the-Air (OTA) updates. These updates are delivered wirelessly to the vehicle, allowing manufacturers to remotely fix software-related recalls and bugs.
These remote fixes are expected to save US automakers approximately $1.5 billion annually by 2028. This capability is crucial for enhancing driver assistance systems and ensuring vehicles stay compliant with regulations over time.
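At its simplest, the eligibility check an OTA pipeline runs before pushing a fix is a version comparison. The version scheme and function names below are invented for illustration, not any manufacturer’s real API:

```python
# Minimal OTA eligibility sketch: compare the installed firmware
# version against the latest release. Versioning scheme is invented.
def parse_version(v: str) -> tuple[int, ...]:
    """Turn '2024.12.1' into (2024, 12, 1) for numeric comparison."""
    return tuple(int(part) for part in v.split("."))

def needs_update(installed: str, latest: str) -> bool:
    """True when the vehicle's firmware is older than the release."""
    return parse_version(installed) < parse_version(latest)

print(needs_update("2024.8.3", "2024.12.1"))   # True: fix pending
print(needs_update("2024.12.1", "2024.12.1"))  # False: up to date
```

Parsing to integer tuples matters: comparing the raw strings would wrongly rank "2024.8" above "2024.12".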
Do you think camera-only systems can truly beat LiDAR? Share your thoughts below.
Curious how each automaker stacks up in self-driving tech? Compare autonomous driving features across leading brands.
This slideshow was made with AI assistance and human editing.
