
What Sensors Do Autonomous Cars Need?

Many autonomous car developers use multiple sensors in their test cars: optical sensors, radar, LiDAR, and ultrasonic sensors. Each sensor has advantages over the others. Optical sensors can detect lane markings and changing traffic lights. Radar signals can travel under vehicles to detect objects that would otherwise be obscured by other cars, and radar performs well in rain and complete darkness. LiDAR can detect the presence of objects more accurately than other sensors, producing a 3D map with a margin of error of only a few centimeters. Ultrasonic sensors are already in use for automatic parking.

Autonomous car developers differ on which sensors to use and how to use them. Waymo, formerly Google’s self-driving car project, relies heavily on LiDAR to create 3D maps; Waymo’s cars also use other sensors to detect objects and place them into the map in order to navigate the environment. Tesla relies heavily on radar to detect its cars’ surroundings, and the company uses data from other vehicles to correct for areas in which radar would deliver misleading results. AImotive, previously ADASworks, is a software company that believes all automated driving can be performed with optical sensors alone; the company plans to use radar, ultrasonic sensors, and (when hardware costs come down) LiDAR merely as redundancies.

Every autonomous car developer needs optical sensors to detect lane markings and traffic lights; AImotive differs in believing that, as long as nothing goes wrong with them, optical sensors are the only sensors an autonomous car needs. After all, human drivers do not need LiDAR, radar, or ultrasonic sensors. Just as drivers use only their eyes, shouldn’t camera sensors, paired with sufficient software intelligence, be all that is required to drive a car? Just as humans only need a 2D map, shouldn’t autonomous cars be able to navigate without a 3D map? AImotive’s answer to both questions is yes. AImotive’s software uses 12 cameras around the vehicle, stitching their images together into a 360-degree view of the vehicle’s surroundings, and can detect each object’s angle, distance, class, and size. The software also detects these objects’ motion and predicts their future paths. Using a 2D map, it then calculates a drivable path that avoids obstacles. According to AImotive, its software enables full autonomy in all weather and driving conditions, even outside of geofenced areas.
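To make that pipeline concrete, here is a minimal, hypothetical Python sketch of the kind of camera-only loop described above. It assumes detections (angle, distance, class, size, and motion) have already been produced by per-camera object detectors, and it shows only the motion-prediction and 2D path-checking steps. All class and function names are invented for illustration and do not describe AImotive’s actual software.

```python
# Hypothetical camera-only sketch: predict obstacle motion in the 2D ground
# plane and check a candidate path against those predictions.
import math
from dataclasses import dataclass


@dataclass
class Detection:
    object_class: str     # e.g. "car" or "pedestrian"
    angle_deg: float      # bearing from the vehicle's heading
    distance_m: float     # range from the vehicle
    size_m: float         # rough object radius
    velocity_mps: tuple   # (vx, vy) motion in the vehicle's ground frame


def to_xy(det: Detection) -> tuple:
    """Convert a bearing/range detection to 2D ground-plane coordinates."""
    rad = math.radians(det.angle_deg)
    return (det.distance_m * math.cos(rad), det.distance_m * math.sin(rad))


def predict_position(det: Detection, dt: float) -> tuple:
    """Extrapolate an object's position dt seconds ahead (constant velocity)."""
    x, y = to_xy(det)
    vx, vy = det.velocity_mps
    return (x + vx * dt, y + vy * dt)


def path_is_clear(waypoints, detections, horizon_s=3.0, margin_m=1.5):
    """Check a candidate path (one waypoint per time step) against the
    predicted obstacle positions over a short horizon."""
    steps = len(waypoints) - 1
    for i, wp in enumerate(waypoints):
        t = horizon_s * i / steps
        for det in detections:
            ox, oy = predict_position(det, t)
            if math.hypot(wp[0] - ox, wp[1] - oy) < det.size_m + margin_m:
                return False
    return True


# Example: one car detected 20 m ahead, slowly drifting toward our lane.
detections = [Detection("car", angle_deg=0.0, distance_m=20.0,
                        size_m=2.0, velocity_mps=(-1.0, 0.5))]
straight_path = [(2.0 * i, 0.0) for i in range(11)]  # waypoints up to 20 m ahead
print("straight path clear:", path_is_clear(straight_path, detections))
```

In this toy example the detected car drifts toward the planned lane, so the straight path is reported as blocked and the planner would need to choose another path.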

What about the deficiencies of optical sensors? Optical sensors cannot detect obstacles in complete darkness, but most cars’ surroundings have enough lighting for them to detect objects at night. Optical sensors cannot detect obstacles two cars ahead, but neither can human drivers. Optical sensors face a number of technical problems, like the “white wall problem,” but AImotive claims that its software’s intelligence has solved them. The biggest problem for optical sensors is that some lighting and weather conditions, like direct sunlight, flood the sensors. Perhaps solving that problem requires additional hardware, such as a shade or an automatic sunglass lens that filters the light when the sensor is flooded, but more likely this is when other sensors can take over.
Why, then, do we need sensors besides optical sensors? Two answers: first, redundancy, so that other sensors can take over when optical sensors malfunction or are defeated by adverse conditions; and second, to acquire more data. The more data cars have about their environment, the better they can navigate, especially while they are still learning.
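As a rough illustration of the redundancy point, the sketch below fuses distance estimates from several sensors, each tagged with a confidence score, and simply ignores the camera when its reading is degraded (for example, flooded by direct sunlight). The confidence-weighted scheme and all names are assumptions for illustration, not any vendor’s actual algorithm.

```python
# Hypothetical sensor-redundancy sketch: fall back to radar/ultrasonic
# estimates when the optical reading is unusable.
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class SensorReading:
    source: str        # "camera", "radar", "lidar", or "ultrasonic"
    distance_m: float  # estimated distance to the nearest obstacle ahead
    confidence: float  # 0.0 (unusable) to 1.0 (fully trusted)


def fuse_distance(readings: List[SensorReading],
                  min_confidence: float = 0.2) -> Optional[float]:
    """Confidence-weighted average of usable readings; None if all are unusable."""
    usable = [r for r in readings if r.confidence >= min_confidence]
    if not usable:
        return None
    total_weight = sum(r.confidence for r in usable)
    return sum(r.distance_m * r.confidence for r in usable) / total_weight


# Example: the camera is flooded by sunlight, so radar and ultrasonic take over.
readings = [
    SensorReading("camera", distance_m=0.0, confidence=0.05),
    SensorReading("radar", distance_m=18.5, confidence=0.9),
    SensorReading("ultrasonic", distance_m=19.0, confidence=0.4),
]
print("fused distance (m):", round(fuse_distance(readings), 1))
```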

To answer the question in the title: autonomous cars will always need optical sensors. As for radar, LiDAR, and ultrasonic sensors, some of them will always be needed for redundancy, and for now, they are needed to collect more data.

For more information on autonomous cars and in-car experiences, see Connected Cars: Balancing a Rich Driving Experience with Safety, and look for an upcoming whitepaper and autonomous car industry report from Parks Associates.
