Self driving vehicles - what role does radar play?

Self-driving cars are a hot topic right now. New advances in computer vision mean that we are closer than ever to being able to build a vehicle that can drive itself safely around our cities and roads. A lot of big-name companies such as Google, Tesla, General Motors, Ford, Nissan, Toyota, and BMW have invested heavily in this technology. While it is not yet ready for prime time, there is a big prize for the companies that can crack it.

The foundation of a driverless car is good quality data in combination with a lot of processing power and very smart algorithms. The data for self-driving cars comes from a few main sources: 3D cameras, scanning lidar, ultrasonic, and radar sensors. Each technology has its own unique pros and cons, and no technology on its own is enough to provide all the information needed for self-driving.

3D Cameras

3D cameras, also known as stereo cameras, are a special type of sensor with two cameras separated by a short distance. The principle these cameras operate on is similar to human vision. Images from each camera are captured and compared with each other, and through some impressive mathematical techniques, depth information can be extracted. 3D cameras are great because they can detect depth and shape very accurately. The images from the cameras can also be passed through neural networks to identify objects of interest, and this information combined with the depth information gives a pretty good view of the world. There are some drawbacks, however. Being cameras that work on similar principles to human vision, they have similar limitations: they can only see when there is enough light, and they don't work very well when it is raining or foggy. In addition to struggling in low light, they also struggle in bright light (e.g. when looking directly into the sun or oncoming headlights). The speed of objects can only be inferred by comparing individual frames, making speed detection both difficult and error prone.
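To make the depth-from-two-images idea concrete, here is a minimal sketch (in Python) of the standard stereo relationship: depth is the focal length multiplied by the baseline between the two lenses, divided by the disparity, i.e. how far a feature appears to shift between the left and right images. The focal length, baseline, and disparity values below are purely illustrative.

```python
# Minimal sketch: recovering depth from a stereo (3D) camera pair.
# Assumes a calibrated rig; all numbers below are illustrative.

def depth_from_disparity(disparity_px: float,
                         focal_length_px: float,
                         baseline_m: float) -> float:
    """Depth (in metres) from the pixel disparity between left/right images."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px

# Example: 700 px focal length, 12 cm baseline, 20 px disparity -> 4.2 m away
print(depth_from_disparity(20, 700, 0.12))
```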

Scanning Lidar

Lidar, short for light detection and ranging, is a technique which uses incredibly fast electronics, precision motors, and very specific pulses of light to "see" the world.

A lidar sensor works by shooting very specific light beams at an object and then measuring how long it takes for that beam of light to be reflected back to the sensor. These types of sensors are called time-of-flight sensors because they measure distance to an object by measuring how long it takes for a light beam to travel there and back again. On autonomous vehicles, lidars usually take the form of scanning lidars, which are basically a lidar sensor placed on a motor and spun around at speed. The result is a 2D map of the surroundings. Scanning lidars are great at measuring distances accurately, but there are a lot of disadvantages to them as well. A lot of precision engineering is required to make a high-resolution, robust lidar, which makes these sensors very expensive. The sensor also needs to be placed on the vehicle's roof in order to get a 360-degree view of the surroundings. Lidars also suffer from some of the same issues as 3D cameras: if there is too much ambient light, the sensors get overwhelmed and will not work, and they cannot see very well in rain or fog.
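As a rough illustration of the time-of-flight idea, and of how a spinning sensor turns individual range readings into a 2D map, here is a small Python sketch. The timing and angle values are made up; real scanning lidars deal with nanosecond-scale timing and thousands of readings per revolution.

```python
import math

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def tof_to_distance(round_trip_s: float) -> float:
    """Distance to a target from the round-trip time of a light pulse."""
    return SPEED_OF_LIGHT * round_trip_s / 2.0  # divide by 2: there and back

def scan_to_points(angles_deg, ranges_m):
    """Convert one sweep of (angle, range) readings into 2D x/y points."""
    return [(r * math.cos(math.radians(a)), r * math.sin(math.radians(a)))
            for a, r in zip(angles_deg, ranges_m)]

# A pulse returning after ~66.7 ns corresponds to a target roughly 10 m away
print(tof_to_distance(66.7e-9))
# Three readings from a single sweep, turned into points around the sensor
print(scan_to_points([0, 90, 180], [10.0, 5.0, 2.5]))
```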

Not all self-driving cars rely on lidar. Elon Musk of Tesla has famously said, "Lidar is a fool's errand, anyone relying on lidar is doomed. Doomed! [They are] expensive sensors that are unnecessary."

Ultrasonic

Ultrasonic sensors are used for short-range detection (up to about 5 m). They don't provide high-resolution information, but they work in a wider variety of environmental conditions such as rain, snow, or fog. Ultrasonic has its place when it comes to parking assistance or detecting nearby obstacles, but is otherwise quite limited in its ability to provide the information needed to enable autonomous driving.
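Ultrasonic sensors are time-of-flight sensors too, just using sound instead of light, which is part of why their range is so short. A minimal sketch, assuming sound travels at roughly 343 m/s in air:

```python
SPEED_OF_SOUND = 343.0  # metres per second in air at about 20 °C

def echo_to_distance(round_trip_s: float) -> float:
    """Distance to an obstacle from an ultrasonic echo's round-trip time."""
    return SPEED_OF_SOUND * round_trip_s / 2.0

# A ~29 ms round trip is close to the ~5 m practical limit mentioned above
print(echo_to_distance(0.029))
```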

Radar

This brings us to radar. Radar has been used in the automotive industry for over 20 years, and of all the technologies used to provide information for self-driving cars, it is one of the most mature. Radar is used for automatic emergency braking, blind-spot detection, lane-change assist, vehicle-exit assist, and pre-crash warning systems. In addition to this, radar is a key ingredient in adaptive cruise control. Adaptive cruise control is widely regarded as a key component of future generations of intelligent cars because it significantly improves driver safety and convenience, and increases road capacity by maintaining optimal separation between vehicles and reducing driver errors.
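To show the kind of decision adaptive cruise control makes with radar data, here is a highly simplified Python sketch: it commands an acceleration that tries to hold a time gap behind the vehicle ahead, using the range and closing speed a forward radar would report. The gains and the 1.8 s time gap are illustrative values, not taken from any production system.

```python
# Highly simplified adaptive-cruise-control logic (illustrative gains only).

def acc_acceleration(gap_m: float, closing_speed_ms: float, own_speed_ms: float,
                     time_gap_s: float = 1.8,
                     k_gap: float = 0.2, k_speed: float = 0.6) -> float:
    """Commanded acceleration in m/s^2 (negative means brake)."""
    desired_gap_m = time_gap_s * own_speed_ms   # keep ~1.8 s behind the lead car
    gap_error_m = gap_m - desired_gap_m         # positive if we are too far back
    return k_gap * gap_error_m - k_speed * closing_speed_ms

# Following at 25 m/s with a 40 m gap while closing at 2 m/s -> brake moderately
print(acc_acceleration(gap_m=40.0, closing_speed_ms=2.0, own_speed_ms=25.0))
```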

While radar sensors don't have great resolution, they are very good at sensing long distances and measuring speed. They are ideal for detecting obstacles such as other cars, people, or cyclists.
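The reason radar is so good at measuring speed is that it can read it almost directly from the Doppler shift of the reflected signal, rather than inferring it frame by frame. A minimal sketch of that relationship, assuming a 77 GHz carrier (a common automotive radar band):

```python
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def radial_speed_from_doppler(doppler_shift_hz: float, carrier_hz: float) -> float:
    """Speed of a target towards/away from the radar, from the Doppler shift."""
    return doppler_shift_hz * SPEED_OF_LIGHT / (2.0 * carrier_hz)

# A ~5.1 kHz shift on a 77 GHz automotive radar is roughly 10 m/s (about 36 km/h)
print(radial_speed_from_doppler(5.1e3, 77e9))
```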

Strength lies in differences

Modern self-driving cars use a combination of sensors to get the quality information they need to make decisions. No one sensor can do it alone. As an example, Tesla cars are equipped with eight cameras (covering a complete 360° around the car), one forward-facing radar, and twelve ultrasonic sensors (also covering a complete 360°).


(Image source: Tesla)
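One simple way to picture how fusing sensors pays off: if two sensors report the same quantity with different noise levels, weighting each reading by how much it is trusted gives an estimate better than either alone. The Python sketch below fuses a radar range with a (noisier) camera range; the variance figures are invented for illustration, and real systems use far more sophisticated fusion techniques such as Kalman filters.

```python
# Minimal sketch of sensor fusion: inverse-variance weighted average of two
# range estimates of the same target. Variances are invented for illustration.

def fuse_estimates(value_a: float, var_a: float,
                   value_b: float, var_b: float):
    """Fuse two measurements of the same quantity; returns (value, variance)."""
    w_a, w_b = 1.0 / var_a, 1.0 / var_b
    fused_value = (w_a * value_a + w_b * value_b) / (w_a + w_b)
    fused_variance = 1.0 / (w_a + w_b)
    return fused_value, fused_variance

# Radar says 42.0 m (low noise), camera says 45.0 m (noisier) -> ~42.6 m
print(fuse_estimates(42.0, 0.5, 45.0, 2.0))
```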

Radar technology is tried and tested in the automotive industry. What benefits could radar bring to your projects? Have you considered integrating a radar sensor?

References

https://en.wikipedia.org/wiki/Tesla_Autopilot
https://www.wired.com/story/the-know-it-alls-how-do-self-driving-cars-see/
https://www.electronicdesign.com/markets/automotive/article/21806443/how-will-radar-sensor-technology-shape-cars-of-the-future
