Waymo car in the light fog and rain of San Francisco
Robocars use three primary sensors: cameras, LIDAR and radar. The last began as an existing automotive radar that isn't discussed much because it's pretty simple. While Tesla TSLA has declared that they take pride in having removed their radars (though probably fibbing a bit), pretty much everyone else relies on radar because it's much cheaper than LIDAR and sees the world in a completely different, superhuman way.
Waymo recently reported on the new radar in its 5th generation vehicle. Today they released a few more details. Several startups have also developed imaging radar, and internal work continues at other robocar teams. There were even rumors that Tesla was using ARBE's radars before they did their U-turn on radar.
Conventional automotive radar has very low resolution. Horizontally, it is lucky if it can tell which lane a radar target is in. Vertically, it often can't distinguish a car on the road in front of you from a bridge over the road. It sees a world of very fuzzy blobs, but one with several worthwhile advantages:
- You get a good range measurement (as with LIDAR) for any target, though there can be confusion from multipath returns bouncing off other things in the world.
- You also get the speed of each target, which is hugely valuable and otherwise only provided by new Doppler LIDARs.
- Because radar works with radio waves, things like dust and fog are transparent to it, while they greatly reduce the range of light-based sensors.
- It works at very long distances.
- While multipath returns (bounces) can be a curse, they are also a blessing, as you can spot targets that are blocked from your view, such as the car ahead of the one in front of you.
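To put a number on the speed advantage above: a radar's Doppler shift maps directly to closing speed via v = f_d·c/(2·f_0). A minimal Python sketch, assuming an illustrative 77 GHz automotive carrier (the common band, but an assumption here, not a detail from Waymo):

```python
# Radial (closing) speed from a Doppler shift: v = f_d * c / (2 * f0).
# The 77 GHz carrier is an illustrative automotive-band assumption.
C = 299_792_458.0  # speed of light, m/s

def radial_speed(doppler_shift_hz: float, carrier_hz: float = 77e9) -> float:
    """Closing speed (m/s) implied by a measured Doppler shift."""
    return doppler_shift_hz * C / (2 * carrier_hz)

# A car closing at about 30 m/s shifts a 77 GHz carrier by roughly 15.4 kHz.
print(f"{radial_speed(15_400):.1f} m/s")  # ~30.0 m/s
```

Each measured frequency shift thus gives speed directly, with no need to track a target across frames the way a camera must.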
Waymo radar image on a foggy day
There are things not to love: the poor resolution, low reflectivity from pedestrians, noise and multipath returns, and costs higher than cameras. In fact, the resolution is so poor that radar has mainly been used to detect moving targets. Because radar tells you how fast a target is moving toward or away from you, those targets stand out from everything stationary in the world. You do get reflections from stationary objects (like a stopped car in front of you), but it's hard to reliably distinguish them from all the other stationary things in the scene, like the road, fences and signs. Earlier radar users simply had to ignore returns from stationary objects, which is why radar-equipped Teslas were seen plowing into the sides of trucks crossing the road and into stopped emergency vehicles in their lane.
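The "ignore stationary returns" trick can be sketched in a few lines. A stationary object's closing speed is fully predicted by your own motion, so anything matching that prediction gets filtered, including, unfortunately, a stopped car. This is a toy sketch of the idea, not anyone's production filter:

```python
import math

def is_moving(radial_speed: float, azimuth_deg: float,
              ego_speed: float, tol: float = 0.5) -> bool:
    """A stationary object dead ahead closes at the ego speed; one off
    to the side closes at ego_speed * cos(azimuth). A return that
    deviates from that prediction is itself moving."""
    expected = ego_speed * math.cos(math.radians(azimuth_deg))
    return abs(radial_speed - expected) > tol

# Ego car at 20 m/s: a stopped car dead ahead closes at 20 m/s,
# exactly like a road sign would, so this filter discards it too.
print(is_moving(20.0, 0.0, 20.0))  # False: looks like stationary clutter
print(is_moving(35.0, 0.0, 20.0))  # True: oncoming traffic stands out
```

The first result is precisely the failure mode described above: the stopped car is indistinguishable from harmless roadside clutter.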
One answer is to greatly increase the resolution, which is what imaging radar does. While classic radars might manage a resolution of 5 degrees (and worse vertically), imaging radar aims for 1 degree, and some claim as little as 0.5 degrees. However, the resolution doesn't behave quite like light: even if an imaging radar can tell you where a single target is to within 1 degree, that doesn't mean it can actually separate two targets that are one degree apart.
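What those degree figures mean on the road is easy to work out: the lateral separation two targets need before a beam of a given width can split them grows with range. A quick sketch (the 100 m range is just a convenient example distance):

```python
import math

def cross_range(range_m: float, beamwidth_deg: float) -> float:
    """Smallest lateral separation two targets need at a given range
    before a radar with the given angular resolution can split them."""
    return 2 * range_m * math.tan(math.radians(beamwidth_deg) / 2)

for res in (5.0, 1.0, 0.5):
    print(f"{res:3.1f} deg at 100 m -> {cross_range(100, res):.2f} m apart")
```

At 100 m, a 5-degree radar blurs everything within roughly 8.7 m into one blob, more than two lane widths, while a 1-degree imaging radar narrows that to about 1.7 m.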
The result, however, is that the radar produces something like the "point cloud" of a LIDAR. People like to call this 4D radar, because for each point you learn not just two angles but also distance and speed. Some even call it 5D, because by looking at how things change over time you can learn still more.
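The four D's of such a point can be captured in a small data structure. This is a hypothetical layout for illustration (field names and the axis convention are my assumptions, not any vendor's format):

```python
import math
from dataclasses import dataclass

@dataclass
class RadarPoint:
    """One point of an imaging-radar 'point cloud': two angles, range,
    and radial (Doppler) speed -- the four D's of '4D radar'."""
    azimuth_deg: float
    elevation_deg: float
    range_m: float
    radial_speed_mps: float  # negative = closing, by this sketch's convention

    def to_xyz(self) -> tuple[float, float, float]:
        """Convert to Cartesian: x forward, y left, z up (assumed frame)."""
        az = math.radians(self.azimuth_deg)
        el = math.radians(self.elevation_deg)
        x = self.range_m * math.cos(el) * math.cos(az)
        y = self.range_m * math.cos(el) * math.sin(az)
        z = self.range_m * math.sin(el)
        return x, y, z

p = RadarPoint(azimuth_deg=10, elevation_deg=2, range_m=50, radial_speed_mps=-12.0)
print(p.to_xyz())
```

Unlike a LIDAR point, each of these carries its own velocity, which is what makes the cloud so much richer than its modest density suggests.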
Waymo won't say much about their new radar, other than that they reviewed all the commercial offerings and found none could keep up with what they could do in-house. Many companies, including ARBE, Aptiv APTV, Bosch, Conti, Infineon, Magna, Metawave, Lunewave, ZF, Vayyar, Oculii, SRS and others, develop or even sell imaging radars. Google GOOGL and its cousin Waymo may have their origins as a software company, but they build hardware when the world won't sell it to them, which is why there is also custom LIDAR on Waymo cars, though the vehicles themselves are modified standard OEM cars. Waymo has been building its own sensors longer than anyone else in the field.
Waymo also has a very good LIDAR with very long range and high resolution. Its cameras and AI can also see things from a great distance. Some companies have LIDARs that report the speed of targets just as radar does. So is radar a must?
Waymo's philosophy is to use all available data to improve safety and functionality. Cost is not the same problem for a robotaxi, so they spend the money and effort up front to get to safety faster; the more you know about the world, the better. Waymo would have wanted this additional data in Arizona, but it is especially worthwhile in their new San Francisco operating area, famous for its fog. In SF it is easy to get caught in fog that blurs your view, and imaging radar handles that better. Nobody's radar is good enough to drive in pea-soup fog with no vision or LIDAR at all (although some hope it may one day be possible), but it can make driving in worse weather possible. Radar works fine in snow but can be degraded by very heavy rain, although often not as much as vision is.
This matters for a robotaxi. You don't want a robotaxi service that shuts down when the weather changes, unless it's something severe enough to block most of the roads. Private robocars get a pass here: they can announce that self-driving is disabled, and owners can still drive them themselves if they wish.
Waymo says the resolution of their imaging radar will help them better distinguish targets on the road. Radar targets vary widely in intensity: things like flat metal signs and license plates are very bright, while animals and people can be very dark. The signal from a bright reflector can drown out the return from someone nearby, which can be up to 60 dB dimmer, unless you can resolve the two at a fine angle.
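It is worth spelling out how big a gap 60 dB is. In decibels, the power ratio is 10^(dB/10), so a trivial conversion shows the pedestrian's return is about a million times weaker than the sign's:

```python
# Decibels to a linear power ratio: ratio = 10 ** (dB / 10).
def db_to_power_ratio(db: float) -> float:
    return 10 ** (db / 10)

# A flat metal sign can return ~1,000,000x the power of a nearby pedestrian.
print(f"{db_to_power_ratio(60):,.0f}")  # 1,000,000
```

With a coarse beam, both returns land in the same angular cell and the weak one is simply lost in the strong one's sidelobes; fine angular resolution puts them in separate cells.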
Five people died in this massive fog pileup in Georgia in 2002
Many date the beginning of the robocar era to the DARPA challenges, especially the November 2007 Urban Challenge. On the day of that challenge, not too far away, there was a 108-car pileup on CA Highway 99 that killed 2 people and injured many more. It was caused by thick fog and human drivers zooming into it at high speed. A car equipped with imaging radar would not have done this, as it could see through the fog as if it weren't there.
Tesla took out their radars. While most suspect the reason is related to the chip shortage, since they kept radar in their S and X models, they now claim that radar only makes self-driving harder. The argument: if you have multiple sensors, you need to fuse them. You need to figure out that the target you see on the radar and the target you see on the camera are the same one. If you make a mistake doing this, it can lead to other mistakes. If two sensors disagree, which one do you trust? It's not always easy, and it can make your system more complex than it needs to be, argues Tesla. Almost everyone else disagrees.
This isn't the first time people have dropped radar, however. Early ADAS systems were all done with radar, but those units were expensive and ignored things like stopped cars. They also often made mistakes in tracking cars. MobilEye came along with its camera-based system. It was cheaper than radar (though it would incorporate radar if you had it) and didn't make radar's mistakes. (It made its own mistakes that radar wouldn't make, however.) Because it was cheaper, many OEMs switched to it to provide features like forward collision warning and adaptive cruise control. ACC now worked with cars stopped in traffic, which is what many customers wanted.
Even so, MobilEye uses radar and LIDAR in its robocar plans, though it continues to offer single-camera driver assistance. Driver assistance doesn't have to be perfect; it's there to help the driver, not take over. When Tesla used MobilEye's product in an early Autopilot beyond its specifications, and a Tesla crashed into the broadside of a truck and killed its driver, MobilEye severed its relationship with Tesla at great cost to itself (though they ended up fine, selling to Intel INTC for $16 billion). Tesla was already working on its own system and deployed it in a hurry, losing some functionality. When Tesla pulled out the radar, they again reduced functionality.
Many companies are also applying AI techniques to radar data. This includes both removing the noise that radar returns are full of, and identifying targets from how their profiles change over time. For example, if you look at radar output as a human, cyclists are very obvious from their signatures, because their legs and pedals are constantly spinning; this stands out on radar over time, and AI tools can seek that out and identify the target. AI techniques help identify things that are difficult for humans to see, such as lane-splitting motorcycles and other specific targets on the road.
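The cyclist signature described above is a micro-Doppler effect: a rigid car body returns one tight band of speeds, while spinning legs and pedals smear the return across many. A crude stand-in for what a learned classifier picks up on, using only the spread of Doppler speeds inside one target's gate (thresholds and numbers are illustrative):

```python
import statistics

def doppler_spread(radial_speeds: list[float]) -> float:
    """Spread of Doppler speeds inside one target's range/angle gate.
    A rigid body shows a narrow spread; a cyclist's spinning legs and
    pedals smear the return across a band of speeds (micro-Doppler)."""
    return statistics.pstdev(radial_speeds)

car = [14.9, 15.0, 15.1, 15.0, 14.9]   # rigid body, one tight speed band
cyclist = [4.0, 7.5, 1.2, 6.8, 2.1]    # limbs and pedals at many speeds
print(doppler_spread(car) < 1.0 < doppler_spread(cyclist))  # True
```

A real system would learn this pattern over many frames rather than use a fixed threshold, but the underlying cue is the same one a human sees in the display.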
Radar also continues to get much cheaper. In fact, some vehicles trying to cut costs rely on radar to the sides or rear for applications like detecting cross traffic (which is never stationary), and spend less on LIDAR in those directions.
The weather may look bleak, but the future looks bright for radar on robocars.