
Hi-Resolution 4D Imaging Radar is emerging as complementary sensor to LIDAR for fully autonomous cars

Although the hardware and software technology for fully autonomous cars is ready today, the cars themselves may still be a decade or more away, vehicle experts said at the recent North American International Auto Show in Detroit. The challenge for automakers and suppliers lies in test and validation, as well as in educating consumers and marketers. The problems of testing and validation are particularly vexing in the development of SAE Level 5 vehicles – that is, vehicles that must be able to handle every situation, all the time, without the aid of a driver. “We certainly have the technology today for a (SAE) Level 5 prototype vehicle,” Kay Stepper, vice president of automated driving and driver assistance for Robert Bosch LLC, told Design News. “We have the sensors, ECUs, actuators, and the software stack. But next comes the challenge of testing and validation of a true Level 5 system. That’s a huge undertaking.”

 

One of the prime sensors in autonomous vehicles is LiDAR, which sends out short pulses of invisible laser light and measures the time the pulses take to return to the sensor. From this, both the distance to the target and its reflected intensity can be measured with excellent accuracy, and the results can be used to construct a 3-D map of the vehicle’s surroundings. LiDAR technology can identify the contours and contrasts of obstacles that normal cameras are unable to detect, particularly in low-light and low-contrast situations.
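As a rough illustration of the time-of-flight principle, the snippet below (a minimal sketch, not tied to any particular LiDAR’s API) converts a pulse’s measured round-trip time into a distance; the factor of one half accounts for the out-and-back path.

```python
# Minimal time-of-flight sketch; not tied to any specific LiDAR API.
SPEED_OF_LIGHT_M_S = 299_792_458  # metres per second

def distance_from_round_trip(round_trip_time_s: float) -> float:
    """One-way distance to the target from a pulse's round-trip time.

    The pulse travels out and back, so the distance is half the total
    path covered at the speed of light.
    """
    return 0.5 * SPEED_OF_LIGHT_M_S * round_trip_time_s

# Example: a return arriving 500 nanoseconds after emission sits ~75 m away.
print(f"{distance_from_round_trip(500e-9):.1f} m")
```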

 

After the tragic May 2016 accident in which a Tesla Model S crashed into a trailer while Autopilot was engaged, Tesla speculated that a possible cause was the difficulty the car’s cameras had in identifying the white trailer against the bright Florida sky. Elon Musk had previously dismissed the need for LiDAR (Light Detection And Ranging), suggesting the technology “didn’t make sense” in the context of a car. He defended Tesla’s strategy of achieving “full autonomy” using only cameras, radar, and ultrasonic sensors.

 

The 4D imaging radar is ideal for the automotive industry. It provides a highly detailed image of the environment in a wide field of view. This means it can detect obstacles on the side of the road. It can also detect smaller targets, such as a person or a bike, even if they are somewhat masked by a large object, such as a tree or truck. The imaging radar can determine whether they are moving, in which direction, and provide the vehicle with real-time situational data and alerts.
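To make the “4D” concrete, the sketch below (illustrative only; the class, field names, and threshold are assumptions, not any vendor’s data format) models a single detection as range, azimuth, elevation, and Doppler velocity, and uses the sign of the Doppler component to tell whether a target is approaching or receding.

```python
from dataclasses import dataclass

@dataclass
class RadarDetection:
    """One hypothetical 4D imaging-radar detection: three spatial dimensions plus Doppler."""
    range_m: float               # distance to the target
    azimuth_deg: float           # horizontal angle relative to boresight
    elevation_deg: float         # vertical angle relative to boresight
    radial_velocity_mps: float   # Doppler velocity; negative means closing on the sensor

def describe_motion(det: RadarDetection, threshold_mps: float = 0.5) -> str:
    """Coarse motion label derived from the Doppler component (illustrative threshold)."""
    if det.radial_velocity_mps < -threshold_mps:
        return "approaching"
    if det.radial_velocity_mps > threshold_mps:
        return "receding"
    return "radially stationary"

cyclist = RadarDetection(range_m=42.0, azimuth_deg=-12.0, elevation_deg=1.5,
                         radial_velocity_mps=-3.2)
print(describe_motion(cyclist))  # approaching
```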

 

The autonomous driving industry today is still at a proof-of-concept phase. It relies on sensors that may not operate 100 percent of the time. High-resolution imaging radar is the only sensor that always performs at required levels. It also dramatically reduces processing power and server needs. High-quality radar post processing would resolve the current prototypes’ main problem – power consumption – by pointing camera and LiDAR only at areas of interest.
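One way to picture this cueing idea is a simple gating step: the radar watches the full field of view continuously, and the heavier camera and LiDAR processing is requested only for regions where the radar reports something of interest. The sketch below is schematic, with assumed thresholds; it is not a description of any production perception stack.

```python
# Schematic radar-cued region-of-interest (ROI) gating; all thresholds are assumed.
from typing import List, NamedTuple, Tuple

class Detection(NamedTuple):
    range_m: float
    azimuth_deg: float
    radial_velocity_mps: float   # negative means closing on the sensor

def regions_of_interest(detections: List[Detection],
                        near_range_m: float = 150.0,
                        min_speed_mps: float = 1.0) -> List[Tuple[float, float]]:
    """Pick (azimuth, range) regions worth handing to the camera/LiDAR pipelines.

    Only nearby or moving radar returns are forwarded, so the heavier optical
    processing runs on a fraction of the scene instead of on every full frame.
    """
    return [(d.azimuth_deg, d.range_m)
            for d in detections
            if d.range_m <= near_range_m or abs(d.radial_velocity_mps) >= min_speed_mps]

frame = [Detection(40.0, -10.0, -2.5), Detection(280.0, 5.0, 0.0)]
print(regions_of_interest(frame))  # only the first detection is forwarded
```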

 

Last but not least, the mass-production cost of the autonomous sensor suite will need to come in below $1,000. Some of today’s test vehicles use components and systems costing a hundred times that price. Since imaging radar can support Level 3 and higher with no more than one LiDAR unit per vehicle for redundancy, or possibly no LiDAR at all, it can help manufacturers reach their cost-reduction targets.

 

According to German supplier ZF Friedrichshafen’s CEO, Stefan Sommer, self-driving cars require multiple detection systems, including expensive LiDAR technology, if they are to be safe at high speeds. “For autonomous driving, we will need three core technologies: picture processing camera technology, short and long-range radar, and LiDAR,” Sommer added.

 

“The 4D imaging radar’s ability to detect at the longest range of all sensors,” writes Kobi Marenko, CEO and co-founder of Arbe Robotics, “gives it the highest likelihood to be the first to identify danger. It can then direct the camera and LiDAR sensors to areas of interest, which will considerably increase safety performance.” At this distance, and for this purpose, the accuracy of the sensor can be relatively low, he adds.

 

LiDAR as a sensor

In addition to distance measurements, high-quality LiDAR sensors also accurately measure reflectivity, which allows for easy detection of retro-reflectors such as street signs, license plates, and lane markings.
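As a toy illustration of how the reflectivity channel can be exploited, the snippet below keeps only the LiDAR points whose intensity exceeds a threshold, which tends to isolate retro-reflective surfaces such as signs and lane markings. The intensity scale, sample points, and threshold are assumptions; real sensors report intensity in vendor-specific units.

```python
import numpy as np

# Toy point cloud: x, y, z in metres plus an intensity channel (assumed 0-255 scale).
points = np.array([
    # x,    y,    z,   intensity
    [12.0,  0.5, -1.6,  30.0],   # road surface
    [15.0,  1.8, -1.6, 210.0],   # lane marking (retro-reflective paint)
    [40.0, -3.0,  1.2, 240.0],   # street sign
    [25.0,  2.0,  0.0,  55.0],   # car body
])

RETRO_REFLECTOR_THRESHOLD = 180.0  # assumed; tune per sensor and intensity scale

retro_mask = points[:, 3] >= RETRO_REFLECTOR_THRESHOLD
print(points[retro_mask, :3])  # coordinates of the likely sign / lane-marking returns
```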

 

In a graphic demonstration of LiDAR’s low-light performance, Ford recently navigated a Ford Fusion Hybrid autonomous research vehicle with no headlights along a twisty stretch of desert road at night, guided only by LiDAR. The test demonstrated that even without cameras, which intrinsically rely on light, Ford’s LiDAR, coupled to the car’s virtual-driver software, is sensitive enough to steer flawlessly around objects in total darkness.

 

LiDAR as a technology still faces several challenges:

  • Currently, the technology is very expensive: high-resolution LiDARs are made in small quantities and can cost more than the car itself, although newer units are now appearing at sub-$1,000 price points
  • Image resolution is marginal: images are typically resolved at only 64 pixels high, at about a 10 Hz rate
  • Operating range is limited: typical LiDARs see well to about 70 m and can identify larger objects, such as cars, to around 100 m; 1.5-micron LiDARs, which are even more expensive, can see further
  • Scanning LiDARs have moving parts so they can sweep the scene, which adds mechanical complexity
  • Refresh rates tend to be sluggish
  • Scanning LiDARs are also sensitive to distortion caused by the movement of the scanning car and of the objects being scanned (a simple motion-compensation sketch follows this list)
  • Efficacy is reduced in adverse conditions such as rain and snow
  • LiDARs require a clear view of the scan path to function optimally, which often complicates packaging
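The motion-distortion item above stems from the fact that the points of a single mechanical sweep are captured at slightly different times while the ego vehicle keeps moving. A common correction is to “deskew” each point by the ego motion accumulated since the start of the sweep. The sketch below does this for straight-line motion at constant speed, a simplification of what real pipelines do (rotation is ignored).

```python
import numpy as np

def deskew_sweep(points_xyz: np.ndarray,
                 timestamps_s: np.ndarray,
                 ego_velocity_mps: np.ndarray) -> np.ndarray:
    """Compensate scanning-LiDAR motion distortion for a constant-velocity ego vehicle.

    Each point is shifted by the distance the vehicle travelled between the
    start of the sweep and the moment that point was measured, expressing the
    whole sweep in the frame of its first point. Rotation is ignored here.
    """
    dt = timestamps_s - timestamps_s[0]                 # seconds since sweep start
    return points_xyz + dt[:, None] * ego_velocity_mps[None, :]

# A 10 Hz sweep (0.1 s) at 20 m/s moves the car ~2 m between its first and last point.
pts = np.array([[10.0, 0.0, 0.0], [10.0, 5.0, 0.0]])
ts = np.array([0.0, 0.1])
print(deskew_sweep(pts, ts, np.array([20.0, 0.0, 0.0])))
```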

 

Hi-Resolution millimeter-wave 4D Imaging Radar

Today, radar already plays a vital role in various safety systems, including adaptive cruise control, blind spot detection, and automated emergency braking. However, with the radar technology currently on the market, a functional choice must be made between medium resolution with a limited field of view and low resolution with a wide field of view.

 

The millimeter-wave spectrum has become the focus of attention in recent years, partly because the lower frequency bands are filling up quickly. New wideband applications, such as WLAN and radar, require large bandwidths, which are readily available at millimeter-wave frequencies. Today millimeter waves are increasingly used in automotive radar, cloud radar, radiometry for concealed weapon detection (CWD), high-speed wireless access, ultra-high-speed wireless local area networks (WLAN), and other means of communication, including radar-based communication systems. However, because of the relatively large absorption in the atmosphere, millimeter waves are mainly used in short-range radar systems.
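The bandwidth point maps directly onto radar range resolution: a wider frequency sweep resolves finer range bins, following the standard relation ΔR = c / (2B). The snippet below evaluates this for a few assumed chirp bandwidths of the kind used by 77–81 GHz automotive radars; the exact figures of any given product will differ.

```python
SPEED_OF_LIGHT_M_S = 299_792_458  # metres per second

def range_resolution_m(bandwidth_hz: float) -> float:
    """Theoretical radar range resolution: delta_R = c / (2 * B)."""
    return SPEED_OF_LIGHT_M_S / (2.0 * bandwidth_hz)

for bandwidth_ghz in (0.25, 1.0, 4.0):   # assumed chirp bandwidths
    cm = range_resolution_m(bandwidth_ghz * 1e9) * 100
    print(f"{bandwidth_ghz:4.2f} GHz sweep -> ~{cm:.0f} cm range bins")
# 0.25 GHz -> ~60 cm, 1 GHz -> ~15 cm, 4 GHz -> ~4 cm
```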

 

Achieving high-resolution using radar

To achieve Level 4 and 5 vehicle autonomy, it is essential for automakers to move to the next level of sensing technology and use high-resolution imaging radar that can sense the environment across a wide, 100-degree field of view with a resolution of about 1 degree in azimuth and 2 degrees in elevation. Additionally, the 4D imaging radar’s ability to detect at the longest range of all sensors gives it the highest likelihood to be the first to identify danger. It can then direct camera and LiDAR sensors to areas of interest, which will considerably increase safety performance.
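To put the quoted angular resolution in perspective, the snippet below converts an azimuth resolution into cross-range resolution at a given distance, using the small-angle approximation (arc length ≈ range × angle in radians). The ranges chosen are illustrative.

```python
import math

def cross_range_resolution_m(range_m: float, angular_resolution_deg: float) -> float:
    """Approximate lateral width of one angular bin at a given range."""
    return range_m * math.radians(angular_resolution_deg)

for r in (50, 150, 300):   # illustrative ranges in metres
    width = cross_range_resolution_m(r, 1.0)
    print(f"A 1-degree azimuth bin at {r:3d} m is about {width:.1f} m wide")
# ~0.9 m at 50 m, ~2.6 m at 150 m, ~5.2 m at 300 m
```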

 

Imaging radars are also able to provide true path planning because they can create a detailed image of the road at a range of more than 300 meters (1,000 feet) and capture the size, location, and velocity of objects surrounding the car. A special focus is placed on object separation by elevation. This enables the radar to recognize whether the car is facing a stationary object directly ahead, in which case it must stop, or a bridge that it can safely drive under.
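A simplified version of that bridge-versus-obstacle decision can be written as a height check: project the detection’s elevation angle into a height above the road and compare it with the clearance the vehicle needs. The geometry below ignores road grade and sensor pitch, and the mounting height and clearance values are assumptions.

```python
import math

SENSOR_HEIGHT_M = 0.6       # assumed radar mounting height above the road
VEHICLE_CLEARANCE_M = 4.5   # assumed: stop unless the object clears this height

def is_overhead_structure(range_m: float, elevation_deg: float) -> bool:
    """True if a stationary detection sits high enough to drive under (e.g. a bridge).

    Height above the road = sensor height + range * tan(elevation);
    flat-road geometry only, for illustration.
    """
    height_m = SENSOR_HEIGHT_M + range_m * math.tan(math.radians(elevation_deg))
    return height_m >= VEHICLE_CLEARANCE_M

print(is_overhead_structure(range_m=120.0, elevation_deg=3.0))   # True  -> likely a bridge
print(is_overhead_structure(range_m=120.0, elevation_deg=0.2))   # False -> obstacle ahead
```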

 

Another important differentiating factor for Level 4 and 5 autonomous driving is the 4D imaging radar’s ability to filter out false alarms. To provide optimal sensitivity, the radar uses the lowest possible detection threshold, so some noise is inevitably reported. Post-processing and tracking are used to filter out random noise, while calibration schemes allow extremely low side-lobe levels to be reached.
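The post-processing idea can be realised as a track-confirmation rule: with a deliberately low detection threshold, a candidate is only promoted to a confirmed object after it reappears in several consecutive frames. The sketch below implements a plain M-out-of-N check; the parameters and cell-keying scheme are illustrative and are not Arbe’s actual algorithm.

```python
from collections import defaultdict, deque

class MOfNConfirmer:
    """Confirm a detection cell only if it was hit in at least M of the last N frames.

    Random noise rarely repeats in the same range/azimuth cell frame after frame,
    so it is filtered out while persistent real targets survive.
    """

    def __init__(self, m: int = 3, n: int = 5):
        self.m = m
        self.history = defaultdict(lambda: deque(maxlen=n))

    def update(self, frame_cells: set) -> set:
        """Feed one frame's detected cells; return the currently confirmed cells."""
        confirmed = set()
        for cell in set(self.history) | frame_cells:
            self.history[cell].append(cell in frame_cells)
            if sum(self.history[cell]) >= self.m:
                confirmed.add(cell)
        return confirmed

confirmer = MOfNConfirmer()
frames = [{(42, -10)}, {(42, -10), (7, 33)}, {(42, -10)}, {(42, -10)}]
for cells in frames:
    confirmed = confirmer.update(cells)
print(confirmed)  # {(42, -10)} -- the one-off (7, 33) return never gets confirmed
```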

 

Arbe Robotics Applauded by Frost & Sullivan for its Breakthrough in 4D Imaging RADAR System

Based on its recent analysis of the global four-dimensional (4D) imaging RADAR market, Frost & Sullivan recognizes Arbe Robotics with the 2018 Global Technology Innovation Award for its full-stack 4D imaging RADAR system for the automotive environment. Compared to other technologies, it is cost-competitive and boasts high reliability, low latency, and ultra-high-resolution image capture for long-range objects, making it suitable for applications across multiple industries such as security, surveillance, autonomous driving, and robot guidance.

 

“Arbe Robotics’ patent-pending 4D imaging RADAR technology prototype can record 4D coordinates and image data for outdoor applications, even in challenging environments,” said Varun Babu, Senior Research Analyst at Frost & Sullivan. “As it combines hardware and software technologies to capture 4D data in real time, the imaging products developed using this technology have longer range (300 meters), accurate range resolution (10 to 60 cm), a wide field of view (more than 90 degrees), and high horizontal and vertical resolution, and can precisely decipher target objects in any weather or lighting condition. The system can accurately distinguish between objects, eliminate false alarms, and reduce sensor suite development costs.”

 

Arbe Robotics’ system comprises an end-to-end ultra-high-resolution RADAR system, signal processing, simultaneous localization and mapping (SLAM) algorithms, and 4D mapping. One of its distinct advantages is its resilience to mutual radar interference, a significant feature in the automotive industry, where radar is finding increasing application. The technology has the potential to replace some of the LiDAR systems currently being used and developed for ADAS and autonomous vehicles.

 

“Arbe Robotics has filed 10 patents related to the method of real-time simultaneous localization and mapping, detecting target objects, and excluding noise from RADAR signals, thereby creating additional technology licensing revenue streams,” noted Varun Babu. “Overall, as automotive OEMs seek associated, future-ready technologies to boost the adoption of next-generation vehicles such as autonomous cars, Arbe Robotics’ sensing systems will prove highly popular.”

 

Arbe Robotics is disrupting autonomous-vehicle sensor development by bridging the gap between radar and optics with its proprietary imaging solution, which provides optic-sensor resolution with the reliability and maturity of radar technology for all levels of vehicle autonomy.

 

References and Resources also include:

https://www.sensorsmag.com/components/why-hi-resolution-radar-a-game-changer

http://www.prnewswire.co.in/news-releases/arbe-robotics-applauded-by-frost–sullivan-for-its-breakthrough-in-4d-imaging-radar-system-701417512.html

https://www.automotive-iq.com/autonomous-drive/articles/will-lidar-give-fully-autonomous-cars-clear-vision-road-ahead
