New Autonomous Sense and Avoid Sensor Technologies Enable Safe and Efficient Integration of Unmanned Aircraft Systems (UAS)

Safe and efficient integration of Unmanned Aircraft Systems (UAS) into civil airspace is a key challenge in unleashing their potential for non-military applications “without reducing existing capacity, decreasing safety, impacting current operators, or placing other airspace users or persons and property on the ground at increased risk.” Sense and avoid technology enables drones to detect aircraft and obstacles in the vicinity of the UAV and to execute manoeuvres to restore a safe situation if needed. In addition, UAS must be able to avoid terrain and land without operator intervention, react to contingencies such as engine-out and lost-link scenarios, and be reliable and cost-effective.

 

Sense and avoid systems may use stereo vision, monocular vision, ultrasonic, infrared, time-of-flight and LiDAR sensors to detect and avoid obstacles. Manufacturers also fuse multiple sensors together to create obstacle detection and collision avoidance systems. This technology started with sensors detecting objects in front of the drone; the latest drones from DJI, Walkera, Yuneec and others now carry front, rear, downward and side obstacle avoidance sensors.

 

Sense and avoid is a sequence of functions which, using a combination of airborne and ground-based sensors, performs manoeuvres to avoid collisions, serving as the UAV’s replacement for the traditional “see and avoid” capability of manned aircraft. Sense and avoid technology comes in two forms: manual sense and avoid, which relays information to the UAV pilot, and fully autonomous sense and avoid, which removes the need for a pilot altogether. Sensing itself may be cooperative or non-cooperative: cooperative technologies depend on cooperation from other aircraft to learn their distance, velocity and altitude and so avoid collisions, while non-cooperative technologies use active and passive sensors to determine these parameters on their own.

 

Non-cooperative technologies

Non-cooperative technologies benefit from the fact that they can detect ground-based obstacles as well as airborne ones. Sense and avoid generally consists of two components: separation assurance and collision avoidance. The first reduces the probability of a collision by ensuring that aircraft remain “well clear” of each other, thereby assuring safe separation, while collision avoidance covers extreme manoeuvres just prior to the closest point of approach, to prevent a collision in cases where safe separation has been lost.
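
To make the separation assurance idea concrete, the sketch below estimates the time and miss distance at the closest point of approach (CPA) for a constant-velocity encounter and flags a loss of “well clear”. The 600 m (roughly 2,000 ft) horizontal threshold is an illustrative assumption, not a regulatory value.

```python
import numpy as np

def closest_point_of_approach(rel_pos, rel_vel):
    """Time (s) and miss distance (m) at CPA for a constant-velocity encounter."""
    speed_sq = np.dot(rel_vel, rel_vel)
    if speed_sq == 0.0:
        return 0.0, float(np.linalg.norm(rel_pos))  # no relative motion
    t_cpa = max(0.0, -np.dot(rel_pos, rel_vel) / speed_sq)  # clamp to the future
    miss = float(np.linalg.norm(rel_pos + rel_vel * t_cpa))
    return t_cpa, miss

# Intruder 1,500 m ahead with a 30 m lateral offset, closing at 40 m/s.
t, d = closest_point_of_approach(np.array([1500.0, 30.0]), np.array([-40.0, 0.0]))
WELL_CLEAR_M = 600.0  # illustrative "well clear" threshold, ~2,000 ft
print(f"CPA in {t:.1f} s at {d:.0f} m -> {'avoid' if d < WELL_CLEAR_M else 'monitor'}")
```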

 

Challenges in Autonomous Sense and Avoid

The principal challenge in achieving autonomous sense and avoid technology is that it must be as effective as a human pilot. However, developing a system that can replicate the human decision-making process is incredibly difficult, and this remains the greatest stumbling block to the development of the technology. As a consequence, the transition from pilot action for emergency manoeuvres to autonomous algorithms may be some way off for the industry.

 

There are many challenges. Making sense and avoid technology small enough and light enough to suit the capabilities of UAVs is proving particularly difficult. The size of UAVs, particularly with the rise of micro and nano UAVs, means that they lack the payload capacity for traditional methods of detecting other aircraft, such as radar. As a consequence, size and weight are becoming serious stumbling blocks to the successful development of sense and avoid.

 

Speed adds a further complication: detecting other aircraft travelling at relatively slow speeds is a challenge in itself. However, identifying aircraft travelling at faster speeds requires a much quicker reaction time from sense and avoid technology in order to avoid collisions, creating an even greater hurdle to implementing the technology.

 

Further complexities are imposed by the power consumption and battery life of UAVs, by the need for the technology to operate in different conditions such as variable weather, and by the requirement that a sense and avoid system follow the same right-of-way rules as civil and commercial aircraft.

 

LeddarTech says: Available drone sensing solutions for position and range measurements, as well as for collision avoidance, are still far from perfect. GPS and barometers are not foolproof, even outdoors, and cannot be relied upon when navigating indoors. Ultrasonic altimeters have very limited range, optical flow sensors require good lighting and textured surfaces, and camera vision is still a work in progress and tends to be processing-intensive.

 

Integration challenges comprise airworthiness and certification, vulnerabilities of the command and control link, human factors such as crew qualifications and training, air traffic management integration, and the lack of a see-and-avoid capability comparable to that of manned aircraft.

 

Sensors

Active systems transmit a signal to detect obstacles in the flight path; radar and LiDAR are examples. Passive systems, which only receive, include motion-detection, electro-optical (EO) and infrared (IR) systems.

 

Each sensor has its advantages and disadvantages: airborne radar, for example, measures the range to a target, whereas electro-optical sensors, which form a key element of sense and avoid research, do not. Sense and avoid systems use algorithms that combine data from the various sensors into a situational awareness picture. Such systems enable the pilot to alter the flight path of the UAV if a possible collision is detected, but also integrate a safety feature that lets the UAV move itself if the pilot fails to react quickly enough.

 

Sensor fusion is a process by which data from several different sensors are “fused” to compute something more than could be determined by any one sensor alone. Sensor fusion is a subcategory of data fusion and is also called multi-sensor data fusion or sensor data fusion. Many DJI drones combine various sensors in their collision avoidance systems.
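
As a minimal illustration of the idea (not DJI’s actual implementation), the sketch below fuses two noisy range readings by weighting each one inversely to its variance, the standard inverse-variance rule; the sensor variances here are assumed for the example.

```python
def fuse_ranges(r1_m, var1, r2_m, var2):
    """Inverse-variance weighted fusion of two range estimates."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    fused = (w1 * r1_m + w2 * r2_m) / (w1 + w2)
    fused_var = 1.0 / (w1 + w2)  # fused estimate is more certain than either input
    return fused, fused_var

# Ultrasonic: 4.2 m, variance 0.09; stereo vision: 4.6 m, variance 0.04 (assumed).
dist, var = fuse_ranges(4.2, 0.09, 4.6, 0.04)
print(f"fused range: {dist:.2f} m (variance {var:.3f})")
```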

 

These various obstacle avoidance sensors feed data back to the flight controller, which runs obstacle detection software and algorithms. The flight controller has many functions; one of these is to process, in real time, the image data of the surroundings scanned by the obstacle detection sensors.

 

Obstacle Avoidance Algorithms

An obstacle avoidance algorithm is the process, or set of rules, followed in evaluating the data from the various sensors: a detailed step-by-step instruction set or formula for solving the problem of detecting all types of objects, both moving and stationary. Depending on the algorithm, it may compare real-time data against stored reference images of objects and can even build on these images.

 

There are many techniques that can be used for obstacle avoidance, including in how the algorithm processes the data. The best technique depends on the specific environment and differs between a collision avoidance drone and a robot in a factory.
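
As one simple example of such a technique (a toy reactive rule, not any vendor’s algorithm), the sketch below turns a ring of range readings into a steering command by picking the most open direction once anything enters a stop distance; the sector layout and thresholds are assumptions.

```python
import math

def reactive_steer(ranges_m, angles_rad, stop_m=5.0):
    """Steer toward the most open sector if any reading is too close."""
    if min(ranges_m) > stop_m:
        return 0.0  # path ahead is clear; keep heading
    best = max(range(len(ranges_m)), key=lambda i: ranges_m[i])
    return angles_rad[best]  # heading of the most open direction

# Eight sectors around the drone (assumed layout), ranges in metres.
angles = [i * math.pi / 4 for i in range(8)]
ranges = [3.2, 6.0, 9.5, 12.0, 8.0, 4.1, 2.5, 7.0]
print(f"steer to {math.degrees(reactive_steer(ranges, angles)):.0f} deg")
```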

Doppler radar for small UAVs

Ashok Gorwara and others have proposed Doppler radar for small UAVs. “Doppler radar is proposed for use in this sense and avoid system because, in contrast to optical or infrared (IR) systems, Doppler can work in harsher conditions such as at dusk and in rain and snow. And in contrast to ultrasound-based systems, Doppler can better sense small obstacles such as wires, and it can provide a sensing range from a few inches to several miles. An SAA system comprised of Doppler radar modules and an array of directional antennas distributed around the perimeter of the drone can cover the entire sky.”

 

“These modules are designed so that they can provide the direction to the obstacle and simultaneously generate an alarm signal if the obstacle enters within the SAA system’s adjustable ‘Protection Border’. The alarm signal alerts the drone’s autopilot to automatically initiate an avoidance maneuver,” explain the researchers.
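
A small sketch of the underlying physics: a Doppler module reports a frequency shift f_d, and the radial closing speed follows from v = f_d · c / (2 · f_tx). The 24 GHz carrier and the border distance are assumptions for illustration, not parameters from Gorwara’s system.

```python
C = 3.0e8       # speed of light, m/s
F_TX = 24.0e9   # assumed 24 GHz radar carrier

def closing_speed(doppler_shift_hz):
    """Radial closing speed from the measured Doppler shift (two-way path)."""
    return doppler_shift_hz * C / (2.0 * F_TX)

def protection_border_alarm(range_m, doppler_shift_hz, border_m=50.0):
    """Raise the alarm once a closing obstacle enters the adjustable border."""
    return range_m <= border_m and closing_speed(doppler_shift_hz) > 0.0

v = closing_speed(1600.0)  # a 1.6 kHz shift corresponds to 10 m/s
print(f"closing at {v:.1f} m/s, alarm: {protection_border_alarm(42.0, 1600.0)}")
```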

 

 

Miniaturized phased array radar

Researchers at the University of Denver’s Unmanned Systems Research Institute have developed a phased-array radar system that weighs only 12 ounces (about 340 grams). The radar-based system has an advantage over transponder or camera systems because it works in poor visibility, at night or in bad weather. The technology is currently in the testing phase.

 

Echodyne’s flat panel radar hopes to power the next generation of autonomous aviation

Echodyne Corp has carried out a successful test of an airborne Detect and Avoid (DAA) radar on a small Unmanned Aerial Vehicle (sUAV). Echodyne’s radar was mounted on a small commercial drone which flew multiple missions below 400 feet over a period of several days. The drone was of a size, payload and range well suited to package delivery, infrastructure inspection and agricultural monitoring.

 

Echodyne’s detect and avoid technology enables a drone to “see” moving and stationary obstacles using “radar vision” as the drone flies through the airspace beyond line of sight of its operator. Echodyne’s radar is based on patented Metamaterial Electronically Scanning Array (MESA) technology which enables the radar to deliver high-performance electronic scanning in a smaller, lighter and less expensive form factor than has been previously thought possible.

 

“It’s great to see our technology performing in real-world field tests exactly as designed,” said Echodyne founder and CEO Eben Frankenberg. “Tests like this show that advanced radar can be deployed directly on small commercial UAVs to ensure safe beyond-line-of-sight drone operations. Unlike other sensor technologies such as cameras and LIDARs, radar provides accurate tracking of obstacles at long range across a broad field of view in all types of weather.”

Stereo Vision Sensors For Obstacle Avoidance

Stereo vision works in a similar way to 3D sensing in our human vision. Stereoscopic vision is the calculation of depth information by combining two-dimensional images from two cameras at slightly different viewpoints. It begins with identifying image pixels which correspond to the same point in a physical scene observed by multiple cameras. The 3D position of a point can then be established by triangulation using a ray from each camera.

 

The more corresponding pixels identified, the more 3D points that can be determined from a single set of images. Correlation stereo methods attempt to obtain correspondences for every pixel in the stereo image, yielding tens of thousands of 3D values with every stereo image. DJI uses stereo vision for obstacle avoidance on the front of its drones, and combines stereo vision and ultrasonic sensors beneath them too.
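
For a rectified stereo pair, the triangulation above reduces to depth Z = f·B/d, where f is the focal length in pixels, B the camera baseline and d the disparity between corresponding pixels. The sketch below applies the formula with assumed camera parameters.

```python
def depth_from_disparity(disparity_px, focal_px=700.0, baseline_m=0.10):
    """Depth of a point from its disparity in a rectified stereo pair."""
    if disparity_px <= 0:
        raise ValueError("zero disparity means the point is at infinity")
    return focal_px * baseline_m / disparity_px

# A feature shifted 14 px between the left and right images (assumed camera:
# 700 px focal length, 10 cm baseline) is about 5 m away.
print(f"depth: {depth_from_disparity(14.0):.2f} m")
```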

Ultrasonic Sensors For Detecting Objects (Sonar)

An ultrasonic sensor sends out a high-frequency sound pulse and then times how long it takes for the echo to reflect back. The sensor has two openings: one transmits the ultrasonic waves (like a tiny speaker) and the other receives them (like a tiny microphone). The speed of sound in air is approximately 341 meters (1,100 feet) per second. The sensor uses this figure, along with the time difference between sending and receiving the pulse, to determine the distance to an object.
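
Because the pulse travels out and back, the echo time must be halved: distance = (speed of sound × time) / 2. A minimal sketch:

```python
SPEED_OF_SOUND_M_S = 341.0  # approximate speed of sound in air

def ultrasonic_distance_m(echo_time_s):
    """Distance from the round-trip time of an ultrasonic pulse."""
    return SPEED_OF_SOUND_M_S * echo_time_s / 2.0  # halve: out and back

# An echo arriving 11.7 ms after the pulse puts the obstacle about 2 m away.
print(f"{ultrasonic_distance_m(0.0117):.2f} m")
```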

 

HC-SR04 Ultrasonic Sensor

The HC-SR04 ultrasonic sensor offers excellent non-contact range detection, with high accuracy and stable readings in an easy-to-use package, over distances from 2 cm to 400 cm (about 1 inch to 13 feet). Unlike Sharp infrared rangefinders, its operation is not affected by sunlight or black materials, although acoustically soft materials such as cloth can be difficult to detect. It comes as a complete ultrasonic transmitter and receiver module.

 

AMS ToF Obstacle Detection Sensors

AMS ToF sensors for obstacle detection and collision avoidance are based on a proprietary SPAD (single-photon avalanche diode) pixel design and time-to-digital converters (TDCs) that resolve extremely narrow pulse widths. They measure, in real time, the direct time-of-flight of a VCSEL (laser) emitter’s infrared light reflected from an object.

This low-power time-of-flight sensing technology from AMS enables host systems to measure distances accurately and at very high speed. Accurate distance measurements are used in various applications including presence detection, user face recognition and advanced cameras.

AMS uses sophisticated histogram data and smart software algorithms in its ToF sensors, giving the following features (illustrated in the sketch after this list). The sensors:

  • are able to detect and cancel out the effect of cover glass;
  • are immune to smudges and to crosstalk caused by reflections from the cover glass;
  • accommodate a large air gap;
  • maintain accurate distance detection independent of the object’s color, reflectivity and texture;
  • can measure the distance from multiple objects in the field of view.
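
As a toy illustration of the histogram approach (AMS’s actual pipeline is proprietary), the sketch below locates the peak photon-arrival bin, subtracts a crude ambient baseline, and converts the bin’s time to a distance; the 100 ps bin width is an assumption.

```python
import numpy as np

C = 3.0e8              # speed of light, m/s
BIN_WIDTH_S = 100e-12  # assumed TDC bin width: 100 ps

def distance_from_histogram(counts):
    """Distance from the peak of a photon arrival-time histogram."""
    ambient = np.median(counts)        # crude ambient-light baseline
    signal = counts - ambient
    peak_bin = int(np.argmax(signal))  # bin containing the return pulse
    round_trip = peak_bin * BIN_WIDTH_S
    return C * round_trip / 2.0

# Synthetic histogram: flat ambient counts with a pulse in bin 200 (~3 m).
hist = np.full(512, 5.0)
hist[199:202] += [20.0, 60.0, 20.0]
print(f"{distance_from_histogram(hist):.2f} m")
```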

 

LIDAR

The primary function of a LiDAR sensor is to measure the distance between itself and objects in its field of view. It does so by calculating the time taken by a pulse of light to travel to an object and back to the sensor, using the constant speed of light.

 

Top-of-the-range sensors, such as the Velodyne LiDAR used in Google’s driverless cars, combine multiple laser/detector pairs (up to 64) in one sensor, each able to pulse at 20 kHz. This allows measurements of up to 1.3 million data points per second.
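
The headline figure follows directly from the architecture: 64 pairs each pulsing at 20 kHz gives 64 × 20,000 = 1,280,000, roughly the quoted 1.3 million returns per second, as a quick check shows.

```python
PAIRS = 64              # laser/detector pairs in the sensor
PULSE_RATE_HZ = 20_000  # pulses per second per pair

points_per_second = PAIRS * PULSE_RATE_HZ
print(f"{points_per_second:,} points/s")  # 1,280,000 ~ 1.3 million
```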

 

LeddarTech Vu8 LiDAR Sensor

LeddarTech has announced its modular Vu8, whose specifications make it well suited to autonomous drone use. The LeddarTech Vu8 is a compact solid-state LiDAR that provides highly accurate multi-target detection over eight independent segments, detects targets at ranges of up to 705 feet (215 meters), and weighs 75 grams. The Vu8 is an active sensor that could be used for collision avoidance, navigation, and as an altimeter for drones.

 

The Vu8 uses a fixed laser light source, which significantly increases the sensor’s robustness and cost-efficiency compared with any scanning LiDAR solution. According to LeddarTech, the Vu8 LiDAR is “immune to ambient light” and was designed to provide “highly accurate multi-target detection over eight independent segments.”

 

“Leddar solid-state LiDAR technology, with its narrow or wide field-of-view, rich data acquisition, and multi-segment/multi-object detection capability, might be the best all-around sensing solution to provide efficient and reliable spatial awareness for a new generation of UAV,” says LeddarTech.

 

The Vu8 sensor is well suited to navigation and collision avoidance applications in driver-assisted, semi-autonomous and autonomous vehicles such as drones, trucks, heavy equipment for construction and mining, shuttles, buses and other public transport vehicles. Applications such as Advanced Traffic Management Systems (ATMS), which require longer ranges as well as wide fields of view, will also benefit greatly from the new Vu8 sensor.

 

Optical flow

An emerging technology uses biomimicry, taking the eyes of flying insects as a model for sensing. Referred to as neuromorphic motion detection, it attempts to copy the optical flow sensing used by flying insects: optical flow in insect eyes detects the relative motion of contrasts through multiple eye sensors called lenslets.
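
In software, the same idea is commonly approximated with dense optical flow between consecutive camera frames. The sketch below uses OpenCV’s Farnebäck algorithm (a standard computer-vision method, not the neuromorphic hardware itself) and treats a rising mean flow magnitude as a crude looming-obstacle cue; the camera index is an assumption.

```python
import cv2
import numpy as np

def mean_flow_magnitude(prev_gray, gray):
    """Dense Farneback optical flow; mean magnitude hints at approaching obstacles."""
    flow = cv2.calcOpticalFlowFarneback(
        prev_gray, gray, None,
        pyr_scale=0.5, levels=3, winsize=15,
        iterations=3, poly_n=5, poly_sigma=1.2, flags=0)
    magnitude = np.linalg.norm(flow, axis=2)  # per-pixel flow vector length
    return float(magnitude.mean())

cap = cv2.VideoCapture(0)  # assumed: first attached camera
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
ok, frame = cap.read()
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
print(f"mean flow: {mean_flow_magnitude(prev_gray, gray):.2f} px/frame")
cap.release()
```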

 

Further projects are looking to combine sense and avoid with ground-based radar and terrain avoidance capability to enable UAVs to avoid a broader range of obstacles. Synthetic aperture radar (SAR) is already in use and, owing to its small size, is of particular value to UAS, especially for detecting ground-based objects.

 

SLAM Technology For Detecting And Avoiding Obstacles

Simultaneous localization and mapping, or SLAM, is an extremely important technology for drones, cars and robots in detecting and avoiding obstacles. SLAM is a process whereby a robot or device can create a map of its surroundings and orient itself properly within this map in real time. This is no easy task, and SLAM currently sits at the forefront of technology research and design.

 

SLAM works by incrementally building a map of the environment while simultaneously estimating the device’s position within it; the map is refined as the robot or drone moves through the environment. The true challenge of this technology is accuracy. Measurements must constantly be taken as the robot or drone moves through its space, and the technology must account for the “noise” introduced both by the movement of the device and by the inaccuracy of the measurement method. Much of the obstacle detection and avoidance technology in drones uses elements of SLAM; monocular vision is one such technology.
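
A full SLAM system is beyond a short example, but the mapping half can be sketched with a toy occupancy grid: each range return marks the cells along the beam as free and the end cell as occupied, and repeated beams refine the evidence. The pose is assumed known here, which is exactly the other half that real SLAM must also estimate.

```python
import numpy as np

GRID = np.zeros((100, 100), dtype=np.int16)  # >0: occupied evidence, <0: free
CELL_M = 0.1                                 # each cell covers 10 cm

def integrate_beam(pose_xy_m, hit_xy_m):
    """Mark cells along one range beam: free along the ray, occupied at the end."""
    p = np.array(pose_xy_m) / CELL_M
    h = np.array(hit_xy_m) / CELL_M
    steps = int(np.linalg.norm(h - p)) + 1
    for t in np.linspace(0.0, 1.0, steps):
        x, y = (p + t * (h - p)).astype(int)
        GRID[y, x] -= 1                      # evidence the cell is free
    GRID[int(h[1]), int(h[0])] += 2          # stronger evidence of an obstacle

integrate_beam((1.0, 1.0), (4.0, 3.0))       # one beam from the pose to an obstacle
print("occupied cells:", np.argwhere(GRID > 0))
```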

 

 

Obstacle Detection And Collision Avoidance Technology

The latest high-tech drones are now equipped with collision avoidance systems. These use obstacle detection sensors to scan the surroundings, while software algorithms and SLAM technology turn the images into 3D maps that allow the drone to sense and avoid. Such systems fuse one or more of the following sensors: vision, ultrasonic, infrared, LiDAR, time-of-flight (ToF) and monocular vision. The latest DJI Mavic 2 Pro and Mavic 2 Zoom have obstacle sensing on all six sides; the Mavic 2 uses both vision and infrared sensors fused into a vision system known as omnidirectional obstacle sensing.

 

The DJI Mavic 2 obstacle sensing system is top drone technology. The Mavic 2 senses objects and flies around obstacles in front of it; it can do the same when flying backwards, or hover if it is not possible to fly around the obstacle. This technology is known as APAS (Advanced Pilot Assistance System) on the DJI Mavic 2 and Mavic Air drones.

 

In December 2019 the Skydio 2 drone was released, which also has obstacle avoidance on all sides. Skydio 2’s autonomy technology visualizes and calculates what is happening around the drone, intelligently predicts what will happen next, and makes decisions multiple times a second. The quadcopter uses six 4K cameras to build a 3D map of its surroundings, including trees, people, animals, cars, buildings and more.

 

 

 

 
