
Autonomous take-off and Landing technologies to allow UAVs to operate on the tactical battlefield and integrate into the National Airspace

An unmanned aerial vehicle (UAV), commonly known as a drone, is an aircraft without a human pilot on board. UAVs are a component of an unmanned aircraft system (UAS), which includes a UAV, a ground-based controller, and a system of communications between the two. UAVs may operate with various degrees of autonomy: either under remote control by a human operator or autonomously by onboard computers referred to as an autopilot.

 

These vehicles have been used in several types of applications such as surveillance, infrastructure inspection, firefighting, search and rescue, agriculture, border patrol, scientific experiments, and mapping. Communication, sensor and control techniques have evolved over the past few decades, leading to the development of a wide range of UAVs varying in shape, size, configuration, and characteristics. The common types of UAVs are fixed-wing UAVs, quadrotors and helicopters at different scales (large UAVs, miniature vehicles or micro aerial vehicles).

 

All these applications depend on advancements in vehicle autonomy that will fully automate many of the vehicle's functions, including route planning, navigation, obstacle avoidance, landing zone evaluation, autonomous take-off and autonomous landing.  Furthermore, some applications, such as package delivery and retrieval, require a high level of accuracy to make sure the UAV lands exactly on the desired target.

 

UAVs have become indispensable to modern militaries in providing intelligence and near-real-time reconnaissance and surveillance to commanders, and offering warfighters greater battlespace awareness. They have proven effective in electronic combat support, battle damage assessment and even in national security operations like border surveillance, low-intensity conflict and guerrilla/terrorist warfare. The ability to take off and land in tactical cluttered environments will allow UAS to be used more extensively in support of forward military units.

 

In September 2019, a Cessna 172—a four-seat, single-engine plane that is among the most common aircraft models in existence—taxied to a runway at an airport south of San Jose, took off on its own, flew for 15 minutes, and then landed at the same airport, all without a single person on board.

Integration of Unmanned Aircraft systems in the National Airspace System (NAS)

The UAS market is forecast to explode into hundreds of thousands of units within just a few years of the FAA establishing the appropriate regulatory procedures for the operation of UAS in the National Airspace (NAS). The global drone logistics and transportation market is projected to reach $29.06 billion by 2027.

 

While the technology for unmanned air vehicles operating day in and day out without constant human supervision is maturing steadily, much remains to be done to make these vehicles commonplace. NASA has identified a number of challenges that must be addressed for these vehicles to safely and efficiently conduct their tasks in the National Airspace System (NAS).

 

Civilian applications of UASs must make sure that they can:
1. Sense and avoid other vehicles and follow air traffic commands,
2. Avoid the terrain and land without operator intervention,
3. React to contingencies such as engine out and lost link scenarios, and
4. Be reliable and cost-effective.

 

Autonomous take-off and Landing technologies

UAV flight consists of different phases, namely takeoff, climb, cruise, descent and finally landing. Most UAV autopilots have autonomous take-off (catapult and hand-launched) and cruise capabilities but limited autonomous landing capabilities due to high risks and reliability issues. The accuracy of the landing must be high, otherwise the aircraft may crash. Autonomous landing is one of the most challenging parts of the flight. Landing must be done in a limited amount of time and space. Hence, precise sensing techniques and accurate control are required during this maneuver.

 

Landing multi-rotor drones smoothly is difficult. Complex turbulence is created by the airflow from each rotor bouncing off the ground as the ground grows ever closer during a descent. This turbulence is not well understood nor is it easy to compensate for, particularly for autonomous drones. That is why takeoff and landing are often the two trickiest parts of a drone flight. Drones typically wobble and inch slowly toward a landing until power is finally cut, and they drop the remaining distance to the ground.

 

A number of factors need to be considered for a smooth landing, namely the type of landing (indoor or outdoor), visibility, type of terrain, wind disturbances, etc. There are two aspects to landing, namely sensing and control. Camera vision is a popular sensing technique to estimate the POSE (position and orientation) of the UAV, whose information is used by the controllers. The controllers can range from simple linear control to complex techniques involving intelligent and hybrid control systems.
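
To make the sensing side concrete, the sketch below shows one common way a downward-looking camera can estimate the UAV's pose relative to a landing pad, using OpenCV's solvePnP on four known pad corners. The pad dimensions, camera intrinsics and pixel detections are illustrative placeholders, not values from any particular system.

```python
# Minimal sketch of camera-based POSE estimation for landing, using OpenCV's
# solvePnP. The pad geometry, camera intrinsics and detected pixel coordinates
# below are illustrative placeholders, not values from any particular UAV.
import numpy as np
import cv2

# 3D coordinates of four landing-pad corners in the pad frame (metres).
pad_corners_3d = np.array([
    [-0.5, -0.5, 0.0],
    [ 0.5, -0.5, 0.0],
    [ 0.5,  0.5, 0.0],
    [-0.5,  0.5, 0.0],
], dtype=np.float64)

# Corresponding pixel coordinates detected in the camera image (hypothetical).
pad_corners_px = np.array([
    [310.0, 250.0],
    [410.0, 252.0],
    [408.0, 352.0],
    [308.0, 350.0],
], dtype=np.float64)

# Pinhole camera intrinsics, assumed known from a prior calibration step.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
dist_coeffs = np.zeros(5)  # assume an undistorted image for simplicity

ok, rvec, tvec = cv2.solvePnP(pad_corners_3d, pad_corners_px, K, dist_coeffs)
if ok:
    R, _ = cv2.Rodrigues(rvec)       # rotation from pad frame to camera frame
    cam_pos_in_pad = -R.T @ tvec     # camera (UAV) position in the pad frame
    print("Estimated UAV position relative to pad (m):", cam_pos_in_pad.ravel())
```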

 

Figure 1 shows the block diagram of a generic landing control system. The system consists of four blocks, namely the sensors/navigation system, the guidance controller, the flight controller and the UAV itself. The sensors/navigation system mainly determines the POSE of the UAV. This information is fused and passed to the flight and guidance controllers. The guidance controller generates guidance commands, such as changes in velocity, acceleration and rotation, to follow a desired trajectory. The flight controller takes the guidance command and generates the appropriate actuation commands according to the type of the UAV (VTOL or fixed wing).

 

Figure 1: Block diagram of a generic landing control system (from "A survey of autonomous landing techniques for UAVs").
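
As a rough illustration of how these four blocks interact, the following sketch wires a placeholder navigation estimate into a proportional guidance command and a flight-controller stub. The class names, gains and 50 Hz loop rate are assumptions made for illustration, not details from the cited survey.

```python
# Minimal sketch of the generic landing control loop: a navigation block
# estimates the POSE, a guidance controller turns the error to the landing
# point into a velocity command, and a flight controller turns that command
# into actuation. All gains and classes are illustrative assumptions.
import numpy as np

class NavigationSystem:
    def estimate_pose(self):
        # A real system would fuse GPS/INS/vision here; a fixed value stands in.
        return np.array([1.0, -0.5, 3.0])     # x, y, z position (m)

class GuidanceController:
    def __init__(self, gain=0.8):
        self.gain = gain

    def velocity_command(self, position, landing_point):
        # Proportional guidance toward the landing point, with the descent
        # rate capped so the vehicle slows as it nears the ground.
        cmd = self.gain * (landing_point - position)
        cmd[2] = np.clip(cmd[2], -0.5, 0.5)    # limit vertical speed (m/s)
        return cmd

class FlightController:
    def actuate(self, velocity_cmd):
        # Would map the command to rotor speeds (VTOL) or control surfaces
        # (fixed wing); printed here for illustration.
        print("velocity command (m/s):", np.round(velocity_cmd, 2))

nav, guidance, flight = NavigationSystem(), GuidanceController(), FlightController()
landing_point = np.zeros(3)
for _ in range(3):                             # a few iterations of the 50 Hz loop
    pose = nav.estimate_pose()
    flight.actuate(guidance.velocity_command(pose, landing_point))
```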

 

A typical landing system uses GPS (Global Positioning System) and INS (inertial navigation system) sensors. The height measurement from GPS is inaccurate, and hence a close-range sensor such as a radar altimeter or barometric pressure sensor is also used in conjunction with GPS. However, GPS signals may not always be available, and hence automatic landing may not be possible in many remote regions. In the case of unmanned helicopters, GPS and INS systems are suitable for long-range, low-precision flights but fall short for precise, close-proximity flights. Thus there is a need to integrate these systems for better accuracy and reliability. Despite such compensation, these methods still remain inaccurate, especially in the horizontal plane, resulting in a landing position that typically deviates from the intended one by 1 to 3 m. Furthermore, GPS cannot be used indoors.
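
One simple way such GPS or barometric height measurements can be blended with INS acceleration is a complementary filter, sketched below. The gains, noise levels and update rate are illustrative assumptions, not values from any fielded system.

```python
# Minimal sketch of fusing a noisy altitude measurement (GPS or barometer)
# with INS vertical acceleration using a complementary filter. The gains,
# noise levels and time step are illustrative, not tuned values.
import numpy as np

dt = 0.02                     # 50 Hz update rate (assumed)
k_pos, k_vel = 1.5, 0.6       # complementary-filter gains (illustrative)

alt_est, vz_est = 0.0, 0.0    # fused altitude and vertical-speed estimates
rng = np.random.default_rng(0)

for step in range(500):                         # 10 s of simulated descent
    true_alt = 10.0 - 1.0 * step * dt           # vehicle descending at 1 m/s
    accel_z = rng.normal(0.0, 0.05)             # INS vertical acceleration (m/s^2)
    alt_meas = true_alt + rng.normal(0.0, 1.5)  # noisy GPS/baro height (m)

    # Predict with the INS acceleration, then correct toward the measurement.
    vz_est += accel_z * dt
    alt_est += vz_est * dt
    err = alt_meas - alt_est
    alt_est += k_pos * err * dt
    vz_est += k_vel * err * dt

print(f"fused altitude estimate after descent: {alt_est:.2f} m")
```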

 

Computer vision is used in the feedback control loop of an autonomous landing system. Use of vision in the control loop is especially suited to problems where the landing pad is in an unknown location or is non-stationary (for example, the deck of a ship). There are a number of vision-based control techniques for helipad detection, tracking and landing (both indoor and outdoor). Classical vision-based target tracking and landing focuses on object recognition using edge detection techniques. Although vision provides a natural modality for object detection and landing, it can only sense the changes caused by the applied forces, not the forces themselves. Hence, vision-based techniques are integrated with conventional control techniques for a robust landing design.
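
The snippet below is a minimal sketch of the classical edge-detection approach mentioned above, using OpenCV's Canny detector and contour analysis to pick out a roughly circular pad marking. The thresholds and the circular-pad assumption are illustrative only.

```python
# Minimal sketch of classical vision-based helipad detection using edge
# detection and contour analysis with OpenCV. The thresholds and the
# assumption of a roughly circular pad marking are illustrative.
import cv2
import numpy as np

def find_helipad_center(frame):
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)
    edges = cv2.Canny(blurred, 50, 150)                  # edge map
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    best, best_area = None, 0.0
    for c in contours:
        area = cv2.contourArea(c)
        if area < 200:                                   # ignore small blobs
            continue
        perimeter = cv2.arcLength(c, True)
        circularity = 4 * np.pi * area / (perimeter ** 2 + 1e-9)
        if circularity > 0.7 and area > best_area:       # roughly circular shape
            best, best_area = c, area
    if best is None:
        return None
    m = cv2.moments(best)
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])    # pixel centroid

# Usage (hypothetical file name):
# frame = cv2.imread("downward_camera_frame.jpg")
# print(find_helipad_center(frame))
```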

 

Indoor landing involves landing in a controlled environment with fewer environmental disturbances than outdoor landing. Wenzel et al. use a Wii remote infrared camera for their visual tracking approach, with the control algorithm running on an onboard microcontroller. They track a pattern of infrared spots with a fixed, downward-looking camera. The integrated circuit provides the pixel position of each spot at a high frequency. The estimated pose is used in various integrated PID control loops to control the vehicle motion.
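
A minimal single-axis sketch of the kind of PID position loop such systems run on the estimated pose is shown below. The gains and loop period are illustrative assumptions, not those reported by Wenzel et al.

```python
# Minimal sketch of a single-axis PID position loop acting on a camera-
# estimated pose. Gains and the loop period are illustrative assumptions.
class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# One PID per axis (x, y, z); here only the vertical axis is shown.
alt_pid = PID(kp=1.2, ki=0.05, kd=0.4, dt=0.02)
altitude_setpoint = 0.0          # descend to the pad
current_altitude = 1.5           # from the vision/IR pose estimate (m)
climb_rate_cmd = alt_pid.update(altitude_setpoint, current_altitude)
print(f"commanded climb rate: {climb_rate_cmd:.2f} m/s")
```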

 

Outdoor landing is a more challenging problem due to the presence of external disturbing factors such as wind, visibility, etc. The main part of any vision-based landing is to detect the helipad using object detection or pattern recognition techniques.

 

Guidance-based landing: Guidance refers to the determination of the desired trajectory from the vehicle’s current location to a target, as well as the direction, rotation and acceleration needed to follow that trajectory. Proportional guidance and pursuit guidance are two of the most widely used laws for UAV guidance. Proportional guidance aims at maintaining a constant angle between the line of sight (LOS) and the target, whereas pursuit guidance aims at generating commands that make the velocity vector of the UAV point towards the target. There are two main types of pursuit guidance laws, namely pure pursuit and pseudo pursuit. The pure pursuit guidance law leads the UAV towards the true target, while the pseudo pursuit guidance law generates a guidance command to track a virtual target.
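
As a rough sketch of pure pursuit guidance, the function below steers the UAV's velocity vector toward the target point. The gain and state vectors are illustrative; practical guidance laws add acceleration limits and further command shaping.

```python
# Minimal sketch of a pure pursuit guidance command: rotate the UAV's velocity
# vector toward the target (landing point). Gain and vectors are illustrative.
import numpy as np

def pure_pursuit_accel(uav_pos, uav_vel, target_pos, gain=2.0):
    los = target_pos - uav_pos                  # line of sight to the target
    los_dir = los / np.linalg.norm(los)
    speed = np.linalg.norm(uav_vel)
    desired_vel = speed * los_dir               # same speed, pointed at target
    return gain * (desired_vel - uav_vel)       # acceleration command

uav_pos = np.array([0.0, 0.0, 50.0])
uav_vel = np.array([20.0, 2.0, -1.0])
target  = np.array([300.0, 0.0, 0.0])
print("accel command (m/s^2):",
      np.round(pure_pursuit_accel(uav_pos, uav_vel, target), 2))
```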

 

Autonomous take-off and Landing systems

Some of the technologies that have been utilized for autonomous takeoff and landing include:

  1. AVATAR helicopter: NovAtel RT-20 DGPS, CCD camera and ultrasonic sensors provided landing with a 47 cm average position error. However, the solution is very expensive and computationally heavy.
  2. Yamaha RMAX helicopter: off-the-shelf CCD camera, gyros and accelerometers. It provided landing with a 43 cm average position error (54 cm maximum position error). However, the solution is computationally heavy (two PC104 stacks with 700 MHz processors) and hence works only with high-payload helicopters.
  3. Small-scale helicopter: IMU and range sensor were used for autonomous takeoff and landing with a 0.5 m altitude error. However, the ultrasonic sensor is not very accurate at high altitudes.
  4. AscTec Hummingbird quadrotor: IMU, infrared camera, and IR LEDs placed in a T-shape. Fifty indoor VTOL flight tests were conducted; the standard deviation was less than 3 cm in each position axis. However, the solution will not work properly in outdoor environments, and the camera must be close enough to the pattern to detect the IR spots.
  5. Mini quadrotor: optical flow sensor, GPS, IMU and sonar sensor. Position stabilization of the quadrotor during the final meter before touching the ground. No estimator was designed.
  6. Cheetah quadrotor: PX4FLOW sensor and IMU; hovering and outdoor flight trajectory. It shows better results compared to previously designed optical flow sensors, and the output flow is auto-compensated for 3D rotations.

 

At Caltech’s Center for Autonomous Systems and Technologies (CAST), artificial intelligence experts have teamed up with control experts to develop a system that uses a deep neural network to help autonomous drones “learn” how to land more safely and quickly, while gobbling up less power. The system they have created, dubbed the “Neural Lander,” is a learning-based controller that tracks the position and speed of the drone, and modifies its landing trajectory and rotor speed accordingly to achieve the smoothest possible landing.

 

“This project has the potential to help drones fly more smoothly and safely, especially in the presence of unpredictable wind gusts, and eat up less battery power as drones can land more quickly,” says Soon-Jo Chung, Bren Professor of Aerospace in the Division of Engineering and Applied Science (EAS) and research scientist at JPL, which Caltech manages for NASA. The project is a collaboration between Chung and Caltech artificial intelligence (AI) experts Anima Anandkumar, Bren Professor of Computing and Mathematical Sciences, and Yisong Yue, assistant professor of computing and mathematical sciences.

 

Deep neural networks (DNNs) are AI systems that are inspired by biological systems like the brain. The “deep” part of the name refers to the fact that data inputs are churned through multiple layers, each of which processes incoming information in a different way to tease out increasingly complex details. DNNs are capable of automatic learning, which makes them ideally suited for repetitive tasks.

 

To make sure that the drone flies smoothly under the guidance of the DNN, the team employed a technique known as spectral normalization, which smooths out the neural net’s outputs so that it doesn’t make wildly varying predictions as inputs/conditions shift. Improvements in landing were measured by examining deviation from an idealized trajectory in 3D space. Three types of tests were conducted: a straight vertical landing; a descending arc landing; and flight in which the drone skims across a broken surface — such as over the edge of a table — where the effect of turbulence from the ground would vary sharply.
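
The sketch below illustrates the core idea of spectral normalization on a single weight matrix: estimate its largest singular value by power iteration and divide the weights by it, so the layer cannot amplify small input changes into wildly different outputs. This is a generic illustration of the technique, not the Neural Lander implementation.

```python
# Minimal sketch of spectral normalization: estimate the largest singular
# value of a weight matrix by power iteration and divide the weights by it,
# bounding the layer's gain. Illustrative only, not the Neural Lander code.
import numpy as np

def spectral_normalize(W, n_iters=20):
    u = np.random.default_rng(0).normal(size=W.shape[0])
    for _ in range(n_iters):
        v = W.T @ u
        v /= np.linalg.norm(v)
        u = W @ v
        u /= np.linalg.norm(u)
    sigma = u @ W @ v               # estimate of the largest singular value
    return W / sigma

W = np.random.default_rng(1).normal(size=(64, 32))
W_sn = spectral_normalize(W)
print("spectral norm before:", round(np.linalg.svd(W, compute_uv=False)[0], 3))
print("spectral norm after: ", round(np.linalg.svd(W_sn, compute_uv=False)[0], 3))
```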

 

The new system decreases vertical error by 100 percent, allowing for controlled landings, and reduces lateral drift by up to 90 percent. In their experiments, the new system achieves an actual landing rather than getting stuck about 10 to 15 centimeters above the ground, as unmodified conventional flight controllers often do. Further, during the skimming test, the Neural Lander produced a much smoother transition as the drone moved from skimming across the table to flying in the free space beyond the edge.

 

“With less error, the Neural Lander is capable of a speedier, smoother landing and of gliding smoothly over the ground surface,” Yue says. The new system was tested at CAST’s three-story-tall aerodrome, which can simulate a nearly limitless variety of outdoor wind conditions. Opened in 2018, CAST is a 10,000-square-foot facility where researchers from EAS, JPL, and Caltech’s Division of Geological and Planetary Sciences are uniting to create the next generation of autonomous systems, while advancing the fields of drone research, autonomous exploration, and bioinspired systems.

 

“This interdisciplinary effort brings experts from machine learning and control systems. We have barely started to explore the rich connections between the two areas,” Anandkumar says.

 

Besides its obvious commercial applications — Chung and his colleagues have filed a patent on the new system — the new system could prove crucial to projects currently under development at CAST, including an autonomous medical transport that could land in difficult-to-reach locations (such as gridlocked traffic). “The importance of being able to land swiftly and smoothly when transporting an injured individual cannot be overstated,” says Morteza Gharib, Hans W. Liepmann Professor of Aeronautics and Bioinspired Engineering; director of CAST; and one of the lead researchers of the air ambulance project.

 

 

NASA Licenses New Autonomous Technology for Unmanned Aircraft

NASA had earlier issued a proposal for Autonomous, Safe Take-Off and Landing Operations for Unmanned Aerial Vehicles in the National Airspace: “We propose a combination of software algorithms and low-cost, low-SWaP sensors that simultaneously solves the navigation and obstacle detection problem, especially as it relates to operation in cluttered environments. That is, in this program we will show that it is possible for small autonomous air vehicles to reliably and safely fly in the first and last 50 feet of operation.”

 

One of the solutions currently being worked on is the autonomous aerial vehicle (AAV) with vertical takeoff and landing (VTOL) ability.  “An enhanced capability for safe, autonomous take-off and landing will fuel the market’s forecast growth. Technology ensuring the safe operation of UAS, particularly during the first and last 50 ft. of flight, will contribute to testing that verifies the safety of UAS operations as well as providing regulators, legislators, and the general public with increased confidence in UAS operations,” says NASA.  The technology, which some might call a drone helicopter concept, is being tested to carry cargo and eventually taxi passengers.

 

NASA has developed technology that may enable unmanned aircraft to fly safely in the national airspace along with piloted aircraft through its program called Unmanned Aircraft Systems in the National Air Space or UAS in the NAS. The patent-pending integrated communications and control system is capable of collision warnings as well as real-time traffic and weather updates.

 

Additionally, the commercial market is forecast to grow to as many as 160,000 UAS. As soon as UAS operation in the national airspace is fully implemented, the cargo transportation market, in particular, is forecast to be the largest market segment. Autonomous precision take-off and landing will be a key enabling technology in realizing this market.

 

The proposed autonomous technology will enable greater utilization of UAS in other NASA areas, particularly for experimentation and testing at the various research centers, for example expanding the use of UAS in the Ames FINESSE volcano research. The mature technology will ultimately enable greater use of UAS in space. A UAS that knows its position and is able to set down while avoiding obstacles in a cluttered environment can be used to accomplish repairs both inside and outside a spacecraft, as well as to perform exploration of planetary surfaces.

 

Military UAS requirements are well documented and tens-of-thousands of UAS are already in use worldwide. The ability to take-off and land in tactical cluttered environments will allow UAS to be used more extensively in support of forward units.

 

Airbus demonstrates first fully automatic vision-based take-off

Airbus has successfully performed the first fully automatic vision-based take-off using an Airbus Family test aircraft at Toulouse-Blagnac airport. “The aircraft performed as expected during these milestone tests. While completing alignment on the runway, waiting for clearance from air traffic control, we engaged the auto-pilot,” said Airbus Test Pilot Captain Yann Beaufils. “We moved the throttle levers to the take-off setting and we monitored the aircraft. It started to move and accelerate automatically maintaining the runway centre line, at the exact rotation speed as entered in the system. The nose of the aircraft began to lift up automatically to take the expected take-off pitch value and a few seconds later we were airborne.”

 

Rather than relying on an Instrument Landing System (ILS), the ground-based equipment currently used by in-service passenger aircraft at airports around the world where the technology is present, this automatic take-off was enabled by image recognition technology installed directly on the aircraft.

 

Automatic take-off is an important milestone in Airbus’ Autonomous Taxi, Take-Off & Landing (ATTOL) project. Launched in June 2018, ATTOL is one of the technological flight demonstrators being tested by Airbus in order to understand the impact of autonomy on aircraft. The next steps in the project will see automatic vision-based taxi and landing sequences taking place by mid-2020.

 

Even as autonomous technologies improve flight operations and overall aircraft performance, pilots will remain at the heart of operations. Autonomous technologies are paramount to supporting pilots, enabling them to focus less on aircraft operation and more on strategic decision-making and mission management.

 

New method could help with takeoff and landing for autonomous taxis, cargo carriers

“Everyone is facing the same problem with weight in creating these types of vehicles,” said Lizhi Shang, a postdoctoral research assistant who works on the technology with Andrea Vacca, a professor of agricultural and biological engineering at Purdue. “Drones require heavy batteries or lots of electrical components, which leaves little room for the actual payload.” Shang said many current systems also are expensive, unstable, unreliable and not environmentally friendly. Shang and the research team at Purdue University came up with a method to use fluid power technology for VTOL AAV.

 

The Purdue team members said their technology is an inexpensive, recyclable hydraulic propulsion system for the multi-rotor VTOL aircraft. The propulsion system uses hydrostatic transmission, a lighter weight and more reliable option, to distribute engine power throughout the rotors, providing thrust for the aircraft and allowing the rotors to each spin at different speeds.

 

The speed of each rotor can be controlled individually, with a faster response, by the flight controller or a human operator, while the engine can run at constant speed, extending its lifetime. This provides both aerodynamic lift and attitude control, eliminating the need for an additional moving control surface or weight-shifting device and resulting in a more stable flight and a greater useful load.
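
The principle of providing lift and attitude control purely through individual rotor speeds is the same whether the rotors are driven electrically or hydraulically. The sketch below shows the standard quadrotor-style mixing of thrust and roll/pitch/yaw commands into per-rotor commands; the layout and numbers are illustrative, not taken from the Purdue system.

```python
# Minimal sketch of how individually controlled rotor speeds provide both lift
# and attitude control on a quadrotor-layout vehicle: total thrust plus roll,
# pitch and yaw torques are mixed into four per-rotor commands. The "+"
# configuration mixing matrix is standard; the numbers are illustrative.
import numpy as np

def mix(thrust, roll_torque, pitch_torque, yaw_torque):
    # Rows: front, right, back, left rotor (plus configuration).
    mixing = np.array([
        [1.0,  0.0,  1.0, -1.0],   # front rotor
        [1.0, -1.0,  0.0,  1.0],   # right rotor
        [1.0,  0.0, -1.0, -1.0],   # back rotor
        [1.0,  1.0,  0.0,  1.0],   # left rotor
    ])
    cmd = np.array([thrust, roll_torque, pitch_torque, yaw_torque])
    return mixing @ cmd            # per-rotor thrust commands

# Hover thrust with a small pitch-forward correction:
print(np.round(mix(thrust=4.0, roll_torque=0.0, pitch_torque=0.3, yaw_torque=0.0), 2))
```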

 

“The critical advantage of this innovation is that it’s lightweight, which then can be translated as superior payload fraction, lower operation cost, longer flight distance and better controllability and maneuverability,” Shang said. “For transmitting the same power with precise speed control, a hydraulic system is much lighter than an electric system, which is currently dominating the market.”

 

Raytheon wins $255M US Navy Contract for precision approach and landing

Raytheon Co. (RTN) has been awarded a $255 million contract by the U.S. Navy for development and production readiness of the Navy’s next-generation precision landing system. RTN said it will provide the U.S. Navy with a Joint Precision Approach and Landing System (JPALS) that leverages GPS satellite navigation “to provide more accurate landing guidance for manned and unmanned aircraft, replacing radar and beacons used in older systems.”

 

RTN stated that JPALS technology “improves navigational alignment prior to approach, allowing aircraft to land on any aircraft carrier or amphibious assault ship, day or night, even in adverse weather conditions.” RTN will supply the U.S. Navy with JPALS technology for both manned and unmanned aircraft.

 

Schiebel and Raytheon partner for Australia’s LAND129 Phase 3 project

Schiebel Pacific has partnered with Raytheon Australia for the LAND129 Phase 3 Tactical Unmanned Aerial System (TUAS) project. Last week, the companies submitted a tender response for the Australian Army’s project. The partnership will combine Schiebel’s CAMCOPTER S-100 UAS and Raytheon Australia’s experience as a prime systems integrator across various domains.

 

Together, Schiebel and Raytheon will deliver a solution that provides a low-risk path to establishing a sovereign TUAS capability.
The Vertical Take-Off and Landing (VTOL) CAMCOPTER S-100 UAS is used for intelligence, surveillance and reconnaissance (ISR) missions. It can operate from confined areas with no need for prepared areas or supporting equipment. The S-100 requires only 20 minutes to prepare for deployment and can be operated day and night for up to eight hours.

 

Lockheed Martin to develop autonomous take-off and Landing technologies for Unmanned Aircraft

The Office of Naval Research has awarded a $13.5 million contract to an industry team led by Lockheed Martin to develop autonomous take-off and landing technologies for aircraft. Some of these autonomous technologies have already been demonstrated by K-MAX, an unmanned cargo helicopter currently being used by the Marines in Afghanistan. The technologies developed under this five-year contract will also be useful as decision aids for pilots of legacy manned aircraft.

 

Under the contract, Lockheed Martin and a team of industry, government and academic partners will develop technology that interacts with a human operator at a high level while low-level control is handled autonomously. In the first 18 months, the team will demonstrate the capabilities of its Open-Architecture Planning and Intelligence Architecture for Managing Unmanned Systems (OPTIMUS). This architecture is designed to be platform agnostic and was developed under the Army’s Autonomous Technologies for Unmanned Air System program.

US AFRL conducts debut flight of ROBOpilot unmanned air system

The US Air Force Research Laboratory (AFRL) and DZYNE Technologies have conducted the maiden flight test of the ROBOpilot unmanned air system. The two-hour flight test of the Robotic Pilot Unmanned Conversion Program was conducted at Dugway Proving Ground in Utah.

 

AFRL Commander Major General William Cooley said: “This flight test is a testament to AFRL’s ability to rapidly innovate technology from concept to application in a safe build-up approach while still maintaining low cost and short timelines.”

 

ROBOpilot has been designed, built and tested by AFRL and DZYNE under a Direct to Phase II Small Business Innovative Research (SBIR) contract. It converts existing manned aircraft into autonomous unmanned aircraft without the need for permanent modifications.

 

The system can carry out the standard actions performed by a human pilot, including operating the yoke, rudders, brakes, throttle and switches, and reading the dashboard gauges. Additionally, ROBOpilot uses sensors such as GPS and an inertial measurement unit for situational awareness and information gathering.

 

The gathered data are analysed by an on-board computer running the autopilot, which makes informed decisions regarding flight control. AFRL Center for Rapid Innovation senior scientist Dr Alok Das said: “Imagine being able to rapidly and affordably convert a general aviation aircraft, like a Cessna or Piper, into an unmanned aerial vehicle, having it fly a mission autonomously, and then returning it back to its original manned configuration.”

 

Prior to the maiden flight, engineers tested the initial concept in a RedBird FMX simulator. The system completed simulated autonomous take-offs, mission navigation and landings.  The ROBOpilot capability is planned to be used to carry out a range of missions such as cargo delivery, entry into hazardous environments, and intelligence, surveillance and reconnaissance (ISR) missions.

 

 

 

Electronic System Laboratory’s Autonomous aircraft landing under crosswind conditions

The Electronic Systems Laboratory (ESL), part of the Department of Electrical and Electronic Engineering at Stellenbosch University in South Africa, has completed many projects on unmanned aerial vehicles, including autonomous take-off and landing of fixed-wing and rotor aircraft.

 

The main contribution of this project is the development of a robust control system with excellent disturbance rejection capabilities that will allow for accurate landing of a fixed-wing aircraft under crosswind conditions. Control system techniques that were implemented in previous projects at the ESL were not explicitly designed to land an aircraft under adverse atmospheric conditions; rather, landing tests were conducted during ideal or close-to-ideal wind conditions. This project will therefore focus on the development of robust controllers implemented in a way that exploits the advantages of various crosswind landing techniques.

 

L3 Communications’ Viking 400-S

The L3 Viking 400-S Unmanned Aircraft System (UAS) is integrated with Autonomous Take-Off and Landing (ATOL) technology supplied by L3 Unmanned Systems’ flightTEK system. The UAS can operate for up to 12 hours and can be equipped with up to 100 pounds of payload technologies, including chemical, biological, radiological and nuclear (CBRN) detectors to protect officers responding to a man-made incident, such as a dirty bomb detonation. UAS payloads carrying high-resolution cameras can capture bird’s-eye images of an incident, which can help commanders identify a suspect’s location, determine whether a suspect is armed, pinpoint hard-hit areas, and deploy and prioritize resources with full situational awareness. Captured images are transmitted wirelessly back into a GIS software suite for mapping the affected area and for later reporting needs.

 

References and Resources also include:

http://www.darpa.mil/news-events/2015-12-28

http://gtri.gatech.edu/casestudy/autonomous-technology-research-developing-increasi

http://www.darpa.mil/program/tactically-exploited-reconnaissance-node

http://www.chinatopix.com/articles/93902/20160629/new-darp-drone-will-transform-navy-warships-aircraft-carriers.htm

https://www.purdue.edu/newsroom/releases/2019/Q2/using-the-power-of-fluid-for-the-future-of-drones,-flight.html

https://www.sciencedaily.com/releases/2019/05/190524130220.htm

https://www.researchgate.net/publication/269299865_A_survey_of_autonomous_landing_techniques_for_UAVs

 
