
The future of commercial and military aviation is true autonomous flight

Modern commercial airliners have automated systems that can augment or even replace pilots’ performance, managing engine power, controlling and navigating the aircraft, and in some cases even completing landings. One of the goals of future aviation is fully autonomous flight. A fully autonomous aircraft would not require a pilot; it would be able to operate independently within civil airspace, interacting with air traffic controllers and other pilots just as if a human pilot were on board and in command.

 

Several companies are developing fully autonomous aircraft, including Amazon and UPS, which want to use them for deliveries. These new planes are variously described as flying taxis, passenger drones or, as the industry terms them, urban air mobility (UAM) vehicles. Around 200 such craft are at various stages of development around the world, according to experts at Farnborough’s first global urban air summit in early September. Some prototypes are already carrying out test flights, and operators hope to begin commercial services within the next few years.

 

Uber, which runs an app-based taxi-hailing service, aims to start flying passengers in Dallas, Los Angeles and Melbourne, Australia, by 2023. Boeing and Airbus are designing self-flying air taxis, which would carry between two and four passengers on flights of about 30 minutes, and have tested prototypes. A company called Volocopter has been testing autonomous air taxis in Germany since 2016 and plans to conduct test flights in downtown Singapore.

 

The development and application of increasingly autonomous (IA) systems for civil aviation are proceeding at an accelerating pace, driven by the expectation that such systems will return significant benefits in terms of safety, reliability, efficiency, affordability, and/or previously unattainable mission capabilities.

 

These developments are being driven by advances in many military and civil aircraft technologies, including high-capability computing systems; sensor technologies; high-throughput digital communications systems; precise position, navigation, and timing information (e.g., from the Global Positioning System (GPS)); and open-source hardware and software.

 

Airlines and manufacturers say they would save money and alleviate the current shortage of qualified pilots if they could reduce, or even eliminate, the number of pilots in the cockpit. If the aircraft no longer needed room for pilots at the front, or the pilots could be moved elsewhere in the aircraft, the nose could be redesigned to be more aerodynamic, saving even more money. Large commercial airplanes will likely go pilotless later than smaller private aircraft, because of the amount of time and money required to produce them. But smaller air taxis simply are not economically viable if they require a human pilot on board.

 

Landing a plane under autopilot, known as autoland, is a different matter. According to a study by Boeing, 49% of fatal plane accidents between 2008 and 2017 occurred during final approach and landing. By removing possibilities for human error through automation, the risk of accidents can be reduced to make these phases safer. ‘If we look at recent root-cause analyses of aircraft accidents, many of them have a large contribution from human error,’ said Heikki Deschacht of avionics manufacturer ScioTeq. While some systems already exist, efforts are underway to improve them to enable safer landings.

Autonomous Aircraft

At the highest level, autonomy implies the ability of the system (often a machine) to perform tasks that involve dynamically executing a “decision cycle” in much the same fashion as a human. The simplest model is the so-called OODA loop, an acronym that stands for the four-step process used to execute virtually any task: Observe, Orient, Decide, and Act.

 

An autonomous system, be it human or machine, first observes by sensing or acquiring information about the environment from other relevant sources. It then orients itself toward the task at hand. This second step involves a number of functions that can encompass information fusion, contextual interpretation, the integration of learned behaviors, and even inferences about future events. In the robotics community, a number of the capabilities associated with this step are often referred to in aggregate as perception.

 

The third step then involves making a decision based on the task objectives and the results of the prior steps. This requires that the system be capable of implementing an appropriate action that accomplishes the task. Once the action is complete, the cycle repeats as the system observes the consequences of the action as well as changes in the environment caused by other factors.

 

The OODA concept of autonomy applies to tasks that range from lower-level functions, such as stabilization and basic maneuvering of an aircraft (e.g., a fly-by-wire control system), to high-level mission decisions and even to the accomplishment of a complete mission. Some of these capabilities exist today, while others will be possible only as IA technologies mature over time.
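The OODA cycle described above can be sketched in a few lines of code. This is a minimal illustrative example, not any real autopilot’s API: the names (`State`, `observe`, `orient`, `decide`, `act`) and the toy altitude-hold task are assumptions made up for clarity.

```python
from dataclasses import dataclass

@dataclass
class State:
    altitude_m: float  # current altitude
    target_m: float    # commanded altitude

def observe(state: State) -> float:
    # Observe: sense the environment (here, just read the altitude error).
    return state.target_m - state.altitude_m

def orient(error: float) -> str:
    # Orient: interpret the observation in context (climb, descend, or hold).
    if abs(error) < 1.0:
        return "hold"
    return "climb" if error > 0 else "descend"

def decide(intent: str) -> float:
    # Decide: choose a concrete action for the current intent.
    return {"climb": +10.0, "descend": -10.0, "hold": 0.0}[intent]

def act(state: State, delta: float) -> State:
    # Act: apply the action; the next cycle observes its consequences.
    return State(state.altitude_m + delta, state.target_m)

def ooda_cycle(state: State) -> State:
    return act(state, decide(orient(observe(state))))

state = State(altitude_m=900.0, target_m=1000.0)
for _ in range(12):
    state = ooda_cycle(state)
# The loop climbs 10 m per cycle, then holds once within 1 m of target.
```

The same four-step skeleton applies whether the “task” is stabilizing an aircraft or executing a whole mission; only the richness of each step changes.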

 

Autoland systems

By some estimates, about 1% of all commercial flights use autoland, which relies on an Instrument Landing System (ILS). Using ILS requires crosswinds of less than 46 km per hour, comparable to a strong breeze, and becomes harder in adverse visibility conditions such as fog.
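The crosswind limit above is a simple geometric check: only the component of the wind blowing across the runway counts. A short sketch, assuming the 46 km/h figure from the text (real limits vary by aircraft and approach category):

```python
import math

# Crosswind limit taken from the article; actual autoland limits
# depend on the aircraft type and certification category.
CROSSWIND_LIMIT_KMH = 46.0

def crosswind_component(wind_speed_kmh: float, wind_angle_deg: float) -> float:
    """Crosswind component of a wind arriving wind_angle_deg off the runway heading."""
    return wind_speed_kmh * math.sin(math.radians(wind_angle_deg))

def autoland_wind_ok(wind_speed_kmh: float, wind_angle_deg: float) -> bool:
    # Within limits if the across-runway component is at or below the limit.
    return abs(crosswind_component(wind_speed_kmh, wind_angle_deg)) <= CROSSWIND_LIMIT_KMH

# A 60 km/h wind at 30 degrees off the runway heading gives a ~30 km/h
# crosswind, so autoland would still be permitted; the same wind at
# 90 degrees (fully across the runway) would not.
```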

 

Modern autoland systems have other limitations. They require significant ground infrastructure in order to support fully automated landings. The runway must be equipped with radio beacons, which send signals to the aircraft to allow it to obtain accurate and reliable position information. Such systems are expensive, and few airports support them, while nearby obstacles such as mountains can render them unusable.

 

Heikki Deschacht from avionics manufacturer ScioTeq in Belgium is the coordinator for IMBALS, a project that’s developing what’s called the Vision Landing System (VLS). The goal of this system is to enable large passenger planes to land automatically with less need for ground-based radio beacons.

 

‘The end goal of the IMBALS project is to realise and validate and verify a vision-based landing system for large passenger aircraft,’ said Deschacht. This system will consist of an onboard camera system that captures images in front of the aircraft and an image processing platform which extracts position information to help the autopilot steer the plane to the runway, he explains.
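To give a sense of the geometry such a system must solve, here is a hypothetical sketch of one sub-problem: turning a detected runway-centerline offset in the image into a lateral position error the autopilot can steer out. The pinhole-camera simplification and all names (`lateral_offset_m`, the example numbers) are illustrative assumptions; the actual IMBALS processing chain is far more sophisticated.

```python
def lateral_offset_m(pixel_offset: float, focal_length_px: float,
                     distance_to_threshold_m: float) -> float:
    """Small-angle pinhole estimate of lateral displacement from the
    extended runway centerline.

    pixel_offset: detected centerline offset from image centre, in pixels
    focal_length_px: camera focal length expressed in pixels
    distance_to_threshold_m: estimated distance to the runway threshold
    """
    # Angle subtended by the offset (small-angle approximation), scaled
    # by distance to give a displacement in metres.
    return (pixel_offset / focal_length_px) * distance_to_threshold_m

# Example: centerline detected 50 px off image centre, with a 1000 px
# focal length, at 2000 m from the threshold -> about 100 m off track.
offset = lateral_offset_m(50.0, 1000.0, 2000.0)
```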

 

‘Only 60% of the airports being served with Airbus aircraft are equipped with ILS (ground infrastructure),’ said Deschacht. ‘And not all of those are sufficient to do autolanding. So there’s a big gap in the airports (where) autolanding is simply not possible. And that’s the gap we wanted to fill with a vision-based landing system, because we don’t rely on anything on the ground. The only thing we need are visibility conditions (that make) the runway visible for the camera sensors.’

 

He also notes that image processing technology like the VLS could eventually be used in other phases of flight too, such as take-off and taxi. ‘It’s quite a challenging job for the pilot if it’s quite a large aircraft and it’s a small (or busy) airport and there’s low visibility,’ he said. ‘It’s not a vehicle you can easily turn back if you have mistaken your exit somewhere from the taxiway or runway.’

 

Aircraft lands itself truly autonomously for the first time

Many airliners can land automatically, but they don’t really land autonomously: the airport is guiding them in with a radio signal (the Instrument Landing System). And since many smaller airports don’t have this infrastructure, autoland isn’t even an option there. Researchers at Technische Universität München might just make true autonomous landing a practical reality, though. They’ve successfully tested a system that uses a combination of computer vision and GPS to have the aircraft land itself.

 

The technology uses GPS to navigate, but combines it with both visible-light and infrared cameras to spot the runway and obtain an accurate sense of its position even when fog or rain hurts visibility. From there, the aircraft can calculate a glide path and otherwise touch down all on its own.
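Combining GPS with vision-derived position, as described above, is at heart a sensor-fusion problem. A minimal sketch, using an inverse-variance weighted average as a stand-in for the full filtering such a system would use; all numbers and names here are made up for illustration:

```python
def fuse(gps_pos: float, gps_var: float,
         vis_pos: float, vis_var: float) -> float:
    """Inverse-variance weighted fusion of two 1-D position estimates.

    The less noisy source (smaller variance) gets the larger weight,
    so the fused estimate leans toward whichever sensor is currently
    more trustworthy.
    """
    w_gps = 1.0 / gps_var
    w_vis = 1.0 / vis_var
    return (w_gps * gps_pos + w_vis * vis_pos) / (w_gps + w_vis)

# In clear conditions the camera fix is tight (small variance), so it
# dominates; in fog its variance grows and the estimate leans on GPS.
fused = fuse(gps_pos=102.0, gps_var=4.0, vis_pos=100.0, vis_var=1.0)
```

This is why degraded visibility does not break the approach outright: the fusion degrades gracefully toward the GPS solution rather than failing.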

 

The project is still young, but it’s promising. A test landing in late May went as well as you could hope: the aircraft recognized the runway from a long distance and landed on the centerline without the pilot once taking control. If it’s refined enough, the system could make hands-free landings feasible at virtually any airfield, not to mention give pilots a backup. This also lays some groundwork for end-to-end autonomous flight that might require only human supervision for complete trips.

About Rajesh Uppal
