
AI for space and satellite missions

Today’s satellite payloads can capture a greater volume of data than ever before, placing increasing demands on a satellite operator’s data management, storage, and processing systems. Downlinking satellite data to the ground also faces bottlenecks, from the fundamental limits of satellite passes and ground-station coverage to interoperability issues between ground-segment systems and end-user applications.

 

Traditionally, electro-optical imagery and synthetic-aperture radar data have been sent to the ground for processing. That’s still largely the case, but new Earth-observation sensors continue expanding the volume of data acquired in orbit, sometimes quite dramatically. At the same time, customers are eager for speedy access to insights drawn from various datasets.

 

Weather observation is a good example. Numerical weather models merge vast quantities of data drawn from space-based, airborne, maritime and terrestrial sensors. AAC Clyde Space, the Swedish company supplying the core avionics for the European Space Agency’s Arctic Weather Satellite, sees improvements in onboard processing as a way to speed up weather data delivery.

 

“We see an opportunity in the future to do a lot of processing on board: preparing data, compressing data and starting to fuse data,” said Luis Gomes, AAC Clyde Space CEO. “Our objective is real-time weather observations from space. For that, we need to package the data efficiently and effectively to reduce the amount of time that we are downlinking.” Hyperspectral sensors also produce huge datasets that make onboard processing “quite critical,” Gomes said.

 

A satellite’s On-Board Computer (OBC) is a very important sub-system responsible for on-board processing. It controls the payload and other sub-systems such as power generation, attitude control and communication. Because of this sub-system’s importance to the mission and the rigours of operating in the hazardous space environment, on-board computers have particular design requirements. At the small-satellite level there are added constraints arising from the required level of miniaturisation and energy efficiency, all of which make the design of an OBC for pico-satellites a challenge.

 

Like the rest of a satellite’s components, OBCs have a finite operating lifetime in space, which depends mainly on the radiation level of the space environment. The choice of electronic components for the OBC is therefore a key factor in estimating mission duration: COTS components can be used for short missions, but longer missions require radiation-hardened components to extend the satellite’s lifetime. Alongside high-quality electronic components, reliable and robust OBC design is also very important to mission lifetime.
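
As a rough illustration of how radiation budgets drive these lifetime estimates, the sketch below computes a radiation-limited lifetime from a simple total-ionizing-dose (TID) budget. The part ratings, annual dose and design margin are illustrative assumptions, not figures from any specific mission.

```python
# Minimal sketch: estimating an OBC's radiation-limited lifetime from a simple
# total-ionizing-dose (TID) budget. All ratings and dose figures below are
# illustrative assumptions, not data from the article.

def radiation_limited_lifetime_years(part_tid_rating_krad: float,
                                     annual_dose_krad: float,
                                     design_margin: float = 2.0) -> float:
    """Years until the derated TID budget of the weakest component is exhausted."""
    usable_dose = part_tid_rating_krad / design_margin  # derate by a safety margin
    return usable_dose / annual_dose_krad

# A typical COTS part might tolerate ~10 krad, a rad-hard part ~100 krad (assumed values).
for label, rating_krad in [("COTS", 10.0), ("rad-hard", 100.0)]:
    years = radiation_limited_lifetime_years(rating_krad, annual_dose_krad=5.0)
    print(f"{label}: ~{years:.1f} years")
```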

 

The use of artificial intelligence (AI) and other advanced on-orbit data-processing technologies is offering solutions to these problems for smallsat missions and services. New AI technology could also speed up physical fault diagnosis in spacecraft and spaceflight systems, improving mission efficiency by reducing downtime.

Intel Powers First Satellite with AI on Board in 2020

In October 2020, ESA reported on PhiSat-1, which carries a new hyperspectral-thermal camera and onboard AI processing powered by an Intel® Movidius™ Myriad™ 2 Vision Processing Unit (VPU), the same chip found in many smart cameras and even a $99 selfie drone here on Earth. PhiSat-1 is actually one of a pair of satellites on a mission to monitor polar ice and soil moisture, while also testing intersatellite communication systems to create a future network of federated satellites.

 

The first problem the Myriad 2 is helping to solve? How to handle the large amount of data generated by high-fidelity cameras like the one on PhiSat-1. “The capability that sensors have to produce data increases by a factor of 100 every generation, while our capabilities to download data are increasing, but only by a factor of three, four, five per generation,” says Gianluca Furano, data systems and onboard computing lead at the European Space Agency, which led the collaborative effort behind PhiSat-1.

At the same time, about two-thirds of our planet’s surface is covered in clouds at any given time. That means a whole lot of useless images of clouds are typically captured, saved, sent over precious down-link bandwidth to Earth, saved again, reviewed by a scientist (or an algorithm) on a computer hours or days later — only to be deleted.

 

“And artificial intelligence at the edge came to rescue us, the cavalry in the Western movie,” says Furano. The idea the team rallied around was to use onboard processing to identify and discard cloudy images — thus saving about 30% of bandwidth.
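
The decision loop behind that bandwidth saving is conceptually simple. The sketch below is a minimal illustration rather than PhiSat-1’s actual software: a brightness threshold stands in for the trained cloud-detection network running on the VPU, and only mostly cloud-free scenes are queued for downlink. The 70% cloud limit is an assumed value.

```python
# Minimal sketch of onboard cloud filtering: score each captured scene and only
# queue scenes worth spending downlink bandwidth on. The "classifier" below is a
# placeholder; PhiSat-1's actual network and thresholds are not reproduced here.

import numpy as np

CLOUD_FRACTION_LIMIT = 0.70  # assumed threshold: discard scenes that are >70% cloud

def cloud_fraction(scene: np.ndarray) -> float:
    """Placeholder 'inference': treat very bright pixels as cloud.
    A real system would run a trained CNN on the vision processing unit instead."""
    return float((scene > 0.8).mean())

def select_for_downlink(scenes):
    """Yield only the scenes that pass the onboard cloud filter."""
    for scene in scenes:
        if cloud_fraction(scene) <= CLOUD_FRACTION_LIMIT:
            yield scene

# Example: 10 synthetic scenes; only the less cloudy ones survive.
scenes = [np.random.rand(256, 256) for _ in range(10)]
kept = list(select_for_downlink(scenes))
print(f"kept {len(kept)} of {len(scenes)} scenes for downlink")
```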

 

“Space is the ultimate edge,” says Aubrey Dunne, chief technology officer of Ubotica. The Irish startup built and tested PhiSat-1’s AI technology, working in close partnership with cosine, maker of the camera, in addition to the University of Pisa and Sinergise to develop the complete solution. “The Myriad was absolutely designed from the ground up to have an impressive compute capability but in a very low power envelope, and that really suits space applications.”

 

ESA announced the joint team was “happy to reveal the first-ever hardware-accelerated AI inference of Earth observation images on an in-orbit satellite.” By only sending useful pixels, the satellite will now “improve bandwidth utilisation and significantly reduce aggregated downlink costs” — not to mention saving scientists’ time on the ground.

 

Looking forward, the uses for low-cost, AI-enhanced tiny satellites are innumerable, particularly when you add the ability to run multiple applications. “Rather than having dedicated hardware in a satellite that does one thing, it’s possible to switch networks in and out,” says Jonathan Byrne, head of the Intel Movidius technology office. Dunne calls this “satellite-as-a-service.”

 

Military satellites will also require AI to autonomously protect themselves from anti-satellite weapons and other threats. As new and emerging threats demand faster-than-human-in-the-loop response times, next-generation defense systems require more autonomy, data processing, and decision-making at the edge. These new systems are looking to artificial intelligence and machine learning (AI/ML) to provide higher levels of autonomous command and control.

 

For space systems, onboard processing of advanced AI/ML algorithms, especially deep learning algorithms, requires an increase of multiple orders of magnitude in compute capability compared to what is available from legacy, radiation-tolerant, space-grade processors on current space vehicles. The next generation of space processors for onboard AI/ML will likely include a diverse landscape of heterogeneous systems combining CPUs, GPUs, FPGAs, and purpose-built ASICs.

 

NASA AI Technology Could Speed Up Fault Diagnosis in Spacecraft, reported in November 2021

Research in Artificial Intelligence for Spacecraft Resilience (RAISR) is software developed by Pathways intern Evana Gizzi, who works at NASA’s Goddard Space Flight Center in Greenbelt, Maryland. With RAISR, artificial intelligence could diagnose faults in real time in spacecraft and spaceflight systems in general.

“The spacecraft reporting a fault is like a car with a check engine light on,” Gizzi said. “You know there is an issue, but can’t necessarily explain the cause. That’s where the RAISR algorithm comes in, diagnosing the cause as a loose gas cap.”

 

Right now, the ability to make inferences about what is happening that go beyond traditional ‘if-then-else’ fault trees is something only humans can do, Gizzi said. Current fault tree diagnosis depends on the physics being simple and already known to engineers and scientists. For instance, if an instrument’s temperature drops too low, the spacecraft can detect this situation and turn on heaters. If the current in a line spikes, the spacecraft may work to isolate the offending circuit. In both cases, the spacecraft simply knows that if ‘A’ happens, respond by doing ‘B.’ What the spacecraft cannot do is figure out what caused these events, especially in unexpected fault cases: whether the spacecraft entered Earth’s shadow or a micrometeoroid damaged a circuit.
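
Below is a minimal sketch of that “if ‘A’ happens, do ‘B’” fault-tree logic, using the two examples from the text. The telemetry field names and thresholds are illustrative assumptions.

```python
# Minimal sketch of traditional 'if-then-else' fault-tree responses, using the
# low-temperature and current-spike examples from the article. Field names and
# thresholds are illustrative assumptions.

def fault_tree_response(telemetry: dict) -> list[str]:
    actions = []
    if telemetry["instrument_temp_c"] < -20.0:   # A: temperature drops too low ...
        actions.append("enable_heaters")          # ... B: turn on heaters
    if telemetry["line_current_a"] > 2.5:         # A: current in a line spikes ...
        actions.append("isolate_circuit")         # ... B: isolate the offending circuit
    return actions

print(fault_tree_response({"instrument_temp_c": -35.0, "line_current_a": 1.1}))
# -> ['enable_heaters']  (the tree reacts, but cannot say *why* the temperature dropped)
```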

 

These types of conclusions require the ability to follow a logical chain of non-trivial inferences – something like human reasoning, Gizzi said. The artificial intelligence (AI) might even be able to connect the spacecraft’s decreased temperature with a malfunction in its internal heat regulation system: an example of a more catastrophic fault.

 

Referring such faults to a human on the ground does not just take time, but costs valuable resources in terms of communications networks and bandwidth for smaller missions in Earth orbit, or even for exploring distant planets, where bandwidth to controllers on Earth is limited by distance.

 

In other circumstances, like orbiting behind another planet or the Moon, contact is simply not available. Computers also excel over human controllers when a proper inference needs to be done extremely fast using several disparate types of data. In its current stages, RAISR would not actively control the spacecraft in any way, but facilitates diagnosis by finding associations that a human may miss.

 

Michael Johnson, the Engineering and Technology Directorate chief technologist at Goddard, said current safe modes waste valuable time because science data collection ceases, whereas a technology that could diagnose and address a fault might lead to a quicker return to normal flight operations.

 

RAISR uses a combination of machine learning and classical AI techniques. While machine learning-based techniques can be particularly useful in diagnosing faults, their performance depends on having a large amount of diverse data, Gizzi said, and they therefore usually address faults that have happened in the past. With anomalies, which are faults that have never been experienced, there simply may not be enough data to support sound reasoning with machine learning-based techniques. That is where classical AI steps in, Gizzi said, facilitating reasoning in more complicated situations that lack previous data to inform decisions.
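
The sketch below illustrates that division of labour in toy form; it is not RAISR’s actual algorithm. A placeholder machine-learning classifier handles faults it is confident about, and a hand-written rule-based fallback reasons over symptoms when the fault looks like a never-before-seen anomaly. All names and thresholds are illustrative assumptions.

```python
# Toy hybrid diagnosis: ML first, classical rule-based reasoning as the fallback.
# This is an illustrative sketch, not RAISR's implementation.

def ml_diagnose(features):
    """Placeholder for a trained model: returns (label, confidence)."""
    return "battery_cell_degradation", 0.42   # pretend the model is unsure

def rule_based_diagnose(symptoms: dict) -> str:
    """Hand-written inference chain used when ML lacks training data."""
    if symptoms["temp_drop"] and symptoms["in_eclipse"]:
        return "thermal dip consistent with entering Earth's shadow"
    if symptoms["temp_drop"] and not symptoms["heater_current_ok"]:
        return "possible internal heat-regulation malfunction"
    return "unknown anomaly - refer to ground"

def diagnose(features, symptoms, confidence_floor=0.8):
    label, confidence = ml_diagnose(features)
    if confidence >= confidence_floor:
        return f"ML diagnosis: {label}"
    return f"Classical-AI diagnosis: {rule_based_diagnose(symptoms)}"

print(diagnose(features=None,
               symptoms={"temp_drop": True, "in_eclipse": False, "heater_current_ok": False}))
```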

 

Titan’s success demonstrates applied AI capabilities critical for conducting space experimentation, April 2022

Titan Space Technologies has successfully deployed and run a suite of machine learning models on the HPE Spaceborne Computer-2, an edge computing and AI system aboard the International Space Station (ISS), in support of Axiom Space’s future missions and vision of smart spacecraft architectures. Working closely with Axiom and HPE, Titan was tasked with applying its space experimentation platform to a use case based on the new demands of a modern space station. Titan’s success demonstrates the applied artificial intelligence (AI) capabilities critical for conducting space experimentation on orbital destinations and spacecraft now and in the future.

 

“As the industry continues to make progress in the rapid path to commercialization in space, optimizing key applications and capabilities on the International Space Station will be essential to support future, mission-critical spacecraft demands,” said Dr. Mark Fernandez, principal investigator of Spaceborne Computer-2, HPE. “We look forward to continue collaborating with Titan Space Technologies and welcome its expertise and technical craftsmanship that is necessary to build required space infrastructure for successful commercial development in space.”

 

“Working closely with HPE to deploy and test large scale machine learning on HPE Spaceborne Computer-2 was the signal I’d hoped for, heralding the potential for applied AI at scale in space,” said Russell Foltz-Smith, Titan Co-founder and Chief Compute Officer. “The AI ramp-up needed to support the record investment and human activity in LEO is something that HPE and Axiom are uniquely positioned to address.”

 

NASA Engineers Work To Give Satellite Swarms a Hive Mind

Future satellites are likely to operate in swarms, communicating through intersatellite links and working together to capture unique datasets and extend communications networks. Eventually, constellations will employ artificial intelligence to solve problems by, for example, healing or repositioning satellites based on onboard analysis of their health and performance, which will require extensive edge processing, said Chuck Beames, chairman of the SmallSat Alliance, an industry association.

 

Swarms of small satellites could communicate amongst themselves to collect data on important weather patterns at different times of the day or year, and from multiple angles. Such swarms, using machine learning algorithms, could revolutionize scientists’ understanding of weather and climate changes. Engineer Sabrina Thompson is working on software to enable small spacecraft, or SmallSats, to communicate with each other, identify high-value observation targets, and coordinate attitude and timing to get different views of the same target.

 

“We already know that Saharan dust blowing over to the Amazon rainforests affects cloud formation over the Atlantic Ocean during certain times of the year,” said Thompson, who works at NASA’s Goddard Space Flight Center in Greenbelt, Maryland. “How do you capture that cloud formation? How do you tell a swarm of satellites what region and time of day is the best to observe that phenomenon?”

 

Under Thompson’s plan, scientists would establish a set of requirements for observations and define high-value targets. Then the software would take over, enabling a spacecraft swarm to figure out how to move relative to one another to best observe these targets. Strategies might also change based on time of day, season, or the region being observed. The spacecraft also would use onboard machine learning to improve viewing strategies over time.
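
As a toy illustration of that scheduling step, the sketch below greedily assigns satellites to scientist-defined targets in descending order of science value. The satellite names, target values and visibility sets are invented for the example; this is not NASA’s actual swarm-planning software.

```python
# Toy swarm scheduling: scientists supply high-value targets, the software decides
# which satellite observes which target. Greedy assignment, illustrative data only.

def assign_targets(visibility: dict[str, set[str]], value: dict[str, float]) -> dict[str, str]:
    """visibility: satellite -> targets it can currently view; value: target -> science value."""
    assignment = {}
    # Visit targets in descending science value; give each to the first free satellite that sees it.
    for target in sorted(value, key=value.get, reverse=True):
        for sat, viewable in visibility.items():
            if sat not in assignment and target in viewable:
                assignment[sat] = target
                break
    return assignment

visibility = {"sat_A": {"saharan_dust", "atlantic_clouds"},
              "sat_B": {"atlantic_clouds"},
              "sat_C": {"amazon_basin", "saharan_dust"}}
value = {"atlantic_clouds": 0.9, "saharan_dust": 0.7, "amazon_basin": 0.4}
print(assign_targets(visibility, value))
# -> {'sat_A': 'atlantic_clouds', 'sat_C': 'saharan_dust'}
```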

“There are several types of swarm configuration being considered,” Thompson said. “One might be a swarm where satellites will be in different orbits, which will allow them to view a cloud or other phenomenon at different angles. Another swarm could view the same phenomena with similar view, but at different times of the day. A third type of swarm might combine both, with some satellites in the same orbit, following one another with some time offset, and other satellites which may be in orbits with different altitudes and/or inclinations.”

 

While a swarm would stay within the same orbit, individual spacecraft could even use something called differential drag control — manipulating the forces caused by Earth’s atmosphere dragging against the orbiting craft — to control the time separation between each spacecraft relative to others in the swarm, she said. “The length of time it takes to perform a differential drag maneuver depends on the spacecraft mass and area, as well as the orbital altitude. For instance, it can take as long as one year or as short as a couple of days, even hours.”
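
As a rough back-of-the-envelope illustration, the sketch below estimates the differential drag deceleration between a high-drag and a low-drag attitude and the time needed to build up roughly a kilometre of along-track separation. The density, areas, mass and the simple kinematic drift estimate are crude assumptions that ignore full orbital dynamics.

```python
# Rough differential-drag sketch: two identical satellites fly different attitudes,
# so their cross-sectional areas (and hence drag decelerations) differ. All numbers
# are illustrative assumptions; the separation estimate is a crude kinematic one.

import math

RHO = 5e-13        # kg/m^3, assumed atmospheric density near ~500 km altitude
V = 7600.0         # m/s, approximate orbital speed in LEO
CD = 2.2           # assumed drag coefficient
MASS = 10.0        # kg, a CubeSat-class spacecraft (assumed)

def drag_accel(area_m2: float) -> float:
    """Drag deceleration magnitude for a given cross-sectional area."""
    return 0.5 * RHO * CD * area_m2 * V**2 / MASS

delta_a = drag_accel(0.10) - drag_accel(0.01)            # high-drag vs low-drag attitude
days_to_1km = math.sqrt(2 * 1000.0 / delta_a) / 86400.0  # time to drift ~1 km apart (crude)
print(f"differential deceleration ~{delta_a:.2e} m/s^2, ~{days_to_1km:.1f} days to ~1 km separation")
```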

 

“With multiple spacecraft in one formation to view the same target,” Thompson said, “you can see a cloud, for instance, not just from the top, but from the sides as well.” In a different formation, you can see that cloud at different stages of its life-cycle from multiple SmallSats passing at different times.

 

Working with University of Maryland, Baltimore County (UMBC) professor Jose Vanderlei Martins, Thompson helped develop the Hyper-Angular Rainbow Polarimeter (HARP) CubeSat, which launched from the International Space Station (ISS) just over a year ago. An updated version of its instrumentation, called HARP2, will fly on the Plankton, Aerosol, Cloud, ocean Ecosystem (PACE) mission planned for launch in 2023.

 

A swarm of SmallSats like HARP, sharing information and coordinating coverage, could advance weather forecasting, disaster reporting, and climate modeling in the long term, Vanderlei Martins said. To get there, scientists need the combination of wide and narrow fields of view and high-resolution imagery to better understand the dynamics of weather system development.

 

 

References and Resources also include:

https://www.edge-ai-vision.com/2020/10/intel-powers-first-satellite-with-ai-on-board/

https://spacenews.com/living-on-the-edge-satellites-adopt-powerful-computers/

https://scitechdaily.com/nasa-engineers-work-to-give-satellite-swarms-a-hive-mind/

https://www.nasa.gov/feature/goddard/2021/-ai-could-speed-fault-diagnosis-in-spacecraft

 
