“Oceans cover more than 70 percent of the earth’s surface, but we know very little about them,” said Ersin Uzun, vice president and general manager of the Internet of Things team at Xerox.
Floating sensors, known as floats, can gather far more detailed information and can remain at sea for months at a time. The Argo program, for example, operates a global network of almost 4,000 battery-powered, robotic science floats that measure ocean temperature, salinity, and current for climate and oceanographic research. The floats mostly drift below the ocean surface for 10 days at a time. After rising and transmitting their data to satellites, they return to depth to drift for another 10 days. The floats descend as deep as 2,000 meters, according to the Argo website.
DARPA launched the Ocean of Things (OoT) program, which can be considered an application of the Internet of Military Things (IoMT) to the maritime and underwater domain for reconnaissance, surveillance, and other combat-related objectives.
Each smart float would contain a suite of commercially available sensors to collect environmental data—such as ocean temperature, sea state, and location—as well as activity data about commercial vessels, aircraft, and even marine mammals moving through the area. The data will be processed in the cloud to detect, identify, and track ships. However, the accuracy of these algorithms will depend on knowing the present and future locations of the floats in the ocean.
As part of the Ocean of Things program – which uses low-cost distributed drifters for maritime situational awareness – DARPA is hosting a challenge in September 2021 called Forecasting Floats in Turbulence, or FFT. The challenge is designed to spur development of algorithms to better predict where free-drifting floats will travel over time.
Challenges of Modelling Ocean Currents
Because more than 70 percent of the world’s surface is covered by oceans, the behavior of their currents plays a huge role in the global climate. For oceanographers and climatologists, predicting climate change depends on understanding the oceans’ contribution.
It may come as a surprise that even with precise weather forecasting, supercomputers, satellites, and exquisite maritime sensing instruments we know very little about how currents move on the surface of the ocean. Models exist for subsurface ocean currents and weather activity above the surface, but the air-ocean interface is not a fully understood realm.
It’s no small job. Ocean currents are fiendishly complex. In the real world, ocean eddies – microcurrents that cumulatively add up to major trends – can be as small as 1 km and as wide as 25 km. But because of the sheer complexity of ocean systems, it takes enormous computing power to build accurate models. Computing limitations mean that FESOM (the Finite Element Sea ice-Ocean Model) and other ocean models resolve currents only down to a resolution of roughly 100 km, and often coarser.
The most reliable spatially continuous estimates of global surface currents in the ocean come from geostrophic balance applied to the sea surface height (SSH) field observed by satellite altimeters. For the most part, the dynamics of slow, large-scale currents (up to the mesoscale) are well-approximated by geostrophic balance, leading to a direct relationship between gradients of SSH and near-surface currents.
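The geostrophic relationship described above can be written directly: the surface velocity is proportional to the cross-gradient of SSH, scaled by gravity and the Coriolis parameter. The sketch below applies it to a synthetic SSH field; the grid spacing, latitude, and SSH pattern are illustrative assumptions, not real altimeter data.

```python
import numpy as np

# Minimal sketch: geostrophic surface currents from a gridded sea
# surface height (SSH) field. The grid and SSH below are synthetic,
# for illustration only.
g = 9.81                                   # gravitational acceleration, m/s^2
omega = 7.2921e-5                          # Earth's rotation rate, rad/s
lat = 35.0                                 # assumed latitude of the grid, degrees
f = 2 * omega * np.sin(np.radians(lat))    # Coriolis parameter, 1/s

dx = dy = 25_000.0                         # grid spacing in metres (~0.25 degree)
jj, ii = np.meshgrid(np.arange(40), np.arange(40), indexing="ij")
ssh = 0.1 * np.sin(2 * np.pi * ii / 40) * np.cos(2 * np.pi * jj / 40)  # metres

# Geostrophic balance: u = -(g/f) dSSH/dy, v = (g/f) dSSH/dx
dssh_dy, dssh_dx = np.gradient(ssh, dy, dx)
u = -(g / f) * dssh_dy
v = (g / f) * dssh_dx
print("velocity grids:", u.shape, v.shape)
```

SSH gradients of a few centimetres per 25 km grid cell yield velocities on the order of tens of centimetres per second, consistent with typical mesoscale currents.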
However, current meter observations for the past few decades and some of the newer generation ultra-high-resolution numerical model simulations indicate the presence of an energized submesoscale as well as high-frequency waves/tides at smaller spatial and temporal scales (Rocha et al., 2016). In addition, the next generation of satellite altimeters like the upcoming Surface Water and Ocean Topography (SWOT) mission (Morrow et al., 2018) is going to capture the ocean surface at a much higher spatial resolution, but with a low frequency repeat cycle (21 days).
This presents unique challenges for the estimation of surface currents from SSH using traditional balances like geostrophy or Ekman. The high-wavenumber SSH variability is likely to be strongly aliased in the temporally sub-sampled data and may represent an entirely different, ageostrophic regime, where geostrophy might not be the best route to infer velocities.
“There’s currently no way to accurately predict where the proverbial message in a bottle thrown overboard in the open sea will be in a week’s time, let alone where it will eventually wash ashore,” said John Waterston, program manager for Ocean of Things in DARPA’s Strategic Technology Office. “Meso-scale models of ocean currents can give us a general direction the bottle will most likely drift, but we don’t have a good understanding of the sub-mesoscale features that the bottle could get caught in, and drift in a completely new direction. The same thing holds true for the thousands of small, low-cost floats that form a distributed sensor network in Ocean of Things.”
In their study, US researchers Anirban Sinha and Ryan Abernathey explored statistical models based on machine learning (ML) algorithms for inferring surface currents from satellite-observable quantities such as SSH, wind, and temperature. These algorithms offer a potential alternative to traditional physics-based models. The authors note that issues of spatio-temporal sampling and interpolation in satellite altimetry, and the separation of balanced and unbalanced flows, while important problems, were beyond the scope of their study.
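The core idea of such a statistical model can be illustrated with a toy regression: learn a mapping from satellite-observable predictors (SSH gradients, wind, temperature) to surface velocity. The sketch below uses a simple least-squares fit on fabricated data; real studies use nonlinear regressors (e.g. neural networks) trained on gridded model or observational fields, and the weights here are purely illustrative.

```python
import numpy as np

# Toy sketch: fit a linear statistical model mapping satellite-observable
# predictors to surface velocity (u, v). All data below is synthetic.
rng = np.random.default_rng(0)
n = 1000
# predictor columns: [dSSH/dx, dSSH/dy, wind_u, wind_v, SST anomaly]
X = rng.normal(size=(n, 5))
# hypothetical "true" weights, chosen to mimic a geostrophy-like response
true_w = np.array([[0.0, -1.2, 0.03, 0.00,  0.01],   # weights for u
                   [1.2,  0.0, 0.00, 0.03, -0.01]]).T
Y = X @ true_w + 0.01 * rng.normal(size=(n, 2))      # targets: (u, v) + noise

# Ordinary least squares; a real ML study would use a nonlinear model.
w_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)
pred = X @ w_hat
rmse = np.sqrt(np.mean((pred - Y) ** 2))
print(f"fit RMSE: {rmse:.3f}")
```

The fit recovers the synthetic weights almost exactly because the toy relationship is linear; the point of ML approaches is to capture the nonlinear, ageostrophic part that such a linear model misses.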
Sonal Rami, a researcher at MarDATA, is also working to predict these currents more accurately. Using machine learning methods, she aims to make complex ocean models even more precise, enabling more reliable predictions of climate change. She focuses on a technique called interpolation, which is like drawing a picture by connecting dots: rather than draw all the lines each time the model is computed, Rami is teaching the computer to predict the movement of ocean currents faster and more efficiently from a few data points. “Ideally,” Rami says, “we don’t need to store the middle data.”
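The "connecting dots" idea can be seen in its simplest form: keep only sparse samples of a model output and reconstruct the values in between. The sketch below uses plain linear interpolation on a synthetic current time series; Rami's work uses ML-based interpolation on far more complex fields, so this is only an illustration of the concept.

```python
import numpy as np

# Illustrative sketch: reconstruct a current time series from a few
# stored samples instead of keeping every model time step. The sine
# "current" is a synthetic stand-in for real model output.
t_full = np.linspace(0, 10, 1001)      # fine model time axis
current = np.sin(t_full)               # hypothetical current speed

t_sparse = t_full[::100]               # store only every 100th point
samples = current[::100]

reconstructed = np.interp(t_full, t_sparse, samples)
max_err = np.max(np.abs(reconstructed - current))
print(f"max linear-interpolation error: {max_err:.3f}")
```

Even naive linear interpolation keeps the error modest here; learned interpolators aim to do much better on rough, turbulent fields where straight lines between dots fail.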
Being able to accurately model the ocean’s surface currents could yield a variety of benefits: more accurately predicting the movement of an oil spill, supporting search and rescue operations for a man overboard or a vessel adrift, and even planning more efficient global shipping routes that circumnavigate areas with adverse currents.
Here’s how the competition will work: as a training data set, DARPA will provide 20 days’ worth of historical drift data from a field of commercially available Spotters produced by Sofar Ocean, a performer on Ocean of Things. With roughly 90 Spotters circulating in the Atlantic and 20 days of data, participants will need to train their algorithm or technique to predict where these Spotters will be in 10 days.
DARPA and Sofar Ocean will provide the Spotters and data, but the data assimilation, statistical analysis, algorithms, or methods are up to the participants. Once competitors submit their predictions for the locations of the 90 floats, DARPA and Sofar will provide float location updates every two days via a web-based interface so competitors can see how closely the actual float locations track their 10-day predictions. DARPA will award a total of $50,000 in the FFT challenge: $25,000 for first place, $15,000 for second place, and $10,000 for third place.
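A naive baseline for the task just described would extrapolate each float's track by its recent mean velocity. The sketch below does this for one fabricated 20-day track; competitive entries would instead assimilate currents, winds, and sub-mesoscale turbulence models. All positions and drift rates here are invented for illustration.

```python
import numpy as np

# Baseline sketch: constant-velocity extrapolation of a drifter track.
# Synthetic 20-day track for one float, drifting northeast with a
# small oscillation (positions in degrees lon/lat, invented values).
days = np.arange(20)
track = np.column_stack([
    -40.0 + 0.05 * days + 0.01 * np.sin(days),   # longitude
     30.0 + 0.03 * days,                          # latitude
])

# mean velocity over the last 5 days, in degrees/day
v = (track[-1] - track[-6]) / 5.0

# constant-velocity prediction 10 days ahead
predicted = track[-1] + 10.0 * v
print(f"predicted position in 10 days: "
      f"lon={predicted[0]:.2f}, lat={predicted[1]:.2f}")
```

The FFT scoring setup, with location updates every two days, would let a competitor compare such a baseline against the actual drift as it unfolds and see where the constant-velocity assumption breaks down.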
“The dynamics of the ocean surface is critical for global weather and climate, but inherently complex and difficult to predict. We created a planetary-scale distributed network of Spotters to help change this and open pathways to improve understanding,” Sofar CEO Tim Janssen said. “It’s incredibly exciting for us to partner with the team at DARPA to challenge teams to use this data to develop new approaches to accelerate ocean discovery and research. May the best idea win!”