The human eye is sensitive only to wavelengths between 400 and 700 nm, a range known as the visible spectrum. Within it, humans can perceive a variety of colors ranging from violet to red. Wavelengths, however, can also be shorter (ultraviolet) or longer (infrared) than those our eyes can detect.
Multispectral imaging captures light from several narrow wavelength ranges across the electromagnetic spectrum, producing images that correspond to at least a few spectral channels – sometimes more than ten. The spectral regions used are often at least partially outside the visible range, covering parts of the infrared and ultraviolet. For example, a multispectral imager may provide wavelength channels for near-UV, red, green, blue, near-infrared, mid-infrared and far-infrared light – sometimes even thermal radiation (→ thermal imaging).
A traditional digital camera captures the light that falls onto its sensor and uses wideband filters to divide it into three channels – red, green and blue (RGB) – much as your eye perceives color. A multispectral camera, on the other hand, captures information that is available neither to the human observer nor to a typical RGB camera. Multispectral images are captured either with special cameras that separate wavelengths using filters, or with instruments sensitive to particular wavelengths, including light at frequencies invisible to the human eye (infrared and ultraviolet, for example).
Every surface reflects back some of the light it receives. Objects with different surface features reflect or absorb the sun’s radiation in different ways. The ratio of reflected light to incident light is known as reflectance and is expressed as a percentage. The reflectance properties of an object depend on the material and its physical and chemical state (e.g. moisture), its surface roughness, and the geometry of illumination (e.g. the incidence angle of the sunlight). The most important surface features are color, structure and surface texture. The perceived color of an object corresponds to the wavelength of the visible spectrum with the greatest reflectance. These differences make it possible to identify different earth-surface features or materials by analyzing their spectral reflectance patterns, or spectral signatures.
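The idea behind spectral signatures can be sketched in a few lines of code. The following is an illustrative example only – the band values and reference signatures are invented for demonstration, not taken from any real sensor:

```python
# Illustrative sketch: computing reflectance and matching a measured
# spectral signature against reference signatures (all values invented).

def reflectance_percent(reflected, incident):
    """Reflectance = reflected light / incident light, as a percentage."""
    return 100.0 * reflected / incident

# Hypothetical reference signatures: reflectance (%) in four bands
# (blue, green, red, near-infrared).
REFERENCE_SIGNATURES = {
    "healthy vegetation": [5, 10, 5, 50],   # strong NIR reflectance
    "bare soil":          [15, 20, 25, 30],
    "water":              [8, 5, 3, 1],     # absorbs NIR almost entirely
}

def classify(signature):
    """Return the reference material whose signature is closest
    (smallest sum of squared differences) to the measured one."""
    def distance(ref):
        return sum((a - b) ** 2 for a, b in zip(signature, ref))
    return min(REFERENCE_SIGNATURES, key=lambda m: distance(REFERENCE_SIGNATURES[m]))

# A measurement with high near-infrared reflectance matches vegetation.
measured = [reflectance_percent(r, 100.0) for r in (6, 11, 4, 48)]
print(classify(measured))  # → healthy vegetation
```

Real classifiers use many more bands and far more sophisticated matching, but the principle – comparing per-band reflectance against known signatures – is the same.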
Some multispectral imaging devices (also called multispectral cameras) fly on space satellites and airplanes, but there are also hand-held devices, as well as imaging devices installed in industrial settings, for example.
Multispectral instruments on satellites are used for various kinds of Earth monitoring from space, for example for geological surveys, for environmental monitoring and for military surveillance. Various wavelength channels can be used for monitoring vegetation (e.g. agricultural crops, biomass mapping), while others are useful for detecting minerals, non-authorized land use, buildings, etc.
Smaller regions on Earth can be monitored with instruments on airplanes or drones, which allow for higher spatial resolutions. The purposes can be similar to those of satellite instruments, for example monitoring the development of agricultural crops or detecting forest fires.
Multispectral Imaging Applications
Earth observation satellites carry a number of instruments for measuring infrared and ultraviolet radiation. Multispectral imaging combines between two and five spectral imaging bands into a single optical system.
Multispectral imaging instruments installed on NASA’s Terra and DigitalGlobe’s WorldView-2 have given researchers a new way of observing the Earth, especially for agricultural research and mapping of man-made and natural disasters.
Landsat satellites use multispectral sensors to help analysts study land use and land cover change, vegetation and agricultural production trends and cycles, water and environmental quality, soils, geology, and other earth resource and science problems. In fact, Landsat has been one of the most important sources of mid-resolution multispectral data globally.
Multispectral imaging is being used in agriculture to manage crops, soil, fertilizing and irrigation more effectively. Multispectral cameras mounted under agricultural drones detect green, red and near-infrared wavebands to capture visible and invisible images of crops and vegetation. Multispectral imaging helps farmers minimize the use of sprays, fertilizers and irrigation, while increasing the yield from their fields.
Farmers integrate their multispectral images with specialized agriculture software that translates the images into meaningful data. This data includes information about land telemetry, soil condition and crop progress, and helps the farmer to monitor, plan and manage the farm more effectively, which saves time and money, and reduces the use of pesticides.
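A common quantity that such software derives from multispectral bands is NDVI, the normalized difference vegetation index, computed per pixel from the red and near-infrared channels. A minimal sketch using NumPy (the band arrays here are toy values, not real imagery):

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).
    Healthy vegetation reflects strongly in near-infrared and absorbs red
    light, so values near +1 suggest dense, healthy canopy, while values
    near 0 or below suggest bare soil, water or stressed plants."""
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / (nir + red + 1e-9)  # epsilon avoids division by zero

# Toy 2x2 "images" standing in for a drone camera's red and NIR bands.
red_band = np.array([[50, 200], [60, 180]])
nir_band = np.array([[200, 210], [220, 190]])
print(np.round(ndvi(nir_band, red_band), 2))
```

Pixels where near-infrared reflectance far exceeds red (likely vegetation) score high; pixels where the two bands are similar score near zero.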
Hyperspectral imaging could enable precision agriculture by allowing fertilizer, pesticides, herbicides and water to be applied only where needed, saving water and money and reducing pollution. Imagine a hyperspectral camera mounted on a drone mapping a field’s condition and transmitting that information to a tractor designed to deliver fertilizer or pesticides at variable rates across the field.
It is estimated that the process currently used to produce fertilizer accounts for up to two percent of the global energy consumption and up to three percent of global carbon dioxide emissions. At the same time, researchers estimate that 50 to 60 percent of fertilizer produced is wasted. Accounting for fertilizer alone, precision agriculture holds an enormous potential for energy savings and greenhouse gas reduction, not to mention the estimated $8.5 billion in direct cost savings each year, according to the United States Department of Agriculture.
The earliest and most successful uses of multispectral imaging were in diagnostic medicine. Multispectral imaging lets healthcare providers pinpoint the presence of diseases that are hard to identify with other means. Eventually, multispectral imaging was combined with nanotechnology to diagnose health issues at the level of individual cells.
Light interacts with biological tissue in different ways, depending on the wavelength of the light. This makes multispectral imaging a powerful tool for biomedical and chemical applications. For example, images captured at near-infrared wavelengths help doctors take depth measurements in tissue and measure blood chromophores such as oxy-hemoglobin, deoxy-hemoglobin and bilirubin. Spectral imaging has the added benefit of being non-invasive, which makes it useful in assessing burns and skin inflammation.
Multispectral imaging is also being used to analyse forensic evidence, whether under laboratory conditions or directly at the crime scene. It is valuable to forensic teams because it is a non-contact, non-destructive way to analyse evidence, and it requires no sample preparation, which preserves the integrity of the evidence. Multispectral imaging is used for forensic analysis of fingerprints, bloodstains, inks, powder residues and documents.
Multispectral imaging is used to collect data from dangerous and inaccessible areas. Remote sensing applications include monitoring deforestation in areas such as the Amazon Basin, glacial features in Arctic and Antarctic regions, and depth sounding of coastal and ocean waters.
Detecting Algal Blooms
Multispectral imaging is being used from space to map and monitor algal blooms in coastal waters. Algal blooms are a rapid growth of microscopic algae or cyanobacteria in water, often resulting in a colored scum on the surface. The science of detecting algal blooms with multispectral imaging is in its infancy because of the presence of suspended sediments and dissolved organic matter, which interfere with the images.
Surveying & Mapping
While traditional single-wavelength systems are good at revealing where things are on the ground, they are less capable of determining what those things are. Multispectral imaging allows surveyors to compare a target’s unique reflectance response for each wavelength emitted, providing more detailed analysis and enhanced target discrimination.
Military Target Tracking
Many of the threats facing today’s military are unconventional. Enemies typically have an advantage on their local terrain, an advantage that must be monitored from a distance. Multispectral imaging has a wide variety of military applications, including locating improvised explosives, discovering enemy movements at night time, and measuring the depth of hidden bunkers.
Multispectral imaging is used to detect and track military targets because it measures mid-wave infrared and long-wave infrared. Multispectral imaging measures radiation that’s inherent to an object, regardless of the presence of any external light source. This type of detection is also known as thermal imaging.
Land Mine Detection
Multispectral imaging is also used to detect land mines and underground missiles by analyzing the emissivity of ground surfaces. Drones flown over former battlefields use a camera that acquires registered images in six spectral bands. These images are then analyzed using software that identifies metal and plastic land mines.
Soil on the surface and soil beneath the surface possess different physical and chemical properties that can be detected with multispectral analysis. Disturbed soil exhibits increased emissivity at specific wavelengths, and analyzing images of this soil helps military commanders to identify likely locations of land mines. Detection of recently buried land mines and improvised explosive devices using multispectral imaging is a growing field.
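The core of such analysis is anomaly detection: flagging pixels whose emissivity stands out from the local background. The sketch below illustrates the simplest possible version, with invented emissivity values and a plain statistical threshold, not any operational detection algorithm:

```python
import numpy as np

def flag_anomalies(emissivity, n_sigma=2.0):
    """Mark pixels more than n_sigma standard deviations above the
    scene's mean emissivity, the kind of anomaly that recently
    disturbed soil over a buried object can produce."""
    mean, std = emissivity.mean(), emissivity.std()
    return emissivity > mean + n_sigma * std

# Toy 5x5 emissivity scene: uniform background with one disturbed patch.
scene = np.full((5, 5), 0.92)   # undisturbed background (invented value)
scene[2, 3] = 0.97              # patch of recently disturbed soil
print(np.argwhere(flag_anomalies(scene)))  # → [[2 3]]
```

Operational systems combine several spectral bands and far more robust statistics, but the underlying idea – disturbed ground looks spectrally different from its surroundings – is the same.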
Ballistic Missile Detection
Ballistic missile defence systems detect, track and intercept enemy ballistic missiles. The system consists of a ballistic-missile warning system, a target discrimination system, an anti-ballistic-missile guidance system, and a command-control communication system. Multispectral imaging is used in the detection stage. Intercontinental ballistic missiles emit invisible radiation during their boost phase. Multispectral imaging detects both the missile body (mid-wave infrared) and the rocket plumes (long-wave infrared).
Light-trapping nanocubes drive inexpensive multispectral camera
Researchers at Duke University have demonstrated photodetectors that could span an unprecedented range of light frequencies by using on-chip spectral filters created by tailored electromagnetic materials. The combination of multiple photodetectors with different frequency responses on a single chip could enable lightweight, inexpensive multispectral cameras for applications such as cancer surgery, food safety inspection and precision agriculture.
A typical camera only captures visible light, which is a small fraction of the available spectrum. Other cameras might specialize in infrared or ultraviolet wavelengths, for example, but few can capture light from disparate points along the spectrum. And those that can suffer from a myriad of drawbacks, such as complicated and unreliable fabrication, slow functional speeds, bulkiness that can make them difficult to transport, and costs up to hundreds of thousands of dollars.
In research appearing online in November 2019 in the journal Nature Materials, Duke researchers demonstrate a new type of broad-spectrum photodetector that can be implemented on a single chip, allowing it to capture a multispectral image in a few trillionths of a second and be produced for just tens of dollars. The technology is based on physics called plasmonics – the use of nanoscale physical phenomena to trap certain frequencies of light.
“The trapped light causes a sharp increase in temperature, which allows us to use these cool but almost forgotten about materials called pyroelectrics,” said Maiken Mikkelsen, the James N. and Elizabeth H. Barton Associate Professor of Electrical and Computer Engineering at Duke University. “But now that we’ve dusted them off and combined them with state of the art technology, we’ve been able to make these incredibly fast detectors that can also sense the frequency of the incoming light.”
According to Mikkelsen, commercial photodetectors have been made with these types of pyroelectric materials before, but have always suffered from two major drawbacks. They haven’t been able to focus on specific electromagnetic frequencies, and the thick layers of pyroelectric material needed to create enough of an electric signal have caused them to operate at very slow speeds.
“But our plasmonic detectors can be tuned to any frequency and trap so much energy that they generate quite a lot of heat,” said Jon Stewart, a graduate student in Mikkelsen’s lab and first author on the paper. “That efficiency means we only need a thin layer of material, which greatly speeds up the process.”
The previous record for detection times in any type of thermal camera with an on-chip filter, whether it uses pyroelectric materials or not, was 337 microseconds. Mikkelsen’s plasmonics-based approach sparked a signal in just 700 picoseconds, which is roughly 500,000 times faster. But because those detection times were limited by the experimental instruments used to measure them, the new photodetectors might work even faster in the future.
To accomplish this, Mikkelsen and her team fashioned silver cubes just a hundred nanometers wide and placed them on a transparent film only a few nanometers above a thin layer of gold. When light strikes the surface of a nanocube, it excites the silver’s electrons, trapping the light’s energy—but only at a specific frequency.
The size of the silver nanocubes and their distance from the base layer of gold determine that frequency, while the amount of light absorbed can be tuned by controlling the spacing between the nanoparticles. By precisely tailoring these sizes and spacings, researchers can make the system respond to any electromagnetic frequency they want.
To harness this fundamental physical phenomenon for a commercial hyperspectral camera, researchers would need to fashion a grid of tiny, individual detectors, each tuned to a different frequency of light, into a larger ‘superpixel’.
In a step toward that end, the team demonstrates four individual photodetectors tailored to wavelengths between 750 and 1900 nanometers. The plasmonic metasurfaces absorb energy from specific frequencies of incoming light and heat up. The heat induces a change in the crystal structure of a thin layer of pyroelectric material called aluminium nitride sitting directly below them. That structural change creates a voltage, which is then read out by a silicon semiconductor contact at the bottom that transmits the signal to a computer for analysis.
“It wasn’t obvious at all that we could do this,” said Mikkelsen. “It’s quite astonishing actually that not only do our photodetectors work, but we’re seeing new, unexpected physical phenomena that will allow us to speed up how fast we can do this detection by many orders of magnitude.”
Mikkelsen sees several potential uses for commercial cameras based on the technology, because the process required to manufacture these photodetectors is relatively fast, inexpensive and scalable.
Surgeons might use multispectral imaging to tell the difference between cancerous and healthy tissue during surgery. Food and water safety inspectors could use it to tell when a chicken breast is contaminated with dangerous bacteria.
With the support of a new Moore Inventor Fellowship from the Gordon and Betty Moore Foundation, Mikkelsen has set her sights on precision agriculture as a first target. While plants may only look green or brown to the naked eye, the light outside of the visible spectrum that is reflected from their leaves contains a cornucopia of valuable information.
“Obtaining a ‘spectral fingerprint’ can precisely identify a material and its composition,” said Mikkelsen. “Not only can it indicate the type of plant, but it can also determine its condition, whether it needs water, is stressed or has low nitrogen content, indicating a need for fertilizer. It is truly astonishing how much we can learn about plants by simply studying a spectral image of them.”
Several companies are already pursuing these types of projects. For example, IBM is piloting a project in India using satellite imagery to assess crops in this manner. This approach, however, is very expensive and limiting, which is why Mikkelsen envisions a cheap, handheld detector that could image crop fields from the ground or from inexpensive drones.
“Imagine the impact not only in the United States, but also in low- and middle-income countries where there are often shortages of fertilizer and water,” said Mikkelsen. “By knowing where to apply those sparse resources, we could increase crop yield significantly and help reduce starvation.”