Situational awareness and accurate target identification are critical requirements for security forces engaged in counterterrorism operations. Ground-based, airborne and space-borne radars and electro-optical (EO) sensors have proved highly useful for detecting, tracking and imaging targets ranging from vehicles to high-speed fighter aircraft, locating mortars and artillery, and providing over-the-horizon, all-weather, long-range surveillance.
However, these sensors have limited capability to detect targets concealed in foliage. Conventional forces as well as terrorists and insurgents have long exploited this weakness by employing camouflage, concealment and deception tactics, such as hiding under camouflage nets or within forests.
LIDAR (Light Detection and Ranging) is an active optical remote sensing technology. Flown aboard rotary- or fixed-wing aircraft, it illuminates the target scene with laser pulses to acquire the x, y and z coordinates of both manmade and natural objects and to generate 3D models of them.
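As an illustrative sketch of that coordinate acquisition (simplified geometry with the sensor at the origin and assumed beam angles; a real airborne system also folds in platform position and attitude), a single pulse return can be converted to a 3D point as follows:

```python
# Illustrative sketch: turning a single LIDAR pulse return into an x, y, z point.
# Simplified geometry (assumed): sensor at the origin, 'az' and 'el' are the beam's
# azimuth and elevation angles; a real airborne system also applies platform
# position and attitude from GPS/INS.
import math

C = 299_792_458.0  # speed of light, m/s

def pulse_to_point(round_trip_time_s: float, az_deg: float, el_deg: float):
    """Convert a round-trip pulse time and beam direction to Cartesian coordinates."""
    rng = C * round_trip_time_s / 2.0            # one-way range, meters
    az, el = math.radians(az_deg), math.radians(el_deg)
    x = rng * math.cos(el) * math.cos(az)
    y = rng * math.cos(el) * math.sin(az)
    z = rng * math.sin(el)
    return x, y, z

# Example: a return arriving ~6.67 microseconds after the pulse lies about 1 km away.
print(pulse_to_point(6.67e-6, 0.0, 0.0))
```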
LIDARs are rapidly gaining maturity as highly capable sensors for a number of applications, such as imaging through clouds, vegetation and camouflage; 3D mapping and terrain visualization; navigation and obstacle avoidance for robotics; and weapon guidance.
LIDAR wavelengths are much shorter than microwave wavelengths, so LIDAR is generally considered incapable of penetrating a forest canopy. However, even high-density forests contain gaps, both within individual tree crowns and between neighboring canopies. If a portion of the laser beam passes through such a gap, hits the target and returns to the detector, the LIDAR can detect a target hidden under foliage.
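To make the gap argument concrete, here is a minimal sketch assuming a simple random-gap canopy model; the 5 percent per-pulse gap probability is an illustrative value, not a measured figure.

```python
# Minimal canopy-gap sketch: if each pulse independently has a small probability
# 'gap_fraction' of passing through an opening to the target and back, the chance of
# at least one detection opportunity rises quickly with the number of pulses aimed
# at the same spot. The 5% gap probability is an assumed, illustrative value.
def prob_at_least_one_return(gap_fraction: float, n_pulses: int) -> float:
    return 1.0 - (1.0 - gap_fraction) ** n_pulses

for n in (1, 10, 50, 100):
    print(n, round(prob_at_least_one_return(0.05, n), 3))
# With only a 5% per-pulse gap probability, 50 pulses already give ~92% odds of at
# least one return from the hidden target.
```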
Now, researchers at the Naval Research Laboratory in Washington, DC, have developed a LIDAR based on gated digital holography that gives the sensor an enhanced ability to “see” through obscuring elements such as foliage and netting.
A team in MIT’s Photonic Microsystems Group has integrated LIDAR systems onto a single microchip that can be mass produced in commercial CMOS foundries, yielding a potential on-chip LIDAR system cost of about $10 each. Instead of a mechanical rotation system, these devices steer the beam with optical phased arrays, in which multiple phase-controlled antennas emit arbitrary beam patterns; this may also make the devices more robust.
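The phased-array idea can be illustrated numerically: applying a linear phase ramp across the emitters steers the far-field beam with no moving parts. The sketch below uses a simplified one-dimensional model with assumed wavelength, element pitch and element count, not any published MIT design values.

```python
# Simplified 1-D optical phased array sketch: a linear phase gradient across the
# emitters steers the main lobe, which is how a chip-scale LIDAR can scan without a
# mechanical rotation stage. Wavelength, pitch and element count are assumed values.
import numpy as np

wavelength = 1.55e-6   # m, telecom-band value (assumption)
pitch = 2.0e-6         # m, emitter spacing (assumption)
n_elements = 64
steer_deg = 10.0       # desired steering angle

k = 2 * np.pi / wavelength
n = np.arange(n_elements)
phase = -k * pitch * n * np.sin(np.radians(steer_deg))   # per-element phase setting

# Far-field pattern over observation angles: the emitters add in phase at 'steer_deg'.
theta = np.radians(np.linspace(-30, 30, 2001))
field = np.exp(1j * (k * pitch * np.outer(np.sin(theta), n) + phase)).sum(axis=1)
peak_deg = np.degrees(theta[np.argmax(np.abs(field))])
print(f"main lobe at ~{peak_deg:.1f} degrees")           # ~10 degrees
```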
Naval Research Laboratory has developed a foliage-penetrating LIDAR
Researchers at the Naval Research Laboratory in Washington, DC, have developed a LIDAR and a new methodology based on gated digital holography that give LiDAR an enhanced ability to see through obscuring elements like foliage and netting.
“This was an attempt to address one of the problems with something called foliage-penetrating LiDAR,” Paul Lebow of the Naval Research Laboratory said. “You can actually use it to detect three-dimensional images behind an obscuration such as a tree canopy… You can illuminate using LiDAR through the leaves and get enough light coming back through to be able to recreate a three-dimensional, topographic view of what’s going on beneath.”
However, LiDAR measurements have so far been limited to wherever light can penetrate. When surfaces are hidden behind obstructions, most of the outgoing light is lost, and the camera either detects nothing or receives only a weak signal, providing minimal readings.
“We have been working with a process called optical phase conjugation for quite some time and it dawned on us that we might be able to use that process to essentially project a laser beam through the openings of the leaves and be able to see through a partial obscuration,” Lebow said. “It was something that until maybe the last five years was not viable just because the technology wasn’t really there.”
“The stuff we had done about 20 years ago involved using a nonlinear optical material and was a difficult process. Now everything can be done using digital holography and computer-generated holograms, which is what we do.” The new system uses a specially designed laser that alone took a year and a half to develop, but it was a necessary component, according to Lebow and his colleague Abbie Watnik, also at the Naval Research Laboratory and another of the work’s authors.
Phase conjugation is a fascinating phenomenon with unusual characteristics and properties. It operates somewhat like holography, but as a dynamic hologram whose “holographic plate” is defined by interfering wavefronts in a nonlinear optical medium rather than etched as a static pattern on a glass plate. Optical phase conjugation is a nonlinear optical process capable of time-reversing the scattering process and healing the distortions in a wavefront.
“We send one laser beam out to the target and then it returns, and at the exact same time that return [beam] hits the detector, we interfere it locally with another laser beam,” Watnik said. The researchers needed to design the laser system to ensure the camera captures absolute coherence when the laser beams interfere with one another.
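A toy numerical sketch of that recording step may help (it is illustrative only, not NRL’s actual processing chain): the camera records the intensity of the signal plus reference, and the cross term in that intensity carries the signal field’s phase, which can then be conjugated, for example to drive a spatial light modulator.

```python
# Toy off-axis digital holography sketch (illustrative only, not the NRL pipeline):
# the camera records |signal + reference|^2; the cross term preserves the signal's
# phase, which is what phase conjugation needs to pre-distort the next outgoing pulse.
import numpy as np

n = 256
x = np.arange(n)

# A smooth, unknown wavefront distortion standing in for the scrambled return (assumed)
signal = np.exp(1j * (2.0 * np.sin(2 * np.pi * 2 * x / n)
                      + 1.0 * np.cos(2 * np.pi * 3 * x / n)))
reference = np.exp(1j * 2 * np.pi * 64 * x / n)   # tilted (off-axis) reference beam

hologram = np.abs(signal + reference) ** 2        # intensity the camera records

# Multiplying by the reference shifts the signal-bearing cross term to baseband;
# a low-pass filter then isolates it from the carrier and the twin-image term.
spectrum = np.fft.fft(hologram * reference)
keep = 24
spectrum[keep:n - keep] = 0
recovered = np.fft.ifft(spectrum)

# The conjugate of the recovered field is what one would write onto a spatial light
# modulator; its phase should cancel the original distortion.
residual = np.angle(np.conj(recovered) * signal)
print("rms residual phase (rad):", float(np.std(residual).round(3)))
```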
Using a pulsed laser with pulse widths of several nanoseconds, and gated measurements with similar time resolution, the holographic system selectively blocks the earliest-to-arrive light reflecting off obscurations. The camera then only measures light coming back from the partially hidden surface below.
“We’ve done this earlier using a CW (continuous-wave) laser as a demo, but now we’re using a pulsed laser and a very fast gated sensor that can turn on at the appropriate time to basically only let us respond to the light coming from where we want it to come from, from the target,” Lebow said. “The laser is designed so that the time difference between the local reference pulse and the signal pulse that comes back from the target is completely adjustable to accommodate distances from a few feet to several kilometers.”
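The gate timing itself is straightforward time-of-flight arithmetic; the following sketch uses illustrative numbers rather than the NRL system’s actual parameters.

```python
# Range-gating arithmetic sketch: the gate delay equals the round-trip time to the
# target, and the gate width (here set to an assumed few-nanosecond pulse) defines how
# thick a slice of the scene is accepted, which is what rejects early canopy returns.
C = 299_792_458.0  # speed of light, m/s

def gate_for_range(target_range_m: float, pulse_width_s: float = 3e-9):
    delay = 2.0 * target_range_m / C        # when the target return arrives
    depth = C * pulse_width_s / 2.0         # range slice admitted by the gate
    return delay, depth

for r in (1.0, 100.0, 2000.0):              # a few feet out to a couple of kilometers
    delay, depth = gate_for_range(r)
    print(f"range {r:7.1f} m -> gate delay {delay * 1e9:9.2f} ns, slice depth {depth:.2f} m")
```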
“We were able to verify what our computer model says using our real data – matching it to what we actually see using the spatial light modulator, so I think that was an interesting verification of our results,” Watnik said.
Watnik and Lebow, along with their research team, hope to continue the project and make the adaptations to their prototype needed to render the foliage-penetrating LiDAR system field-ready.
MIT’s LIDAR on a chip
MIT’s Photonic Microsystems Group is trying to take these large, expensive, mechanical LIDAR systems and integrate them on a microchip that can be mass produced in commercial CMOS foundries. The group’s LIDAR-on-a-chip work began with the development of 300-mm silicon photonics, making the potential production cost on the order of $10 each at volumes of millions of units per year.
MIT’s current on-chip LIDAR system can detect objects at ranges of up to 2 meters, though the team hopes to achieve a 10-meter range within a year. The minimum range is approximately 5 cm, and the team has demonstrated centimeter-scale longitudinal resolution and expects 3-cm lateral resolution at 2 meters. There is a clear development path toward LIDAR-on-a-chip technology that can reach a 100-meter range, with the possibility of going even farther.
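As a back-of-envelope check of what those figures imply, the quoted lateral resolution corresponds to a chip-scale emitting aperture and the centimeter range resolution to tens of picoseconds of round-trip timing. The telecom-band wavelength and diffraction-limited optics assumed below are not stated in the source.

```python
# Back-of-envelope sketch relating the quoted figures. The telecom-band wavelength
# and diffraction-limited optics are assumptions, not published MIT design parameters.
wavelength = 1.55e-6       # m (assumption)
lateral_res = 0.03         # m, quoted lateral resolution at 2 m
range_m = 2.0

beam_divergence = lateral_res / range_m            # rad, ~0.015
aperture = wavelength / beam_divergence            # diffraction-limited emitting aperture
print(f"implied aperture ~{aperture * 1e6:.0f} micrometers")   # ~100 um, chip-scale

range_res = 0.01           # m, "centimeter longitudinal resolution"
timing = 2 * range_res / 299_792_458.0             # equivalent round-trip timing precision
print(f"equivalent timing precision ~{timing * 1e12:.0f} ps")  # ~67 ps
```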
DARPA’s Modular Optical Aperture Building Blocks (MOABB)
DARPA has launched the Modular Optical Aperture Building Blocks (MOABB) program with the aim of developing the advanced technologies needed to build ultracompact light detection and ranging (LIDAR) systems, which use light to image objects and their motions in the same way that radar systems use radio waves.
The first phase of the program calls for researchers to develop the fundamental devices that will underlie the new LIDAR concept: speck-sized light-emitting and light-detecting cells capable of being readily integrated into larger arrays using typical semiconductor manufacturing processes. Phase 2 and Phase 3 of the project call for the integration of these cells into a 1-cm² array and a 10-cm² array comprising upwards of 100 and 10,000 unit cells, respectively.
With an integration of digital, electronic, optical and radio-frequency elements on a variety of combined semiconductor materials, the final 10-cm-aperture LIDAR surface has the potential to be the most complex electronic-photonic circuit ever constructed, according to the program’s anticipated Broad Agency Announcement.
One of the most coveted applications that could emerge from the envisioned program, which could extend for five years with up to $58 million in funding, is a foliage-penetrating imager for spotting threats hidden under dense forest cover, such as a sniper or a tank.
“You would be able to fly a MOABB-enabled helicopter or drone low over a lush forest canopy, and it could instantaneously give you the range and velocity of everything up to a football field’s distance away with the resolution of a camera,” DARPA program manager Joshua Conway said. With accompanying visualization tools, he added, “you would feel like you are on the ground with nothing blocking your vision.”
Other potential applications include collision avoidance systems for small unmanned aerial vehicles (UAVs) maneuvering in tight indoor spaces, precision motor control for robotic limbs and fingers, high-capacity light-based communications and data-transfer systems, and sophisticated gaming or training modules in which LIDARs would open up new worlds of immersive experience just as GPS and motion-sensing accelerometers have done in today’s systems.
“Every machine that interacts with the 3D world—whether it is a manufacturing robot, UAV, car, or smartphone—could have a chip- or wafer-scale LIDAR on it,” Conway said.
US Army surveys industry for state-of-the-art imaging LIDAR for tactical mapping from UAVs
Officials of the Army Contracting Command have issued a sources-sought notice (W909MY-17-R-A006) for the LIDAR Payload for Manned and Unmanned Airborne Platforms. The LIDAR system shall be capable of supporting mission planning (also referred to as tactical mapping), concealed target detection (Multi-Aspect Foliage Penetration [FOPEN]) and mapping missions.
Of particular interest are LIDAR technologies mature enough for an advanced technology demonstrator prototype. The imaging sensor must have a pointing and stabilization unit, such as a stabilized turret, that houses the sensor optics, focal planes and supporting electronics. The data storage and sensor-processing subsystem comprises a computer, data storage, sensor-processing algorithms, and imaging sensor command-and-control software.
The LIDAR shall have a variable swath from 200 meters or less to 1 km or more. To achieve the final product, the government will allow over-sampling and resampling. The LIDAR should be capable of reaching slant ranges of 25,000 feet. Range resolution (the ability to separate objects in range with a minimal number of laser pulses) shall be under one meter, and range precision (the statistical distribution of a number of measurements over a fixed range) shall be under 10 cm. The field of regard of the system shall be at least plus or minus 35 degrees in-track and 45 degrees cross-track.
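A quick flat-earth geometry sketch shows how these figures relate; the platform altitudes below are assumed for illustration and are not part of the notice.

```python
# Flat-earth geometry sketch against the notice's figures. The platform altitudes
# are assumed for illustration; the scan is taken as nadir-centered.
import math

FT_TO_M = 0.3048
max_slant_m = 25_000 * FT_TO_M                   # ~7,620 m quoted slant range
half_angle = math.radians(45)                    # cross-track field of regard

for altitude_m in (500, 1500, 3000):             # assumed platform altitudes
    swath_m = 2 * altitude_m * math.tan(half_angle)
    edge_slant_m = altitude_m / math.cos(half_angle)
    print(f"altitude {altitude_m:5d} m: full swath {swath_m:6.0f} m, "
          f"edge slant range {edge_slant_m:5.0f} m (limit {max_slant_m:.0f} m)")
```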
The primary desire for this payload is for an unmanned airborne platform. As such, weight and aerodynamic drag (represented primarily by diameters of the front cross-section) are major concerns.
The LIDAR system shall consist of multiple Line Replaceable Units (LRUs), including but not limited to an Imaging Sensor Unit (ISU) and a Storage and Processing Unit (SPU). The ISU consists of a pointing and stabilization unit (e.g., a stabilized turret) that houses the sensor optics, focal planes and supporting electronics: essentially everything required to operate the LIDAR sensor except for data storage and processing. If the ISU requires an external electronics box, that box must be included in the ISU weight calculations.
The SPU refers to the storage and processing computer as well as the processing and exploitation algorithms running on that hardware. The SPU will also host the ISU’s command, control and status software. The RFI is separated into two portions; a vendor may respond to either of the two sections or to both. The first section requests information on the ISU; responses to this section are limited to 25 pages. The second section requests information on the SPU; responses to this section are limited to 20 pages.
The LIDAR system’s SPU shall store and process the ISU’s imagery. The government desires automated feature extraction (AFE) and/or aided target recognition (AiTR) algorithms and transmission of a low-resolution orthographic projection of the human activity layer (HAL) annotated with anomaly detections. In describing its processing solution, the vendor should discuss any AFE and AiTR algorithms it currently possesses. Examples include anomaly detection, segmentation, void detection, plane detection, or any algorithm or tool that minimizes analysts’ workload and facilitates operation over limited-bandwidth communications (<= 2 Mbps).
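That bandwidth figure is the main driver for on-board processing; a rough sketch (the product sizes are hypothetical, chosen only to show the scale of the constraint) illustrates why the notice asks for low-resolution, annotated products rather than raw point clouds.

```python
# Rough link sketch for the <= 2 Mbps constraint. The product sizes below are
# hypothetical figures chosen only to show the scale of the constraint.
def transmit_time_s(product_megabytes: float, link_mbps: float = 2.0) -> float:
    return (product_megabytes * 8e6) / (link_mbps * 1e6)

for size_mb in (1, 10, 100):
    print(f"{size_mb:4d} MB product -> {transmit_time_s(size_mb):6.0f} s over a 2 Mbps link")
# 1 MB takes ~4 s and 100 MB ~400 s, which is why the notice asks for low-resolution,
# annotated products rather than raw point clouds.
```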
References and Resources also include:
http://www.design-engineering.com/foliage-penetrating-lidar-1004026949/
https://www.fbo.gov/index?s=opportunity&mode=form&tab=core&id=9c604bd41de0aa4fe7213412302aa6d2
http://www.militaryaerospace.com/articles/2017/01/lidar-tactical-mapping-uavs.html