
DARPA FENCE: Event-based imagers, or neuromorphic cameras

An event camera, also known as a neuromorphic camera, silicon retina, or dynamic vision sensor, is an imaging sensor that responds to local changes in brightness. Event cameras do not capture images using a shutter as conventional cameras do. Instead, each pixel operates independently and asynchronously, reporting changes in brightness as they occur and staying silent otherwise. In contrast to standard cameras, which acquire full images at a rate specified by an external clock (e.g., 30 fps), event cameras such as the Dynamic Vision Sensor (DVS) respond to brightness changes in the scene asynchronously and independently at every pixel.

Each pixel stores a reference brightness level and continuously compares it to the current brightness. If the difference exceeds a preset threshold, the pixel resets its reference level and generates an event: a discrete packet of information containing the pixel address and a timestamp. While all event cameras respond to local changes in brightness, there are a few variants. Temporal contrast sensors (such as the pioneering DVS or the sensitive sDVS) produce events that indicate polarity (an increase or decrease in brightness), while temporal image sensors indicate the instantaneous intensity with each event.
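
This per-pixel logic is easy to sketch in code. The fragment below approximates DVS-style event generation from a stack of conventional intensity frames; it is a minimal illustration, not vendor firmware, and the frame-stack input, the threshold value, and the (x, y, t, polarity) tuple format are assumptions. A real sensor performs the comparison continuously in analog circuitry rather than on discrete frames.

```python
import numpy as np

def generate_events(frames, timestamps, threshold=0.2):
    """Approximate DVS-style event generation from intensity frames.

    Each pixel keeps a reference log intensity and emits an event
    whenever the current log intensity deviates from the reference
    by more than `threshold`, then resets its reference.
    """
    eps = 1e-6                            # avoid log(0) on dark pixels
    ref = np.log(frames[0] + eps)         # per-pixel reference level
    events = []
    for frame, t in zip(frames[1:], timestamps[1:]):
        log_i = np.log(frame + eps)
        diff = log_i - ref
        ys, xs = np.nonzero(np.abs(diff) >= threshold)
        for x, y in zip(xs, ys):
            polarity = 1 if diff[y, x] > 0 else -1  # brightness up/down
            events.append((x, y, t, polarity))
            ref[y, x] = log_i[y, x]       # reset reference after firing
    return events
```

Note that only changed pixels appear in the output; a perfectly static scene produces no events at all, which is the source of the sparsity discussed below.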

Event cameras thus output an asynchronous stream of events triggered by changes in scene illumination: a variable-data-rate sequence of digital “events” or “spikes”, each representing a change in brightness (log intensity) of predefined magnitude at a particular pixel at a particular time.

Event cameras are data-driven sensors: their output depends on the amount of motion or brightness change in the scene. The faster the motion, the more events per second are generated, since each pixel adapts its delta-modulator sampling rate to the rate of change of the log-intensity signal it monitors. Events are timestamped with microsecond resolution and transmitted with sub-millisecond latency, which makes these sensors react quickly to visual stimuli. Modern event cameras offer microsecond temporal resolution, roughly 120 dB of dynamic range, and less under/overexposure and motion blur than frame cameras.

Event-based imagers are an emerging class of sensors with major demonstrated advantages over traditional cameras: very high dynamic range, no motion blur, and latency on the order of microseconds. Because they operate asynchronously and only transmit data from pixels that have changed, they have been shown to produce over 100x less data in sparse scenes than traditional focal plane arrays (FPAs), which translates directly into roughly 100x lower latency at 100x lower power.
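
A back-of-the-envelope calculation makes the data reduction concrete. The resolution, frame rate, event size, and activity level below are illustrative assumptions, not measured figures for any particular sensor.

```python
# Frame camera: every pixel, every frame.
width, height, fps, bits_per_pixel = 640, 480, 30, 8
frame_bps = width * height * fps * bits_per_pixel              # ~73.7 Mbit/s

# Event camera in a sparse scene: assume 0.1% of pixels fire per
# frame interval, each event costing ~64 bits (address + time + polarity).
active_fraction, bits_per_event = 0.001, 64
event_bps = width * height * fps * active_fraction * bits_per_event  # ~0.59 Mbit/s

print(frame_bps / event_bps)  # 125.0 -> roughly the claimed 100x reduction
```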

Image reconstruction from events has the potential to create images and video with high dynamic range, high temporal resolution, and minimal motion blur. However, because the output is a sequence of asynchronous events rather than intensity images, traditional vision algorithms cannot be applied directly; new algorithms are required that exploit the high temporal resolution and asynchronous nature of the sensor. Image reconstruction can be achieved using temporal smoothing, e.g., a high-pass or complementary filter. Alternative methods include optimization and gradient estimation followed by Poisson integration.
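
As an illustration of the temporal-smoothing approach, the sketch below reconstructs a log-intensity image by running a per-pixel high-pass filter over the event stream. The cutoff frequency, the per-event contrast step, and the (x, y, t, polarity) event layout are assumed for illustration and would need tuning against a real sensor.

```python
import numpy as np

def highpass_reconstruct(events, shape, cutoff=5.0, contrast=0.2):
    """Per-pixel high-pass-filter image reconstruction from events.

    Between events, each pixel's state decays exponentially toward
    zero (the high-pass part); at each event, the state jumps by the
    nominal contrast step in the event's polarity direction.
    """
    log_img = np.zeros(shape)   # reconstructed log intensity
    last_t = np.zeros(shape)    # time of each pixel's last update
    for x, y, t, p in events:   # events assumed sorted by time t
        log_img[y, x] *= np.exp(-cutoff * (t - last_t[y, x]))
        log_img[y, x] += contrast * p
        last_t[y, x] = t
    return log_img
```

A complementary filter extends this idea by blending in absolute intensity measurements at low frequencies, when a conventional frame sensor is available alongside the event stream.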

Events are transmitted from the pixel array to the periphery and then out of the camera over a shared digital output bus, typically using an address-event representation (AER) readout. This bus can become saturated, which perturbs the times at which events are sent. Event cameras have readout rates ranging from 2 MHz to 1200 MHz, depending on the chip and the type of hardware interface.
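
In AER readout, each event leaves the chip as a compact digital word. The field widths below are hypothetical, chosen only to show the packing idea; real sensors differ, and timestamps are typically appended by the readout periphery rather than stored in the pixel.

```python
# Hypothetical AER word layout: 10-bit y | 10-bit x | 1-bit polarity.
def pack_aer(x, y, polarity):
    return (y << 11) | (x << 1) | (polarity & 1)

def unpack_aer(word):
    return (word >> 1) & 0x3FF, (word >> 11) & 0x3FF, word & 1  # x, y, polarity

word = pack_aer(321, 210, 1)
assert unpack_aer(word) == (321, 210, 1)
```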

State-of-the-art visible-light event-based cameras have been developed by iniVation in Switzerland and Prophesee in France for applications including autonomous vehicles, robotics, augmented/virtual reality, and video gaming.

Despite their inherent advantages, existing event-based cameras are not currently compatible with DoD applications, because DoD scenarios are highly cluttered and dynamic. DARPA launched the FENCE program to develop an integrated event-based infrared (IR) FPA with embedded processing to overcome these challenges.

Officials of the U.S. Defense Advanced Research Projects Agency (DARPA) in Arlington, Va., issued a broad agency announcement (HR001121S0001) in October 2020 for the Fast Event-based Neuromorphic Camera and Electronics (FENCE) program.

FENCE program

The Fast Event-based Neuromorphic Camera and Electronics (FENCE) program seeks to develop and demonstrate a low-latency, low-power, event-based camera and a new class of signal processing and learning algorithms that use combined spatial and temporal (spatio-temporal) information to enable intelligent sensors for tactical DoD applications.

FENCE will develop an infrared neuromorphic imager consistent with military requirements. The sole technical area (TA) will develop an asynchronous read-out integrated circuit (ROIC) capable of very low-latency, low-power operation, along with a low-power processing layer that integrates with the ROIC to identify salient spatio-temporal signals. Together, the ROIC and the processing layer will enable an integrated FENCE sensor to operate at low power (< 1.5 W).

The FENCE program is a 48-month, three-phase program with a 15-month Phase 1 (base), a 21-month Phase 2 (option), and a 12-month Phase 3 (option). DARPA researchers are not interested in proposals for spiking event-driven cameras that are not cryogenically cooled or that have cutoff wavelengths shorter than 3 microns; that use low technology readiness level (TRL) detector materials not fielded in military systems; or that merely amalgamate existing imagers into something neuromorphic without using the event-driven asynchronous methodology.

FENCE Awards

The US Defense Advanced Research Projects Agency (DARPA) has selected three teams of researchers to develop event-based (neuromorphic) infrared (IR) camera technologies.

The three teams are led by Raytheon, BAE Systems, and Northrop Grumman. The development will be carried out under the Fast Event-based Neuromorphic Camera and Electronics (FENCE) program. FENCE program manager Whitney Mason said: “Neuromorphic refers to silicon circuits that mimic brain operation; they offer sparse output, low latency, and high energy efficiency. Event-based cameras operate under these same principles when dealing with sparse scenes, but currently lack advanced ‘intelligence’ to perform more difficult perception and control tasks.”

Researchers from Raytheon, BAE Systems, and Northrop Grumman will work to develop an asynchronous read-out integrated circuit (ROIC) with low latency and a processing layer that integrates with the ROIC to detect relevant spatio-temporal signals. According to DARPA, the ROIC and processing layer will jointly enable an integrated FENCE sensor to operate on less than 1.5 W of power. Mason added: “The goal is to develop a ‘smart’ sensor that can intelligently reduce the amount of information that is transmitted from the camera, narrowing down the data for consideration to only the most relevant pixels.”
