Event cameras represent a fundamental shift in the way sensors collect and process scene data. They offer significant benefits but require us to rethink how "images" are constructed and what cameras can do. IQT Labs therefore investigated event cameras as part of its R&D focus on Edge AI. In this blog post we describe several experiments we conducted while evaluating an event camera produced by the French startup Prophesee. We found that the event camera was able to sense things a traditional camera would miss and could provide an additional method for sensing drones.
What is an Event Camera?
In a conventional digital camera, an optical sensor records data for every pixel in the frame at the same time. Frame-based video capture is repeated at a predetermined frame rate. A frame rate of 30 frames per second (fps) means that data about the entire scene is sampled 30 times per second. When this series of collected still images is played back, the human eye perceives it as motion. The sensor is essentially blind, however, to any changes that occur between the captured frames.
In contrast, every pixel in an event-based sensor operates independently and asynchronously, responding directly to what is happening in the scene. The sensor captures data for a pixel only when the amount of light hitting that pixel changes by more than a configurable threshold. This approach can be much more efficient than frame-based capture: a scene (or part of a scene) is sampled more rapidly when something is changing, while static pixels are not oversampled and no energy is wasted on them. The human visual system uses a similar change-sensitive strategy, sending more data back to the brain when an object moves.
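To make this concrete, here is a minimal sketch (in C++, our own illustration rather than Prophesee's implementation) of the standard event-camera pixel model: each pixel remembers the log intensity at its last event and emits an (x, y, timestamp, polarity) event whenever the current log intensity drifts a full threshold step away from that reference.

```cpp
#include <cmath>
#include <cstdint>
#include <vector>

// One event: pixel coordinates, microsecond timestamp, and polarity
// (+1 = got brighter, -1 = got darker).
struct Event {
    uint16_t x, y;
    int64_t  t_us;
    int8_t   polarity;
};

// Illustrative per-pixel model: emit an event each time the pixel's log
// intensity moves one full `threshold` step away from the level at which
// the previous event fired.
struct PixelState {
    double ref_log_i = 0.0;  // in practice, seed this from the first sample

    std::vector<Event> update(uint16_t x, uint16_t y, int64_t t_us,
                              double intensity, double threshold) {
        std::vector<Event> events;
        const double log_i = std::log(intensity + 1e-9);  // avoid log(0)
        while (std::abs(log_i - ref_log_i) >= threshold) {
            const int8_t pol = (log_i > ref_log_i) ? 1 : -1;
            events.push_back({x, y, t_us, pol});
            ref_log_i += pol * threshold;  // step the reference level
        }
        return events;
    }
};
```

Note the while loop: a large, fast brightness change produces a burst of several events rather than a single sample, which is part of why event cameras resolve fast motion so well.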
An event-based sensor provides several benefits over a traditional optical sensor. Event cameras have a higher temporal resolution, allowing them to capture fast-moving objects without motion blur. Since data for each pixel is captured independently, event cameras also have a higher dynamic range, making them suitable for scenes that combine extreme darkness and bright light. Their selective pixel capture also provides an efficiency that can make them appropriate for power-constrained, remote environments: because the camera only generates data when it senses changes, the processor can switch to a lower power state during static scenes, further reducing power usage. Prophesee, for example, states that their camera can "produce up to 1,000 times fewer data than a conventional sensor whilst achieving a higher equivalent temporal resolution of >10,000 fps."
The Main Event: Three Experiments
To illustrate the capabilities of Prophesee's event camera, we performed three separate mini-experiments, all of which involved capturing the same scene with both an event camera and an optical camera. These experiments were chosen to highlight several of the unique capabilities of the event camera compared to a traditional frame-based camera.
Experiment 1. To understand the event camera's ability to isolate moving objects in an otherwise static scene, we first aimed the cameras at passing cars, captured from an elevated angle (Figure 1). The red boxes (added by us) in the optical image on the right side of the panel denote three moving cars; all the other cars in the scene are parked. As a result, the event camera image is largely black, indicating no events, except for three splotches of white and blue, each roughly the size of a car. These splotches reveal the changes in the scene created by the moving cars.
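To give a flavor of how this isolation might be reproduced in software, here is a sketch (reusing the Event struct from above; the function name and window bounds are our own, not part of any vendor API): accumulate a short time window of events into a per-pixel count image, and only the movers light up.

```cpp
#include <cstdint>
#include <vector>

// Accumulate events from a short time window into a per-pixel count image.
// Uses the Event struct from the earlier sketch. In a scene like Figure 1,
// only pixels covering the three moving cars end up with non-zero counts;
// everything static stays at zero.
std::vector<uint32_t> accumulate_counts(const std::vector<Event>& events,
                                        int width, int height,
                                        int64_t t_start_us, int64_t t_end_us) {
    std::vector<uint32_t> counts(static_cast<size_t>(width) * height, 0);
    for (const Event& e : events) {
        if (e.t_us >= t_start_us && e.t_us < t_end_us)
            ++counts[static_cast<size_t>(e.y) * width + e.x];
    }
    return counts;  // threshold or cluster this image to box the movers
}
```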
Experiment 2. To observe how the event camera measures rapid changes, we plugged two LED lights into an Arduino. We programmed one light to blink rapidly on and off; the other light remained always on. As shown in Figure 2, the optical image, on the right, shows two red lights, highlighted by the light blue boxes. The event camera, however, shows only one splash of color: the flickering light. The always-on light never changes, so, as we expected, it fails to trigger any events and the event camera records nothing for it. Our experimentation revealed that the Prophesee event camera could reliably measure a blinking LED at close range at rates up to 1400 Hz. We evaluated multiple blinking frequencies and found that the camera measured them with an accuracy of around 1 Hz.
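For the curious, one plausible way to recover a blink rate from raw events (our own reconstruction, not the method used by Prophesee's software) is sketched below: keep the positive-polarity events at the LED's pixels, merge events closer together than a small gap into single rising edges, and invert the median interval between edges. The gap_us default is an assumption and must stay well below the blink period, which is roughly 714 µs at 1400 Hz.

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>

// Estimate a blink frequency from the timestamps (in microseconds) of
// positive (dark->bright) events at the LED's pixels. Positive events
// closer together than gap_us are merged into a single rising edge.
double estimate_blink_hz(std::vector<int64_t> times_us, int64_t gap_us = 100) {
    std::sort(times_us.begin(), times_us.end());

    std::vector<int64_t> edges;
    int64_t prev = -1;
    for (int64_t t : times_us) {
        if (prev < 0 || t - prev > gap_us)
            edges.push_back(t);  // start of a new rising edge
        prev = t;
    }
    if (edges.size() < 2) return 0.0;

    // The median inter-edge interval is robust to an occasional missed edge.
    std::vector<int64_t> gaps;
    for (size_t i = 1; i < edges.size(); ++i)
        gaps.push_back(edges[i] - edges[i - 1]);
    std::nth_element(gaps.begin(), gaps.begin() + gaps.size() / 2, gaps.end());
    const int64_t median_us = gaps[gaps.size() / 2];
    return median_us > 0 ? 1e6 / static_cast<double>(median_us) : 0.0;
}
```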
Experiment 3. We also investigated what signatures an event camera could detect for a quadcopter in flight. We aimed the optical camera and the event camera at a quadcopter hovering in a controlled lab environment and used the event camera's vibration frequency detection mode. This yielded promising results: the rotors of the quadcopter appear bright red in the event camera image, suggesting that event cameras might aid in quadcopter detection. The rotors' motion was clearly visible in the event frame, and the frequency reported by the event camera software matched the rotor speed as measured on an audio spectrogram. The frequency ranged between 210 Hz and 235 Hz, depending on the power level at the time.
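Prophesee's software handles this detection internally, but conceptually a per-pixel frequency map can be built by running the same edge-interval estimate at every pixel. Here is a hypothetical sketch, reusing estimate_blink_hz() from the previous experiment:

```cpp
#include <cstdint>
#include <unordered_map>
#include <vector>

// Build a per-pixel frequency map by running estimate_blink_hz() on the
// positive-event timestamps of every pixel. In a capture like ours, pixels
// over the spinning rotors should report the blade-passing rate while the
// static background reports zero.
std::vector<double> frequency_map(const std::vector<Event>& events,
                                  int width, int height) {
    std::unordered_map<int, std::vector<int64_t>> per_pixel;
    for (const Event& e : events)
        if (e.polarity > 0)
            per_pixel[e.y * width + e.x].push_back(e.t_us);

    std::vector<double> hz(static_cast<size_t>(width) * height, 0.0);
    for (auto& [idx, times] : per_pixel)
        hz[idx] = estimate_blink_hz(times);
    return hz;  // visualize: rotor pixels should cluster around 210-235 Hz
}
```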
Some Notes on Tooling
For the nerdy at heart, we have a few notes on the tooling used for our experiments. First, we built a small rig on top of a tripod to hold an optical camera and an event camera side by side. Second, simple Arduino scripts powered the LED lights used in Figure 2; we confirmed the LED was blinking at the correct rate using an oscilloscope, as shown in Figure 4. And, finally, we spliced together the imagery from the optical camera and the event camera using OBS Studio, a free and open-source software package for video broadcasting and live streaming.
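For completeness, a representative Arduino sketch (Arduino code is C++) is below. The pin numbers and the 1 kHz rate are placeholders rather than our exact test values.

```cpp
// Representative Arduino sketch: one LED toggles at a target rate while the
// other stays on. Pin numbers and rate are placeholders, not our test setup.
const int BLINK_PIN  = 8;   // flickering LED
const int STEADY_PIN = 9;   // always-on LED
const unsigned long HALF_PERIOD_US = 500;  // 500 us half-period ~= 1 kHz

void setup() {
  pinMode(BLINK_PIN, OUTPUT);
  pinMode(STEADY_PIN, OUTPUT);
  digitalWrite(STEADY_PIN, HIGH);  // never changes, so it triggers no events
}

void loop() {
  digitalWrite(BLINK_PIN, HIGH);
  delayMicroseconds(HALF_PERIOD_US);
  digitalWrite(BLINK_PIN, LOW);
  delayMicroseconds(HALF_PERIOD_US);
}
```

Loop overhead pushes the true rate slightly below the nominal one, which is exactly why the oscilloscope check in Figure 4 is worth doing.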
What We Learned
Based on our experiments, we saw that event cameras can excel at some tasks that are challenging for frame-based cameras. The first experiment revealed how an event camera can isolate moving objects from a static background, a feature that could simplify object tracking. In our second experiment, we saw that an event camera accurately measured the LED's high-frequency blinking. Finally, the third experiment demonstrated that event cameras can detect novel signatures from a drone that a standard frame-based camera would otherwise miss. Do you know of additional areas where event cameras could excel? We are interested in investigating how they could improve certain machine learning tasks. Let us know your thoughts by contacting us at labsinfo@iqt.org.