Yes. Fast-moving drones are harder for conventional cameras to detect and track because their motion produces motion blur. ETC Robotics' event camera technology does not suffer from motion blur.
Active sensors, like radar and lidar, constantly transmit signals, which reveals their locations. Passive sensors, like the event cameras we use, only measure incoming light and emit no signals of their own.
Not all drones are piloted over RF or emit RF. Some drones are controlled via fiber-optic tether. A drone might also use the RF spectrum in ways designed to evade detection.
Conventional cameras have poor detection performance in low-light settings (e.g., dawn, dusk, twilight). ETC event camera systems have much better sensitivity in these conditions.
It is easy to spot a drone against an empty blue sky. Real-world conditions often involve cluttered, dynamic backgrounds such as dense foliage, water, and urban architecture. ETC technology is effective at separating small, fast-moving objects from these cluttered backdrops.
Conventional cameras work like old-fashioned film cameras: an electronic shutter opens for a fixed amount of time and then closes again. If the camera or the world moves while the shutter is open, light is smeared across the sensor, producing motion blur. This blur makes it much harder to identify objects, localize them, and measure precisely how they are moving. Event cameras do not use a shutter for the whole sensor; each pixel responds to light changes independently with very high time precision. So when something moves fast, we know exactly where and when it traveled, giving a much more accurate understanding of the object's position and speed.
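The per-pixel behavior described above can be illustrated with a small sketch. This is not ETC's implementation, just a common textbook model of event generation: each pixel independently emits an event whenever its log intensity changes by more than a contrast threshold since that pixel last fired (the threshold value and frame data here are made up for illustration).

```python
import numpy as np

def events_from_frames(frames, threshold=0.2):
    """Emit (t, y, x, polarity) events wherever a pixel's log intensity
    changes by more than `threshold` since that pixel last fired."""
    log_frames = np.log(frames.astype(np.float64) + 1e-6)
    reference = log_frames[0].copy()  # last value each pixel reported
    events = []
    for t in range(1, len(log_frames)):
        diff = log_frames[t] - reference
        fired = np.abs(diff) >= threshold
        for y, x in zip(*np.nonzero(fired)):
            events.append((t, int(y), int(x), 1 if diff[y, x] > 0 else -1))
            reference[y, x] = log_frames[t, y, x]  # each pixel resets on its own
    return events

# A bright dot moving one pixel per time step across a dark 1x5 strip:
frames = np.full((4, 1, 5), 10.0)
for t in range(4):
    frames[t, 0, t] = 200.0
evs = events_from_frames(frames)
# Only the two pixels the dot enters and leaves fire each step;
# static background pixels produce nothing.
```

Note how the output is sparse: static pixels are silent, and the moving dot's position is timestamped at every step rather than smeared across one long exposure.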
Event cameras provide superior spatial and temporal resolution, which allows them to more readily distinguish small, fast-flying objects from cluttered moving backgrounds. Radar struggles with this kind of clutter, and if the radar sensor itself is moving, the problem is compounded. Finally, radar is well suited only to measuring velocity toward or away from the sensor (radial velocity), which gives an incomplete picture of a flying object's 3D motion.
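The radial-velocity limitation is easy to see with a little vector math. This is a generic illustration (the positions and speeds are invented): the radial speed a Doppler radar reports is the projection of the target's 3D velocity onto the line of sight, so a drone crossing perpendicular to that line reads as nearly stationary.

```python
import numpy as np

def radial_speed(position, velocity):
    """Component of `velocity` along the line of sight from a radar at
    the origin to `position` (positive = moving away from the radar)."""
    los = np.asarray(position, dtype=float)
    los = los / np.linalg.norm(los)          # unit line-of-sight vector
    return float(np.dot(np.asarray(velocity, dtype=float), los))

# Drone 100 m away flying at 20 m/s:
approaching = radial_speed([100.0, 0.0, 0.0], [-20.0, 0.0, 0.0])
crossing    = radial_speed([100.0, 0.0, 0.0], [0.0, 20.0, 0.0])
# `approaching` is -20.0 m/s, but `crossing` is 0.0 m/s even though
# both drones are moving at the same true speed.
```

Both drones move at 20 m/s, yet the radar's radial measurement sees only one of them moving, which is why radial velocity alone is an incomplete description of 3D motion.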
If information is missing or has been destroyed, AI/ML models can be used to guess what the original information might have looked like. These techniques have proven very popular in entertainment and productivity applications, where a plausible high-resolution result is convincing and the process can be repeated until it looks right. For high-stakes, time-sensitive applications, using AI/ML models to compensate for low-precision sensor data can be misleading, and the extra computation itself delays timely action. For these critical applications, it is far better to start with high-precision sensor data in the first place.
Event cameras are manufactured using the same silicon process as conventional cameras, so cost is mostly a function of volume. As event cameras have become more popular, their production costs have come down such that they are now less expensive than thermal cameras.
Companies are complementing their sensor arrays with event cameras now for two reasons: pricing trends have made large-scale installations affordable, and industrial awareness and expertise, pioneered by Et Cetera Robotics, have matured.
Et Cetera Robotics brings together robotics and machine learning PhDs with industrial experience at NVIDIA Robotics and Google. ETC Robotics has developed a proprietary high-speed vision and tracking system built on the latest event camera systems. The company has used this expertise to assemble a large-scale dataset of flying objects for testing and model training purposes.