Most detection systems are undermined by the very conditions that matter most: fast-moving targets, low light, cluttered backgrounds, and the need to stay hidden. ETC Robotics' neuromorphic event camera technology solves all of these at once. Unlike radar or lidar, our passive sensing system emits no signals, keeping your installation covert and secure. Unlike conventional cameras, which expose every pixel at once, each pixel in our sensor responds to light changes independently with microsecond precision, eliminating motion blur entirely, so fast drones don't disappear into smear. The result is real-time 4D tracking with 100 Hz+ update rates that performs in dawn, dusk, and other low-light conditions, and reliably separates small aerial objects from dense foliage, water, and urban clutter. Available in both ground-based and airborne configurations, our systems deliver the situational awareness that high-stakes counter-UAS applications demand.
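To make the per-pixel behavior concrete, the sketch below shows one common way an asynchronous event stream is converted into a dense "time surface" for downstream tracking. This is an illustrative example, not ETC Robotics code: the event tuple layout, the `time_surface` function, and the `tau_us` decay constant are all assumptions for demonstration.

```python
import numpy as np

# Each event is (x, y, t, p): pixel coordinates, a microsecond timestamp,
# and polarity (+1 brighter, -1 darker). Unlike a frame, events arrive
# asynchronously, so a fast target leaves a crisp trail of timestamps
# rather than a blurred exposure.
events = np.array(
    [(10, 12, 1_000, +1), (11, 12, 1_250, +1), (12, 13, 1_500, -1)],
    dtype=[("x", "u2"), ("y", "u2"), ("t", "u8"), ("p", "i1")],
)

def time_surface(events, width, height, tau_us=5_000.0):
    """Build a time surface: each pixel holds an exponentially decayed
    age of its most recent event. Recent motion stands out sharply while
    static clutter fades toward zero, which is one standard way event
    streams are turned into dense inputs for trackers."""
    last_t = np.full((height, width), -np.inf)
    for e in events:
        last_t[e["y"], e["x"]] = e["t"]
    t_now = events["t"].max()
    surface = np.exp(-(t_now - last_t) / tau_us)
    surface[np.isinf(last_t)] = 0.0  # pixels that never fired
    return surface

surface = time_surface(events, width=32, height=32)
print(surface[12, 10:13])  # decayed activations near the target's path
```

Because every event carries its own timestamp, the surface stays sharp no matter how fast the target moves; a frame camera integrating over a fixed exposure cannot make that guarantee.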
Understanding flying objects in the wild requires more than good algorithms; it requires data. ETC Robotics has assembled a large-scale proprietary dataset of flying objects captured with event cameras, one of the most comprehensive collections of its kind. Built on this foundation, our Flying Object Model is a purpose-trained foundation model that understands the unique spatiotemporal signatures of drones and other aerial objects in complex, real-world conditions. Developed by a team of robotics and machine learning PhDs with experience at NVIDIA Robotics and Google, the model is designed for high-precision, low-latency inference, starting from rich event data rather than compensating for the limitations of conventional sensors. Whether integrated into our hardware kits or deployed within your own pipeline, the Flying Object Model sets a new baseline for aerial object understanding.
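For teams deploying the model within their own pipeline, the sketch below illustrates what an event-in, tracks-out integration could look like. Everything here is hypothetical: the `FlyingObjectModel` class, the `infer` method, the `Detection` fields, and the weights path are invented for illustration and do not reflect ETC Robotics' actual API.

```python
# Hypothetical integration sketch; all names below are illustrative
# placeholders, not ETC Robotics' real interface.
from dataclasses import dataclass

@dataclass
class Detection:
    track_id: int
    position: tuple    # estimated (x, y, z) in meters
    velocity: tuple    # estimated (vx, vy, vz) in m/s
    label: str         # e.g. "quadrotor", "bird"
    confidence: float

class FlyingObjectModel:
    """Stand-in for a purpose-trained model that consumes raw event
    streams and emits 4D tracks (3D position over time)."""

    def __init__(self, weights_path: str):
        self.weights_path = weights_path  # placeholder; no real weights

    def infer(self, event_batch) -> list[Detection]:
        # A real deployment would run low-latency inference here;
        # this stub only shows the expected input/output contract.
        return [
            Detection(track_id=0, position=(12.0, 3.5, 40.0),
                      velocity=(-8.0, 0.2, 1.1),
                      label="quadrotor", confidence=0.97)
        ]

model = FlyingObjectModel("weights/fom.ckpt")  # hypothetical checkpoint path
for det in model.infer(event_batch=[]):       # event_batch: camera event stream
    print(f"track {det.track_id}: {det.label} at {det.position} "
          f"({det.confidence:.0%})")
```

The key design point the sketch conveys is that the model operates directly on event streams and returns tracks with position and velocity, so no frame reconstruction step is needed between sensor and detector.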