Event-based cameras introduce an innovation in the field of imaging by departing from the conventional frame-camera paradigm. These bio-inspired sensors capture per-pixel brightness changes asynchronously, generating a continuous stream of events that encode the time, pixel location, and polarity of each change. This departure from traditional imaging yields distinctive features including high temporal resolution, high dynamic range, low power consumption, and minimal motion blur. Event cameras are also well suited to motion-oriented applications because they discard redundant information, reducing data volume and improving algorithmic performance. These characteristics make event-based cameras useful tools in challenging scenarios such as the observation of objects in orbit. This paper presents a comprehensive mission architecture study for our nanosatellite space mission, called EventSat, which focuses on autonomous object detection, classification, and identification in Low Earth Orbit (LEO). The main objective of the mission is to leverage the unique capabilities of event-based sensors to enhance Space Situational Awareness and autonomous space operations. By demonstrating event cameras integrated with onboard artificial intelligence (AI), the mission addresses the growing challenge of monitoring and understanding objects in space, particularly in the densely populated LEO, in support of the sustained, long-term use of orbital resources. This integration of event-based cameras with advanced onboard AI systems promises to significantly advance autonomous object detection, classification, and identification techniques in space.
Exploiting the inherent advantages of event-based cameras, such as high temporal resolution and very high dynamic range, can lead to more accurate and timely Space Situational Awareness. Autonomous onboard AI algorithms can enable data sharing between federated spacecraft for automatic maneuvering and collision avoidance, among other applications. Furthermore, this paper examines preliminary payload parameters and demonstrates the viability of using an event-camera payload as a proof of concept on a 6U platform, selecting an initial range of focal lengths for further investigation. The presented analysis identifies the payload parameters that maximize the product of pixel count and number of observed objects, considering both pointing directly at known objects and searching for chance encounters. We propose how the payload could be used in space in accordance with the selected parameters and outline further work that will allow us to refine and select among the identified possibilities. EventSat aims to bridge the gap between cutting-edge sensor technology and practical space applications, ultimately fostering advances in space exploration and surveillance capabilities and enabling space autonomy for safer and more efficient space operations.
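The focal-length trade described above can be illustrated with a minimal sketch. All numbers here (pixel pitch, sensor size, object size, range, object density) are hypothetical placeholders, not mission values: pixels subtended by a target grow linearly with focal length, while the field of view, and hence the expected number of chance encounters, shrinks with it, so the product of the two exposes the trade-off.

```python
import math

def pixels_across_object(f_mm, pixel_pitch_um, object_size_m, range_km):
    """Linear pixel count subtended by an object (small-angle approximation)."""
    angular_size = object_size_m / (range_km * 1e3)          # radians
    return angular_size * (f_mm * 1e-3) / (pixel_pitch_um * 1e-6)

def expected_objects(f_mm, sensor_w_mm, sensor_h_mm, density_per_sr):
    """Expected chance encounters, proportional to the FOV solid angle."""
    w = 2 * math.atan(sensor_w_mm / (2 * f_mm))              # horizontal FOV, rad
    h = 2 * math.atan(sensor_h_mm / (2 * f_mm))              # vertical FOV, rad
    return density_per_sr * w * h                            # small-angle estimate

# Sweep candidate focal lengths and score each by pixel count x expected objects.
# Sensor dimensions and density below are illustrative assumptions only.
for f in (16, 25, 35, 50, 75, 100):
    px = pixels_across_object(f, pixel_pitch_um=15, object_size_m=1.0, range_km=50)
    n = expected_objects(f, sensor_w_mm=11.5, sensor_h_mm=6.5, density_per_sr=5.0)
    print(f"f={f:4d} mm  px={px:6.3f}  E[objects]={n:6.4f}  score={px * n:8.5f}")
```

Because the two factors scale oppositely with focal length, the score alone does not pick a unique optimum; in practice it must be weighed against detection thresholds and pointing strategy, which is why the paper retains a range of focal lengths for further study.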