26 research outputs found

    Event-based object detection and tracking for space situational awareness

    Get PDF
    In this work, we present an optical space imaging dataset using a range of event-based neuromorphic vision sensors. The unique method of operation of event-based sensors makes them ideal for space situational awareness (SSA) applications due to the sparseness inherent in space imaging data. These sensors offer significantly lower bandwidth and power requirements, making them particularly well suited for use in remote locations and on space-based platforms. We present the first publicly accessible event-based space imaging dataset, including recordings using sensors from multiple providers, greatly lowering the barrier to entry for other researchers given the scarcity of such sensors and the expertise required to operate them for SSA applications. The dataset contains both daytime and nighttime recordings, including simultaneous co-collections from different event-based sensors. Recorded at a remote site, and containing 572 labeled targets with a wide range of sizes, trajectories, and signal-to-noise ratios, this real-world event-based dataset represents a challenging detection and tracking task that is not readily solved using previously proposed methods. We propose a highly optimized and robust feature-based detection and tracking method, designed specifically for SSA applications, and implemented via a cascade of increasingly selective event filters. These filters rapidly isolate events associated with space objects while maintaining the high temporal resolution of the sensors. The results from this simple yet highly optimized algorithm on the space imaging dataset demonstrate robust high-speed event-based detection and tracking which can readily be implemented on sensor platforms in space as well as in terrestrial environments.
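    The abstract does not spell out the individual filter stages, but the general shape of a cascade of increasingly selective event filters can be sketched. The filter names, windows, and thresholds below are illustrative assumptions, not the authors' implementation; events are assumed to be (timestamp_us, x, y, polarity) tuples.

```python
def refractory_filter(events, refractory_us=1000):
    """Cheap first stage: pass an event only if its pixel has been
    quiet for at least refractory_us microseconds."""
    last = {}
    out = []
    for t, x, y, p in events:
        if t - last.get((x, y), float("-inf")) >= refractory_us:
            out.append((t, x, y, p))
        last[(x, y)] = t
    return out

def support_filter(events, window_us=10000, radius=1, min_support=2):
    """More selective second stage: pass an event only if enough
    recent events occurred in its spatial neighbourhood."""
    recent = []
    out = []
    for t, x, y, p in events:
        recent = [(tt, xx, yy) for tt, xx, yy in recent if t - tt <= window_us]
        support = sum(abs(xx - x) <= radius and abs(yy - y) <= radius
                      for _, xx, yy in recent)
        if support >= min_support:
            out.append((t, x, y, p))
        recent.append((t, x, y))
    return out
```

    Each stage discards events cheaply before the next, more selective stage runs, which is what lets such a cascade keep up with the sensor's temporal resolution.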

    Event-based feature extraction using adaptive selection thresholds

    Get PDF
    Unsupervised feature extraction algorithms form one of the most important building blocks in machine learning systems. These algorithms are often adapted to the event-based domain to perform online learning in neuromorphic hardware. However, since they were not designed for this purpose, such algorithms typically require significant simplification during implementation to meet hardware constraints, creating trade-offs with performance. Furthermore, conventional feature extraction algorithms are not designed to generate useful intermediary signals, which are valuable only in the context of neuromorphic hardware limitations. In this work, a novel event-based feature extraction method is proposed that focuses on these issues. The algorithm operates via simple adaptive selection thresholds, which allow a simpler implementation of network homeostasis than previous works by trading off a small amount of information loss in the form of missed events that fall outside the selection thresholds. The behavior of the selection thresholds and the output of the network as a whole are shown to provide uniquely useful signals indicating network weight convergence without the need to access network weights. A novel heuristic method for network size selection is proposed which makes use of noise events and their feature representations. The use of selection thresholds is shown to produce network activation patterns that predict classification accuracy, allowing rapid evaluation and optimization of system parameters without the need to run back-end classifiers. The feature extraction method is tested on both the N-MNIST (Neuromorphic-MNIST) benchmarking dataset and a dataset of airplanes passing through the field of view. Multiple configurations with different classifiers are tested, with the results quantifying the resultant performance gains at each processing stage.
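    As a rough illustration of learning with adaptive selection thresholds (the update rules and constants below are assumptions for the sketch, not the paper's algorithm): a feature competes for an input only when its similarity exceeds that feature's own threshold; the winner adapts and becomes more selective, while an input that no feature accepts is dropped as a missed event and all thresholds relax.

```python
import numpy as np

rng = np.random.default_rng(0)
n_features, patch = 4, 9           # feature count and flattened patch size (assumed)
W = rng.random((n_features, patch))
W /= np.linalg.norm(W, axis=1, keepdims=True)
thresh = np.full(n_features, 0.5)  # one adaptive selection threshold per feature

def present(x, eta=0.01, d_open=0.002, d_close=0.01):
    """Present one event context; adapt weights and thresholds."""
    x = x / (np.linalg.norm(x) + 1e-12)
    sim = W @ x                               # cosine similarity to each feature
    eligible = np.flatnonzero(sim >= thresh)  # features whose threshold is exceeded
    if eligible.size == 0:
        thresh[:] -= d_open                   # missed event: relax all thresholds (homeostasis)
        return None
    win = eligible[np.argmax(sim[eligible])]
    W[win] += eta * (x - W[win])              # move the winning feature toward the input
    W[win] /= np.linalg.norm(W[win])
    thresh[win] += d_close                    # the winner becomes more selective
    return win
```

    Because the thresholds themselves rise and fall with network activity, monitoring them (rather than the weights) gives exactly the kind of intermediary convergence signal the abstract describes.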

    Real-time event-based unsupervised feature consolidation and tracking for space situational awareness

    Get PDF
    Earth orbit is a limited natural resource that hosts a vast range of vital space-based systems that support the international community's national, commercial and defence interests. This resource is rapidly becoming depleted, with over-crowding in high-demand orbital slots and a growing presence of space debris. We propose the Fast Iterative Extraction of Salient targets for Tracking Asynchronously (FIESTA) algorithm as a robust, real-time and reactive approach to optical Space Situational Awareness (SSA) using Event-Based Cameras (EBCs) to detect, localize, and track Resident Space Objects (RSOs) accurately and in a timely manner. We address the challenges posed by the asynchronous nature and high temporal resolution of the EBC output accurately, without supervision, and with few tunable parameters, using concepts established in the neuromorphic and conventional tracking literature. We show this algorithm is capable of highly accurate in-frame RSO velocity estimation and average sub-pixel localization in a simulated test environment, used to distinguish the capabilities of the EBC and optical setup from those of the proposed tracking system. This work is a fundamental step toward accurate end-to-end real-time optical event-based SSA, and develops the foundation for robust closed-form tracking evaluated using standardized tracking metrics.
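    FIESTA itself is not specified in the abstract. As a generic, hedged sketch of how sub-pixel localization and in-frame velocity estimation can be obtained from a cluster of events attributed to one RSO, a least-squares linear fit of event coordinates against time yields both at once (the function and variable names are illustrative):

```python
import numpy as np

def fit_track(ts, xs, ys):
    """Least-squares linear fit (x, y) ~ t for a cluster of events belonging
    to one target: returns the sub-pixel position at the mean timestamp and
    the in-frame velocity in pixels per time unit."""
    t = np.asarray(ts, float)
    A = np.column_stack([t - t.mean(), np.ones_like(t)])
    (vx, x0), *_ = np.linalg.lstsq(A, np.asarray(xs, float), rcond=None)
    (vy, y0), *_ = np.linalg.lstsq(A, np.asarray(ys, float), rcond=None)
    return (x0, y0), (vx, vy)
```

    Averaging over many events is what pushes the localization below one pixel even though each individual event is quantized to the pixel grid.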

    Neuromorphic engineering needs closed-loop benchmarks

    Get PDF
    Neuromorphic engineering aims to build (autonomous) systems by mimicking biological systems. It is motivated by the observation that biological organisms—from algae to primates—excel in sensing their environment and reacting promptly to their perils and opportunities. Furthermore, they do so more resiliently than our most advanced machines, at a fraction of the power consumption. It follows that the performance of neuromorphic systems should be evaluated in terms of real-time operation, power consumption, and resiliency to real-world perturbations and noise, using task-relevant evaluation metrics. Yet, following in the footsteps of conventional machine learning, most neuromorphic benchmarks rely on recorded datasets that foster sensing accuracy as the primary measure of performance. Sensing accuracy is but an arbitrary proxy for the system's actual goal—making a good decision in a timely manner. Moreover, static datasets hinder our ability to study and compare the closed-loop sensing and control strategies that are central to survival for biological organisms. This article makes the case for a renewed focus on closed-loop benchmarks involving real-world tasks. Such benchmarks will be crucial in developing and progressing neuromorphic intelligence. The shift towards dynamic real-world benchmarking tasks should usher in richer, more resilient, and robust artificially intelligent systems in the future.

    Gooaall!!! : why we built a neuromorphic robot to play foosball

    No full text
    For the past 25 years or so, those of us who seek to mimic the brain's workings in silicon have held an annual workshop in the mountain town of Telluride, Colo. During those summer weeks, you can often find the participants unwinding at the bar of the New Sheridan Hotel on the town's main street. As far back as most can remember, there has been a foosball table in the bar's back room. During the weeks of the workshop, you'll usually find it surrounded by a cluster of neuromorphic engineers engaged in a friendly rivalry that has spanned many years. It was therefore almost a foregone conclusion that someone was going to build a neuromorphic-robot foosball table.

    Automated detection of sleep apnea in infants : a multi-modal approach

    No full text
    This study explores the use and applicability of two minimally invasive sensors, electrocardiogram (ECG) and pulse oximetry, in addressing the high costs and difficulty associated with the early detection of sleep apnea hypopnea syndrome in infants. An existing dataset of 396 scored overnight polysomnography recordings was used to train and test a linear discriminants classifier. The dataset contained data from healthy infants, infants diagnosed with sleep apnea, infants with siblings who had died from sudden infant death syndrome (SIDS), and pre-term infants. Features were extracted from the ECG and pulse-oximetry data and used to train the classifier. The performance of the classifier was evaluated using a leave-one-out cross-validation scheme, and an accuracy of 66.7% was achieved, with a specificity of 67.0% and a sensitivity of 58.1%. Although the performance of the system is not yet at the level required for clinical use, this work forms an important step in demonstrating the validity and potential of such low-cost and minimally invasive diagnostic systems.
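    The evaluation pipeline described (a linear discriminant scored by leave-one-out cross-validation) can be sketched in plain NumPy using a Fisher discriminant; the feature extraction itself and all names below are assumptions for illustration, not the study's code.

```python
import numpy as np

def fisher_ld(X0, X1):
    """Fisher linear discriminant: weight vector and midpoint bias."""
    m0, m1 = X0.mean(0), X1.mean(0)
    Sw = np.cov(X0.T) * (len(X0) - 1) + np.cov(X1.T) * (len(X1) - 1)
    w = np.linalg.solve(Sw + 1e-6 * np.eye(Sw.shape[0]), m1 - m0)
    b = w @ (m0 + m1) / 2
    return w, b

def loo_accuracy(X, y):
    """Leave-one-out cross-validation: hold out each subject in turn,
    train on the rest, and score the held-out sample."""
    hits = 0
    for i in range(len(X)):
        mask = np.arange(len(X)) != i
        Xtr, ytr = X[mask], y[mask]
        w, b = fisher_ld(Xtr[ytr == 0], Xtr[ytr == 1])
        hits += int((X[i] @ w > b) == bool(y[i]))
    return hits / len(X)
```

    Leave-one-out is a natural choice here because each recording is one subject, so the scheme never mixes a subject's data between training and testing.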

    Automated detection of sleep apnea in infants using minimally invasive sensors

    No full text
    To address the difficulty and necessity of early detection of sleep apnea hypopnea syndrome in infants, we present a study into the effectiveness of pulse oximetry as a minimally invasive means of automated diagnosis of sleep apnea in infants. Overnight polysomnogram data from 328 infants were used to extract time-domain oximetry features and scored arousal data for each subject. These records were then used to determine apnea events and to train a classifier model based on linear discriminants. Performance of the classifier was evaluated using a leave-one-out cross-validation scheme, and an accuracy of 68% was achieved, with a specificity of 68.6% and a sensitivity of 55.9%.

    Probabilistic multi hypothesis tracker for an event based sensor

    No full text
    The Event-Based Sensor (EBS) is a new class of imaging sensor where each pixel independently reports “events” in response to changes in log intensity, rather than outputting image frames containing the absolute intensity at each pixel. Positive and negative events are emitted from the sensor when the change in log intensity exceeds certain controllable thresholds internal to the device. For objects moving through the field of view, a change in intensity can be related to motion. The sensor records events independently and asynchronously for each pixel with a very high temporal resolution, allowing the detection of objects moving very quickly through the field of view. Recently this type of sensor has been applied to the detection of orbiting space objects using a ground-based telescope. This paper describes a method to treat the data generated by the EBS as a classical detect-then-track problem by collating the events spatially and temporally to form target measurements. An efficient multi-target tracking algorithm, the probabilistic multi-hypothesis tracker (PMHT), is then applied to the EBS measurements to produce tracks. This method is demonstrated by automatically generating tracks on orbiting space objects from data collected by the EBS.

    Approaches for astrometry using event-based sensors

    No full text
    Event-based sensors are novel optical imaging devices that offer a different paradigm in which to image space and resident space objects. Also known as silicon retinas, these custom silicon devices make use of independent and asynchronous pixels which produce data in the form of events generated in response to changes in log-illumination rather than in the conventional frames produced by CCD-based imaging sensors. This removes the need for fixed exposure times and frame rates but requires new approaches to processing and interpreting the spatio-temporal data produced by these sensors. The individual nature of each pixel also yields a very high dynamic range, and the asynchronous operation provides a high temporal resolution. These characteristics make event-based cameras well suited to terrestrial and orbital space situational awareness tasks. Our previous work with these sensors highlighted the applicability of these devices for detecting and tracking resident space objects from LEO to GEO orbital regimes, both during the night and daytime without modification to the camera or optics. Building upon this previous work in applying these artificial vision systems to space situational awareness tasks, we present a study into approaches for calculating astrometry from the event-based data generated with these devices. The continuous nature of these devices, and their ability to image whilst moving, allows for new and computationally efficient approaches to astrometry, applicable both to high-speed tracking from terrestrial sensors and low-power imaging from orbital platforms. Using data collected during multiple sets of telescope trials involving co-collections between a conventional sensor and multiple event-based sensors, a system capable of identifying stars and positional information whilst simultaneously tracking an object is presented. 
Two new prototype event-based sensors, offering increased spatial resolution and higher sensitivity, were also used and characterized in the trials, and updated observation results from these improved sensors are presented. These results further demonstrate and validate the applicability and opportunities offered by event-based sensors for space situational awareness and orbital applications.
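    One simple, commonly used bridge from event streams to conventional astrometric processing is to accumulate events over a short window into a frame-like count image and extract sub-pixel star centroids from it. The sketch below illustrates that general idea only; it is not the system described above, and the names and thresholds are assumptions.

```python
import numpy as np

def accumulate(events, shape, t0, t1):
    """Integrate events with timestamps in [t0, t1) into a 2D count
    image, from which star positions can be measured for astrometry."""
    img = np.zeros(shape, int)
    for t, x, y, p in events:
        if t0 <= t < t1:
            img[y, x] += 1
    return img

def centroid(img, min_counts=3):
    """Count-weighted sub-pixel centroid of the accumulated image."""
    if img.sum() < min_counts:
        return None
    ys, xs = np.indices(img.shape)
    return (xs * img).sum() / img.sum(), (ys * img).sum() / img.sum()
```

    Because the window boundaries are free parameters rather than a fixed exposure, the same event stream can be re-integrated at whatever cadence the astrometric solver needs.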

    Shack-Hartmann wavefront sensing using spatial-temporal data from an event-based image sensor

    No full text
    An event-based image sensor works dramatically differently from conventional frame-based image sensors, in that it responds only to local brightness changes, whereas a frame-based sensor's output is a linear representation of the illumination over a fixed exposure time. The output of an event-based image sensor is therefore an asynchronous stream of spatio-temporal event data tagged with the location, timestamp and polarity of the triggered events. Compared to traditional frame-based image sensors, event-based image sensors have the advantages of high temporal resolution, low latency, high dynamic range and low power consumption. Although event-based image sensors have been used in many computer vision, navigation and even space situational awareness applications, little work has been done to explore their applicability in the field of wavefront sensing. In this work, we present the integration of an event camera in a Shack-Hartmann wavefront sensor and the use of event data to determine spot displacement and estimate the wavefront. We show that it can achieve the same functionality but at substantially higher speed, and that it can operate in extremely low light conditions. This makes an event-based Shack-Hartmann wavefront sensor a preferable choice for adaptive optics systems where the light budget is limited or high bandwidth is required.
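    Once event-based spot centroids are available per subaperture, the wavefront estimation step reduces to classical Shack-Hartmann geometry: each spot's displacement from its reference position, scaled by the pixel pitch and divided by the lenslet focal length, gives the local wavefront slope. A minimal sketch (the parameter names are assumptions):

```python
import numpy as np

def slopes_from_spots(ref_spots, meas_spots, focal_len, pixel_pitch):
    """Convert per-subaperture spot displacements (in pixels) into local
    wavefront slopes (in radians): slope = displacement * pitch / f."""
    d = (np.asarray(meas_spots, float) - np.asarray(ref_spots, float)) * pixel_pitch
    return d / focal_len
```

    The measured spot positions could come, for example, from event accumulation and centroiding within each subaperture, updated at whatever rate the event stream supports rather than at a fixed frame rate.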