A sub-mW IoT-endnode for always-on visual monitoring and smart triggering
This work presents a fully-programmable Internet of Things (IoT) visual
sensing node that targets sub-mW power consumption in always-on monitoring
scenarios. The system features a spatial-contrast binary
pixel imager with focal-plane processing. The sensor, when working at its
lowest power mode ( at 10 fps), provides as output the number of
changed pixels. Based on this information, a dedicated camera interface,
implemented on a low-power FPGA, wakes up an ultra-low-power parallel
processing unit to extract context-aware visual information. We evaluate the
smart sensor on three always-on visual triggering application scenarios.
Triggering accuracy comparable to RGB image sensors is achieved at nominal
lighting conditions, while consuming an average power between and
, depending on context activity. The digital sub-system is extremely
flexible, thanks to a fully-programmable digital signal processing engine, but
still achieves 19x lower power consumption compared to MCU-based cameras with
significantly lower on-board computing capabilities.Comment: 11 pages, 9 figures, submitteted to IEEE IoT Journa
Bridges Structural Health Monitoring and Deterioration Detection Synthesis of Knowledge and Technology
INE/AUTC 10.0
Event tracking for real-time unaware sensitivity analysis (EventTracker)
This is the author's accepted manuscript. The final published article is available from the link below. Copyright @ 2013 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other users, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works for resale or redistribution to servers or lists, or reuse of any copyrighted components of this work in other works.

This paper introduces a platform for online Sensitivity Analysis (SA) that is applicable in large-scale real-time data acquisition (DAQ) systems. Here we use the term real-time in the context of a system that has to respond to externally generated input stimuli within a finite and specified period. Complex industrial systems such as manufacturing, healthcare, transport, and finance require high-quality information on which to base timely responses to events occurring in their volatile environments. The motivation for the proposed EventTracker platform is the assumption that modern industrial systems are able to capture data in real-time and have the necessary technological flexibility to adjust to changing system requirements. The flexibility to adapt can only be assured if data is succinctly interpreted and translated into corrective actions in a timely manner. An important factor that facilitates data interpretation and information modelling is an appreciation of the effect system inputs have on each output at the time of occurrence. Many existing sensitivity analysis methods appear to hamper efficient and timely analysis due to a reliance on historical data, or sluggishness in providing a timely solution that would be of use in real-time applications. This inefficiency is further compounded by computational limitations and the complexity of some existing models.
In dealing with real-time event-driven systems, the underpinning logic of the proposed method is based on the assumption that, in the vast majority of cases, changes in input variables will trigger events. Every single event, or combination of events, could subsequently result in a change to the system state. The proposed event-tracking sensitivity analysis method describes variables and the system state as a collection of events. The higher the numeric occurrence of an input variable at the trigger level during an event-monitoring interval, the greater its impact on the final analysis of the system state. Experiments were designed to compare the proposed event-tracking sensitivity analysis method with a comparable method (that of Entropy). An improvement of 10% in computational efficiency without loss in accuracy was observed. The comparison also showed that the time taken to perform the sensitivity analysis was 0.5% of that required when using the comparable Entropy-based method.
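The core counting idea can be illustrated in a few lines: each input variable emits an event when it crosses its trigger level, and within a monitoring interval the input with the most trigger-level occurrences is ranked as having the greatest impact on the system state. All names, trigger levels, and data below are hypothetical; this is a sketch of the idea, not the EventTracker implementation.

```python
from collections import Counter

def rank_inputs_by_events(samples, trigger_levels):
    """samples: list of {input_name: value} dicts for one monitoring interval.

    Returns (input_name, event_count) pairs, most frequently triggering first,
    i.e. the input with the highest presumed sensitivity at the top.
    """
    events = Counter()
    for sample in samples:
        for name, value in sample.items():
            if value >= trigger_levels[name]:   # variable crossed its trigger
                events[name] += 1
    return events.most_common()

# Hypothetical interval of three samples from an industrial DAQ stream.
interval = [
    {"temp": 71, "flow": 3.0, "pressure": 0.9},
    {"temp": 74, "flow": 5.2, "pressure": 0.8},
    {"temp": 69, "flow": 5.5, "pressure": 0.7},
]
print(rank_inputs_by_events(interval, {"temp": 70, "flow": 5.0, "pressure": 1.0}))
```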
SEIS: InSight's Seismic Experiment for Internal Structure of Mars.
By the end of 2018, 42 years after the landing of the two Viking seismometers on Mars, InSight will deploy onto Mars' surface the SEIS (Seismic Experiment for Internal Structure) instrument: a six-axes seismometer equipped with both a long-period three-axes Very Broad Band (VBB) instrument and a three-axes short-period (SP) instrument. These six sensors will cover a broad range of the seismic bandwidth, from 0.01 Hz to 50 Hz, with possible extension to longer periods. Data will be transmitted in the form of three continuous VBB components at 2 samples per second (sps), an estimation of the short-period energy content from the SP at 1 sps, and a continuous compound VBB/SP vertical axis at 10 sps. The continuous streams will be augmented by requested event data with sample rates from 20 to 100 sps. SEIS will improve upon the existing resolution of Viking's Mars seismic monitoring by a factor of ∼2500 at 1 Hz and ∼200,000 at 0.1 Hz. An additional major improvement is that, contrary to Viking, the seismometers will be deployed via a robotic arm directly onto Mars' surface and will be protected against temperature and wind by highly efficient thermal and wind shielding. Based on existing knowledge of Mars, it is reasonable to infer a moment magnitude detection threshold of Mw ∼ 3 at 40° epicentral distance and a potential to detect several tens of quakes and about five impacts per year. In this paper, we first describe the science goals of the experiment and the rationale used to define its requirements. We then provide a detailed description of the hardware, from the sensors to the deployment system and associated performance, including transfer functions of the seismic sensors and temperature sensors.

We conclude by describing the experiment ground segment, including data processing services, outreach and education networks, and provide a description of the format to be used for future data distribution.

Electronic supplementary material: The online version of this article (10.1007/s11214-018-0574-6) contains supplementary material, which is available to authorized users.
J-PET Framework: Software platform for PET tomography data reconstruction and analysis
J-PET Framework is an open-source software platform for data analysis,
written in C++ and based on the ROOT package. It provides a common environment
for implementation of reconstruction, calibration and filtering procedures, as
well as for user-level analyses of Positron Emission Tomography data. The
library contains a set of building blocks that can be combined, even by users
with little programming experience, into chains of processing tasks through a
convenient, simple, and well-documented API. The generic input-output interface
allows processing the data from various sources: low-level data from the
tomography acquisition system or from diagnostic setups such as digital
oscilloscopes, as well as high-level tomography structures e.g. sinograms or a
list of lines-of-response. Moreover, the environment can be interfaced with
Monte Carlo simulation packages such as GEANT and GATE, which are commonly used
in the medical scientific community.
Comment: 14 pages, 5 figures
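The "chain of processing tasks" pattern the framework's API is built around can be sketched as follows. This is a minimal illustration in Python for brevity; the real J-PET Framework is a C++/ROOT library, and the class and stage names here are invented for the example.

```python
# Minimal sketch of a task-chain API: each registered task transforms the
# data stream and hands its result to the next stage, mirroring how
# reconstruction, calibration, and filtering steps are composed.

class Chain:
    def __init__(self):
        self.tasks = []

    def add(self, task):
        self.tasks.append(task)   # tasks run in registration order
        return self               # return self to allow fluent chaining

    def run(self, data):
        for task in self.tasks:
            data = task(data)
        return data

# Hypothetical stages: a "calibration" that rescales raw values, then a
# "filter" that keeps only signals above a cutoff.
chain = (Chain()
         .add(lambda raw: [x * 0.5 for x in raw])        # calibration stage
         .add(lambda sig: [x for x in sig if x > 1.0]))  # filtering stage
print(chain.run([1.0, 4.0, 2.2, 0.4]))  # [2.0, 1.1]
```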
Science and Applications Space Platform (SASP) End-to-End Data System Study
The capability of present technology and the Tracking and Data Relay Satellite System (TDRSS) to accommodate Science and Applications Space Platform (SASP) payload users' requirements, maximum service to the user through optimization of the SASP Onboard Command and Data Management System, and the ability and availability of new technology to accommodate the evolution of SASP payloads were assessed. Key technology items identified to accommodate payloads on a SASP were onboard storage devices, multiplexers, and onboard data processors. The primary driver is the limited access to TDRSS single-access channels due to sharing with all low-Earth-orbit spacecraft plus the Shuttle. Advantages of onboard data processing include long-term storage of processed data until TDRSS is accessible (thus reducing the loss of data), elimination of large data processing tasks at the ground stations, and more timely access to the data.
Event-based Vision: A Survey
Event cameras are bio-inspired sensors that differ from conventional frame
cameras: Instead of capturing images at a fixed rate, they asynchronously
measure per-pixel brightness changes, and output a stream of events that encode
the time, location and sign of the brightness changes. Event cameras offer
attractive properties compared to traditional cameras: high temporal resolution
(in the order of microseconds), very high dynamic range (140 dB vs. 60 dB), low
power consumption, and high pixel bandwidth (on the order of kHz) resulting in
reduced motion blur. Hence, event cameras have a large potential for robotics
and computer vision in challenging scenarios for traditional cameras, such as
low-latency, high speed, and high dynamic range. However, novel methods are
required to process the unconventional output of these sensors in order to
unlock their potential. This paper provides a comprehensive overview of the
emerging field of event-based vision, with a focus on the applications and the
algorithms developed to unlock the outstanding properties of event cameras. We
present event cameras from their working principle, the actual sensors that are
available and the tasks that they have been used for, from low-level vision
(feature detection and tracking, optic flow, etc.) to high-level vision
(reconstruction, segmentation, recognition). We also discuss the techniques
developed to process events, including learning-based techniques, as well as
specialized processors for these novel sensors, such as spiking neural
networks. Additionally, we highlight the challenges that remain to be tackled
and the opportunities that lie ahead in the search for a more efficient,
bio-inspired way for machines to perceive and interact with the world.
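The event format described above (time, location, and sign of a brightness change) is often turned into a conventional 2D frame as a first processing step, by accumulating signed polarities per pixel over a time window. The sketch below uses synthetic events rather than a real sensor interface; all names are illustrative.

```python
# Accumulate asynchronous event-camera output into a 2D frame: each event is
# a (timestamp, x, y, polarity) tuple, with polarity +1 for a brightness
# increase and -1 for a decrease.

def accumulate_events(events, width, height, t_start, t_end):
    """Sum signed polarities per pixel for events with t in [t_start, t_end)."""
    frame = [[0] * width for _ in range(height)]
    for t, x, y, polarity in events:
        if t_start <= t < t_end:
            frame[y][x] += polarity
    return frame

# Synthetic events: timestamps in seconds on a 2x2 sensor.
events = [
    (0.001, 0, 0, +1),   # pixel (0,0) got brighter
    (0.002, 1, 0, -1),   # pixel (1,0) got darker
    (0.003, 0, 0, +1),   # pixel (0,0) got brighter again
    (0.050, 1, 1, +1),   # outside the 10 ms window below
]
print(accumulate_events(events, 2, 2, 0.0, 0.010))  # [[2, -1], [0, 0]]
```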