10,015 research outputs found
Event-based Vision: A Survey
Event cameras are bio-inspired sensors that differ from conventional frame
cameras: Instead of capturing images at a fixed rate, they asynchronously
measure per-pixel brightness changes, and output a stream of events that encode
the time, location and sign of the brightness changes. Event cameras offer
attractive properties compared to traditional cameras: high temporal resolution
(in the order of microseconds), very high dynamic range (140 dB vs. 60 dB), low
power consumption, and high pixel bandwidth (on the order of kHz) resulting in
reduced motion blur. Hence, event cameras have a large potential for robotics
and computer vision in challenging scenarios for traditional cameras, such as
low-latency, high speed, and high dynamic range. However, novel methods are
required to process the unconventional output of these sensors in order to
unlock their potential. This paper provides a comprehensive overview of the
emerging field of event-based vision, with a focus on the applications and the
algorithms developed to unlock the outstanding properties of event cameras. We
present event cameras from their working principle, the actual sensors that are
available and the tasks that they have been used for, from low-level vision
(feature detection and tracking, optic flow, etc.) to high-level vision
(reconstruction, segmentation, recognition). We also discuss the techniques
developed to process events, including learning-based techniques, as well as
specialized processors for these novel sensors, such as spiking neural
networks. Additionally, we highlight the challenges that remain to be tackled
and the opportunities that lie ahead in the search for a more efficient,
bio-inspired way for machines to perceive and interact with the world.
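The per-event output described above (time, location, sign of the brightness change) can be sketched in a few lines. The snippet below accumulates signed event counts into a frame, one common but simplified way that frame-based algorithms consume an event stream; all names here are illustrative, not taken from the survey:

```python
from dataclasses import dataclass

@dataclass
class Event:
    t: float   # timestamp in seconds (microsecond resolution in practice)
    x: int     # pixel column
    y: int     # pixel row
    p: int     # polarity: +1 for brightness increase, -1 for decrease

def accumulate_frame(events, width, height):
    """Collapse an event stream into a signed event-count image."""
    frame = [[0] * width for _ in range(height)]
    for e in events:
        frame[e.y][e.x] += e.p
    return frame

# Three events at two pixels within a few microseconds
evs = [Event(0.000001, 2, 1, +1),
       Event(0.000002, 2, 1, +1),
       Event(0.000003, 0, 0, -1)]
frame = accumulate_frame(evs, width=4, height=3)
# frame[1][2] == 2, frame[0][0] == -1, all other pixels 0
```

Many other representations exist (time surfaces, voxel grids, per-event processing in spiking networks); this count image is only the simplest bridge to conventional vision pipelines.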
Multisensor data fusion for accurate modelling of mobile objects
In the last decade, multi-sensor data fusion has become a broadly demanded discipline for building advanced solutions applicable to many real-world situations, both civil and military. In defence, accurate detection of all target objects is fundamental to maintaining situational awareness, locating threats on the battlefield, and identifying and protecting strategically important friendly forces. Civil applications, such as traffic monitoring, have similar requirements in terms of object detection and reliable identification of incidents in order to ensure the safety of road users. With an appropriate data fusion technique, such systems can automatically exploit all relevant information from multiple sources to meet mission needs or support daily supervision operations. This paper focuses on the application of data fusion to active vehicle monitoring in an area of high-density traffic, and on how it is redirecting the research being carried out in the computer vision, signal processing and machine learning fields towards improving the effectiveness of detection and tracking in ground surveillance scenarios in general. Specifically, our system fuses data at the feature level, with features extracted from a video camera and a laser scanner. In addition, a stochastic tracker that introduces particle filters into the model to handle uncertainty due to occlusions and to refine the initial detection output is presented. This computer vision tracker has been shown to detect objects even under poor visual information. Finally, just as humans analyse both temporal and spatial relations among items in a scene to assign them meaning, once the target objects have been correctly detected and tracked it is desirable that machines provide a trustworthy description of what is happening in the scene under surveillance. Accomplishing such an ambitious task requires a machine-learning-based hierarchical architecture able to extract and analyse behaviours at different abstraction levels. A real experimental testbed, a closed circuit where real traffic situations can be simulated, has been implemented for the evaluation of the proposed modular system. First results have shown the strength of the proposed system.
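As a rough illustration of fusion at the feature level, the sketch below concatenates hypothetical per-object feature vectors from a camera and a laser scanner into one joint descriptor for a downstream tracker or classifier. The specific features and field names are invented for the example and are not taken from the paper:

```python
import numpy as np

def camera_features(detection):
    # Hypothetical per-object features from the video camera:
    # bounding-box centre (u, v) and a coarse colour descriptor.
    return np.array([detection["u"], detection["v"], detection["hue"]])

def laser_features(cluster):
    # Hypothetical per-object features from the laser scanner:
    # range, bearing and cluster width.
    return np.array([cluster["range"], cluster["bearing"], cluster["width"]])

def fuse(detection, cluster):
    # Feature-level fusion: concatenate the per-sensor feature
    # vectors into one joint descriptor.
    return np.concatenate([camera_features(detection), laser_features(cluster)])

vec = fuse({"u": 320.0, "v": 240.0, "hue": 0.1},
           {"range": 12.5, "bearing": 0.3, "width": 1.8})
# vec is a 6-dimensional joint descriptor
```

The point of fusing at this level, as opposed to fusing final decisions, is that the tracker sees complementary evidence (appearance from the camera, geometry from the laser) in a single representation.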
Keeping track of worm trackers
C. elegans is used extensively as a model system in the neurosciences due to its well-defined nervous system. However, the seeming simplicity of this nervous system in anatomical structure and neuronal connectivity, at least compared to higher animals, underlies a rich diversity of behaviors. The usefulness of the worm in genome-wide mutagenesis or RNAi screens, where thousands of strains are assessed for phenotype, emphasizes the need for computational methods for the automated parameterization of the generated behaviors. In addition, behaviors can be modulated by external cues such as temperature, O2 and CO2 concentrations, and mechanosensory and chemosensory inputs. Different machine vision tools have been developed to aid researchers in their efforts to inventory and characterize defined behavioral “outputs”. Here we aim to provide an overview of different worm-tracking packages and video analysis tools designed to quantify different aspects of locomotion, such as the occurrence of directional changes (turns, omega bends), the curvature of the sinusoidal shape (amplitude, body bend angles) and velocity (speed, backward or forward movement).
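Two of the locomotion measures mentioned, body bend angles and velocity, can be computed from tracker output roughly as follows. This is a simplified sketch assuming the tracker provides the worm's midline as an ordered list of points from head to tail; real packages differ in skeletonisation and sign conventions:

```python
import math

def bend_angles(midline):
    """Angles between successive segments of a tracked midline
    (list of (x, y) points, head to tail), in degrees, wrapped
    to [-180, 180). A common curvature measure in worm trackers."""
    angles = []
    for (x0, y0), (x1, y1), (x2, y2) in zip(midline, midline[1:], midline[2:]):
        a1 = math.atan2(y1 - y0, x1 - x0)
        a2 = math.atan2(y2 - y1, x2 - x1)
        d = math.degrees(a2 - a1)
        d = (d + 180.0) % 360.0 - 180.0  # wrap into [-180, 180)
        angles.append(d)
    return angles

def centroid_speed(p0, p1, dt):
    """Centroid displacement per unit time between two frames."""
    return math.hypot(p1[0] - p0[0], p1[1] - p0[1]) / dt

# A straight midline has zero bend; a right-angle kink gives 90 degrees.
straight = bend_angles([(0, 0), (1, 0), (2, 0)])   # [0.0]
kinked = bend_angles([(0, 0), (1, 0), (1, 1)])     # [90.0]
```

Forward versus backward movement is then typically decided by the sign of the centroid displacement projected onto the head-to-tail axis, which this sketch omits.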
RFID Localisation For Internet Of Things Smart Homes: A Survey
The Internet of Things (IoT) enables numerous business opportunities in
fields as diverse as e-health, smart cities, smart homes, among many others.
The IoT incorporates multiple long-range, short-range, and personal area
wireless networks and technologies into the designs of IoT applications.
Localisation in indoor positioning systems plays an important role in the IoT.
Location Based IoT applications range from tracking objects and people in
real-time, assets management, agriculture, assisted monitoring technologies for
healthcare, and smart homes, to name a few. Radio-frequency-based systems for indoor positioning, such as Radio Frequency Identification (RFID), are a key enabling technology for the IoT due to their cost-effectiveness, high read rates, automatic identification and, importantly, their energy efficiency. This paper reviews the state-of-the-art RFID technologies in
IoT Smart Homes applications. It presents several comparable studies of RFID
based projects in smart homes and discusses the applications, techniques,
algorithms, and challenges of adopting RFID technologies in IoT smart home
systems.
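As an illustration of the kind of RF-based localisation technique such systems build on, the sketch below converts RSSI readings to range estimates with a log-distance path-loss model and trilaterates a tag from three reader positions. The model parameters and the geometry are illustrative, not taken from any reviewed system:

```python
import numpy as np

def rssi_to_distance(rssi, rssi0=-40.0, n=2.0):
    """Log-distance path-loss model: rssi0 is the RSSI at 1 m and n the
    path-loss exponent; both are environment-dependent, values here
    are only illustrative."""
    return 10.0 ** ((rssi0 - rssi) / (10.0 * n))

def trilaterate(anchors, dists):
    """Least-squares 2-D position from >= 3 reader positions and ranges,
    obtained by linearising the circle equations against the first anchor."""
    (x0, y0), d0 = anchors[0], dists[0]
    A, b = [], []
    for (xi, yi), di in zip(anchors[1:], dists[1:]):
        A.append([2.0 * (xi - x0), 2.0 * (yi - y0)])
        b.append(d0**2 - di**2 + xi**2 - x0**2 + yi**2 - y0**2)
    sol, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return sol  # estimated (x, y)

anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]   # reader positions in metres
true_pos = np.array([3.0, 4.0])
dists = [float(np.linalg.norm(true_pos - np.array(a))) for a in anchors]
est = trilaterate(anchors, dists)   # recovers approximately (3, 4)
```

Real RFID deployments must additionally cope with noisy RSSI, multipath, and tag orientation, which is where the filtering and learning techniques surveyed in the paper come in.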
Particle Filter for Targets Tracking with Motion Model
Real-time robust tracking of multiple non-rigid objects is a challenging task in computer vision research. In recent years, particle filters based on stochastic sampling have been widely used to describe complicated target features in image sequences. In this paper, non-parametric density estimation and particle filter techniques are employed to model the background and track the object. Colour features and a motion model of the target are extracted and used as key features in the tracking step, in order to adapt to multiple variations in the scene, such as background clutter, changes in object scale and partial overlap of different targets. The paper also presents experimental results on the robustness and effectiveness of the proposed method in a number of outdoor and indoor visual surveillance scenes.
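A minimal sketch of the stochastic-sampling idea: one predict-weight-resample cycle of a bootstrap particle filter on a scalar state. The paper's tracker operates on image regions with colour and motion features; everything below is a simplified illustration with invented parameters:

```python
import math
import random

def particle_filter_step(particles, measurement, motion_std=1.0, meas_std=2.0):
    """One predict-weight-resample cycle of a bootstrap particle filter
    on a 1-D position state."""
    # Predict: propagate each particle through a random-walk motion model.
    particles = [p + random.gauss(0.0, motion_std) for p in particles]
    # Weight: Gaussian likelihood of the measurement given each particle.
    weights = [math.exp(-0.5 * ((measurement - p) / meas_std) ** 2)
               for p in particles]
    total = sum(weights) or 1.0
    weights = [w / total for w in weights]
    # Resample: draw particles with probability proportional to weight.
    return random.choices(particles, weights=weights, k=len(particles))

random.seed(0)
particles = [random.uniform(0.0, 100.0) for _ in range(500)]
for z in [50.0, 51.0, 52.0, 53.0]:   # simulated noisy target positions
    particles = particle_filter_step(particles, z)
estimate = sum(particles) / len(particles)
# after a few updates the particle cloud collapses near the measurements
```

In a visual tracker the state would be a region (position, scale), the motion model would come from estimated target dynamics, and the likelihood would compare colour histograms rather than scalar positions.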
Hardware/Software Co-design of Particle Filter and Its Application in Object Tracking
This paper presents a hardware/software co-design method for the particle filter based on the System-on-a-Programmable-Chip (SOPC) technique. Considering both execution speed and design flexibility, we use a NIOS II processor to calculate the weight of each particle and a hardware accelerator to update the particles. As a result, the execution efficiency of the proposed hardware/software co-design of the particle filter is significantly improved while maintaining design flexibility for various applications. To demonstrate the performance of the proposed approach, a real-time object tracking system is established and presented in this paper. Experimental results demonstrate that the proposed method achieves satisfactory real-time tracking of objects in video sequences. (International conference, 8-10 June 2011, Macao, China)
NASA Tech Briefs Index, 1977, volume 2, numbers 1-4
Announcements of new technology derived from the research and development activities of NASA are presented. Abstracts and indexes by subject, personal author, originating center, and Tech Brief number are presented for 1977.