693 research outputs found

    A real time vehicles detection algorithm for vision based sensors

    Vehicle detection plays an important role in traffic control at signalised intersections. This paper introduces a vision-based algorithm for recognising vehicle presence in detection zones. The algorithm uses linguistic variables to evaluate local attributes of an input image; these attributes are categorised as vehicle, background, or unknown features. Experimental results on complex traffic scenes show that the proposed algorithm is effective for real-time vehicle detection.
    Comment: The final publication is available at http://www.springerlink.co
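    The linguistic-variable idea can be illustrated with a toy sketch (not the paper's actual membership functions: the attribute, the thresholds, and the zone-occupancy rule below are invented for illustration). Each local image attribute receives fuzzy memberships in the vehicle/background/unknown categories, and a detection zone is flagged when enough features vote "vehicle".

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function on [a, c], peaking at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def classify_attribute(diff):
    """Map a local attribute (here: a pixel difference from a background
    model, scaled to [0, 1]) to the linguistic category with the highest
    membership. Breakpoints are illustrative, not from the paper."""
    memberships = {
        "background": tri(diff, -0.01, 0.0, 0.3),
        "unknown":    tri(diff, 0.2, 0.4, 0.6),
        "vehicle":    tri(diff, 0.5, 1.0, 1.01),
    }
    return max(memberships, key=memberships.get)

# A detection zone is flagged occupied when enough of its local
# features are categorised as "vehicle" (threshold is invented).
zone = np.array([0.05, 0.9, 0.85, 0.1, 0.7])
labels = [classify_attribute(d) for d in zone]
occupied = labels.count("vehicle") >= 3
```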

    OPERAnet: A Multimodal Activity Recognition Dataset Acquired from Radio Frequency and Vision-based Sensors

    This paper presents a comprehensive dataset intended to evaluate passive Human Activity Recognition (HAR) and localization techniques with measurements obtained from synchronized Radio-Frequency (RF) devices and vision-based sensors. The dataset consists of RF data including Channel State Information (CSI) extracted from a WiFi Network Interface Card (NIC), Passive WiFi Radar (PWR) built upon a Software Defined Radio (SDR) platform, and Ultra-Wideband (UWB) signals acquired via commercial off-the-shelf hardware. It also includes vision/infra-red data acquired from Kinect sensors. Approximately 8 hours of annotated measurements are provided, collected across two rooms from 6 participants performing 6 daily activities. This dataset can be exploited to advance WiFi and vision-based HAR, for example, using pattern recognition, skeletal representation, deep learning algorithms or other novel approaches to accurately recognize human activities. Furthermore, it can potentially be used to passively track a human in an indoor environment. Such datasets are key tools for the development of new algorithms and methods in the context of smart homes, elderly care, and surveillance applications.
    Comment: 17 pages, 7 figures
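    As a hint of how the CSI portion of such a dataset might be consumed, here is a minimal, hypothetical feature-extraction step for one window of CSI amplitudes; the window shape and the three summary features are assumptions for illustration, not the dataset's actual format.

```python
import numpy as np

def csi_features(window):
    """Summarise one window of CSI samples (time x subcarriers) into a
    small feature vector -- a common first step before feeding a
    pattern-recognition or deep-learning model."""
    amp = np.abs(window)
    return np.array([
        amp.mean(),                           # overall signal level
        amp.std(),                            # activity-induced variation
        np.abs(np.diff(amp, axis=0)).mean(),  # temporal change rate
    ])

# Synthetic stand-ins for a "still" and a "walking" window: motion
# perturbs the channel, inflating variance and change rate.
rng = np.random.default_rng(0)
still   = np.ones((100, 30)) + 0.01 * rng.standard_normal((100, 30))
walking = np.ones((100, 30)) + 0.50 * rng.standard_normal((100, 30))
f_still, f_walk = csi_features(still), csi_features(walking)
```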

    Design and integration of vision based sensors for unmanned aerial vehicles navigation and guidance

    In this paper we present a novel Navigation and Guidance System (NGS) for Unmanned Aerial Vehicles (UAVs) based on Vision Based Navigation (VBN) and other avionics sensors. The main objective of our research is to design a low-cost, low-weight/volume NGS capable of providing the required level of performance in all flight phases of modern small- to medium-size UAVs, with a special focus on automated precision approach and landing, where VBN techniques can be fully exploited in a multisensor integrated architecture. Various existing techniques for VBN are compared, and the Appearance-based Navigation (ABN) approach is selected for implementation.
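    Appearance-based navigation essentially localises by matching the current camera view against a database of stored keyframes. A toy sketch of that matching step follows; the grid-of-means descriptor is an invented stand-in for whatever image descriptor a real ABN pipeline would use.

```python
import numpy as np

def abn_localise(frame, keyframes):
    """Toy appearance-based navigation step: return the index of the
    stored keyframe that best matches the current frame."""
    def descriptor(img):
        # Coarse 4x4 grid of mean intensities as a whole-image
        # descriptor (an illustrative stand-in, not a real feature).
        h, w = img.shape
        return img.reshape(h // 4, 4, w // 4, 4).mean(axis=(1, 3)).ravel()
    d = descriptor(frame)
    dists = [np.linalg.norm(d - descriptor(k)) for k in keyframes]
    return int(np.argmin(dists))

# Synthetic keyframe database; the query is a noisy view of keyframe 3.
rng = np.random.default_rng(0)
keyframes = [rng.uniform(0, 255, (16, 16)) for _ in range(5)]
frame = keyframes[3] + rng.normal(0, 1, (16, 16))
```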

    Fuzzy cellular model for on-line traffic simulation

    This paper introduces a fuzzy cellular model of road traffic intended for on-line applications in traffic control. The presented model uses fuzzy sets theory to deal with uncertainty of both input data and simulation results. Vehicles are modelled individually, so various vehicle classes can be taken into consideration. In the proposed approach, all parameters of vehicles are described by means of fuzzy numbers. The model was implemented in a simulation of a vehicle queue discharge process. Changes of the queue length were analysed in this experiment and compared to the results of the NaSch cellular automaton model.
    Comment: The original publication is available at http://www.springerlink.co
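    For context, the crisp NaSch (Nagel-Schreckenberg) baseline that the fuzzy model is compared against can be sketched in a few lines. The queue-discharge setup and parameter values below are illustrative, not the paper's.

```python
import random

def nasch_queue_discharge(n=10, vmax=5, p=0.3, steps=20, seed=1):
    """Crisp NaSch model of a queue discharge: n vehicles start
    bumper-to-bumper and accelerate away onto an empty road.
    Returns the number of stopped vehicles after each step."""
    random.seed(seed)
    pos = list(range(n))   # vehicle i occupies cell i; i = n-1 leads
    vel = [0] * n
    history = []
    for _ in range(steps):
        for i in range(n):
            # Gap to the vehicle ahead (the leader sees free road).
            gap = pos[i + 1] - pos[i] - 1 if i + 1 < n else vmax
            v = min(vel[i] + 1, vmax, gap)      # accelerate, then brake
            if v > 0 and random.random() < p:   # random slowdown
                v -= 1
            vel[i] = v
        for i in range(n):                      # parallel position update
            pos[i] += vel[i]
        history.append(sum(1 for v in vel if v == 0))
    return history

queue_history = nasch_queue_discharge()
```

In the fuzzy variant described above, crisp quantities such as velocity and gap would be replaced by fuzzy numbers, propagating input uncertainty into the simulated queue length.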

    Micro-Doppler Based Human-Robot Classification Using Ensemble and Deep Learning Approaches

    Radar sensors can be used for analyzing the frequency shifts induced by micro-motions in both the velocity and range dimensions, identified as micro-Doppler (μ-D) and micro-Range (μ-R), respectively. Different moving targets have unique μ-D and μ-R signatures that can be used for target classification. Such classification can be used in numerous fields, such as gait recognition, safety and surveillance. In this paper, a 25 GHz FMCW Single-Input Single-Output (SISO) radar is used in industrial safety for real-time human-robot identification. Due to the real-time constraint, joint Range-Doppler (R-D) maps are directly analyzed for our classification problem. Furthermore, a comparison is presented between conventional learning approaches with handcrafted extracted features, ensemble classifiers, and deep learning approaches. For the ensemble classifiers, restructured range and velocity profiles are passed directly to ensemble trees, such as gradient boosting and random forests, without feature extraction. Finally, a Deep Convolutional Neural Network (DCNN) is used, and raw R-D images are fed directly into the constructed network. The DCNN shows a superior performance of 99% accuracy in identifying humans from robots on a single R-D map.
    Comment: 6 pages, accepted in IEEE Radar Conference 201
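    The "restructured range and velocity profiles" fed to the ensemble trees can be pictured as the two marginals of the R-D map. A minimal sketch with synthetic maps follows; the spread-versus-line signatures are idealised stand-ins for real human and robot returns.

```python
import numpy as np

def rd_profiles(rd_map):
    """Restructure a Range-Doppler map into concatenated range and
    velocity (Doppler) profiles -- the kind of low-level feature
    vector passed to ensemble trees instead of handcrafted features."""
    range_profile = rd_map.sum(axis=1)    # energy per range bin
    doppler_profile = rd_map.sum(axis=0)  # energy per Doppler bin
    return np.concatenate([range_profile, doppler_profile])

# Idealised stand-ins: a human return with spread micro-Doppler,
# a rigid robot return with a single narrow Doppler line.
human = np.zeros((64, 64))
human[30, 20:45] = 1.0      # wide Doppler spread at one range bin
robot = np.zeros((64, 64))
robot[30, 32] = 1.0         # single rigid-body Doppler line
f_h, f_r = rd_profiles(human), rd_profiles(robot)
```

The Doppler half of the two feature vectors already separates the classes: the human stand-in spreads energy over many Doppler bins, the robot concentrates it in one.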

    myCopter: Enabling Technologies for Personal Aerial Transportation Systems: Project status after 2.5 years

    Current means of transportation for daily commuting are reaching their limits during peak travel times, which results in waste of fuel and loss of time and money. A recent study commissioned by the European Union considers a personal aerial transportation system (PATS) as a viable alternative for transportation to and from work. It also acknowledges that developing such a transportation system should not focus on designing a new flying vehicle for personal use, but instead on investigating issues surrounding the implementation of the transportation system itself. This is the aim of the European project myCopter: to determine the social and technological aspects needed to set up a transportation system based on personal aerial vehicles (PAVs). The project focuses on three research areas: human-machine interfaces and training, automation technologies, and social acceptance. Our extended abstract for inclusion in the conference proceedings and our presentation will focus on the achievements during the first 2.5 years of the 4-year project. These include the development of an augmented dynamic model of a PAV with excellent handling qualities that are suitable for training purposes. The training requirements for novice pilots are currently under development. Experimental evaluations on haptic guidance and human-in-the-loop control tasks have allowed us to start implementing a haptic Highway-in-the-Sky display to support novice pilots and to investigate metrics for objectively determining workload using psychophysiological measurements. Within the project, developments for automation technologies have focused on vision-based algorithms. We have integrated such algorithms into the control and navigation architecture of unmanned aerial vehicles (UAVs). Detecting suitable landing spots from monocular camera images recorded in flight has proven to work reliably off-line, but further work is required to use this approach in real time.
Furthermore, we have built multiple low-cost UAVs and equipped them with radar sensors to test collision avoidance strategies in real flight. Such algorithms are currently under development and will take inspiration from crowd simulations. Finally, using technology assessment methodologies, we have assessed potential markets for PAVs and the challenges for their integration into the current transportation system. This will lead to structured discussions on the expectations and requirements of potential PAV users.
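    One plausible ingredient of monocular landing-spot detection is a local texture test: smooth, uniform regions (grass, pavement) make better candidates than cluttered ones. The sketch below is an illustrative stand-in, not the project's actual algorithm; the window size and threshold are invented.

```python
import numpy as np

def landing_candidates(img, win=8, tex_thresh=5.0):
    """Flag low-texture windows of a grayscale frame as candidate
    landing spots, using mean gradient magnitude as a roughness cue."""
    gy, gx = np.gradient(img.astype(float))
    grad = np.hypot(gx, gy)
    h, w = img.shape
    mask = np.zeros((h // win, w // win), dtype=bool)
    for i in range(h // win):
        for j in range(w // win):
            patch = grad[i * win:(i + 1) * win, j * win:(j + 1) * win]
            mask[i, j] = patch.mean() < tex_thresh  # smooth => candidate
    return mask

# Synthetic frame: smooth left half, heavily textured right half.
rng = np.random.default_rng(0)
frame = np.full((32, 32), 100.0)
frame[:, 16:] += 50 * rng.standard_normal((32, 16))
m = landing_candidates(frame, win=8)
```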

    Spacecraft Position and Attitude Formation Control using Line-of-Sight Observations

    This paper studies formation control of an arbitrary number of spacecraft based on a serial network structure. The leader controls its absolute position and attitude with respect to an inertial frame, and each follower controls its relative position and attitude with respect to another spacecraft assigned by the serial network. The unique feature is that both the absolute and the relative attitude control systems are developed directly in terms of the line-of-sight observations between spacecraft, without the need to estimate the full absolute and relative attitudes, improving accuracy and efficiency. Control systems are developed on the nonlinear configuration manifold, guaranteeing exponential stability. Numerical examples are presented to illustrate the desirable properties of the proposed control system.
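    A standard way to formalise the line-of-sight observation used in such schemes (notation assumed here, not taken from the paper): with inertial positions $x_i, x_j$ and attitude $R_i \in SO(3)$ of the observing spacecraft, the measured LOS direction is the unit vector

```latex
b_{ij} = R_i^{\top}\,\frac{x_j - x_i}{\lVert x_j - x_i \rVert} \in S^2 ,
```

    and attitude errors can then be defined directly from pairs of such unit vectors, avoiding reconstruction of the full attitude matrices.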