
    Hybrid Focal Stereo Networks for Pattern Analysis in Homogeneous Scenes

    In this paper we address the problem of multiple-camera calibration in the presence of a homogeneous scene, without the possibility of employing calibration-object-based methods. The proposed solution exploits salient features present in a larger field of view, but instead of employing active vision we replace the cameras with stereo rigs featuring a long-focal-length analysis camera and a short-focal-length registration camera. Thus, we are able to propose an accurate solution which does not require intrinsic variation models, as in the case of zooming cameras. Moreover, the availability of the two views simultaneously in each rig allows for pose re-estimation between rigs as often as necessary. The algorithm has been successfully validated in an indoor setting, as well as on a difficult scene featuring a highly dense pilgrim crowd in Makkah. (Comment: 13 pages, 6 figures, submitted to Machine Vision and Applications)
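
    The rig-to-rig pose re-estimation described above relies on matching salient features seen in the wide-field registration views. As a rough illustration only (not the authors' algorithm), the sketch below shows the standard feature-matching and essential-matrix pipeline such a step could build on; OpenCV is used, and the intrinsic matrix K is an assumed input.

```python
import cv2
import numpy as np

def relative_pose(img_a, img_b, K):
    """Estimate rotation/translation between two wide-field registration
    views from shared salient features (illustrative only; K is an assumed
    3x3 intrinsic matrix, not a value from the paper)."""
    orb = cv2.ORB_create(nfeatures=2000)
    kp_a, des_a = orb.detectAndCompute(img_a, None)
    kp_b, des_b = orb.detectAndCompute(img_b, None)

    # Match binary descriptors and keep the strongest correspondences.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_a, des_b), key=lambda m: m.distance)[:500]
    pts_a = np.float32([kp_a[m.queryIdx].pt for m in matches])
    pts_b = np.float32([kp_b[m.trainIdx].pt for m in matches])

    # Robust essential-matrix estimation, then decomposition into R, t.
    E, inliers = cv2.findEssentialMat(pts_a, pts_b, K, method=cv2.RANSAC,
                                      prob=0.999, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, pts_a, pts_b, K, mask=inliers)
    return R, t  # t is recovered only up to scale
```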

    Small unmanned airborne systems to support oil and gas pipeline monitoring and mapping

    Acknowledgments: We thank Johan Havelaar, Aeryon Labs Inc., AeroVironment Inc. and Aeronautics Inc. for kindly permitting the use of materials in Fig. 1.

    SALSA: A Novel Dataset for Multimodal Group Behavior Analysis

    Studying free-standing conversational groups (FCGs) in unstructured social settings (e.g., a cocktail party) is gratifying due to the wealth of information available at the group (mining social networks) and individual (recognizing native behavioral and personality traits) levels. However, analyzing social scenes involving FCGs is also highly challenging due to the difficulty in extracting behavioral cues such as target locations, their speaking activity and head/body pose, owing to crowdedness and the presence of extreme occlusions. To this end, we propose SALSA, a novel dataset facilitating multimodal and Synergetic sociAL Scene Analysis, and make two main contributions to research on automated social interaction analysis: (1) SALSA records social interactions among 18 participants in a natural, indoor environment for over 60 minutes, under poster presentation and cocktail party contexts presenting difficulties in the form of low-resolution images, lighting variations, numerous occlusions, reverberations and interfering sound sources; (2) to alleviate these problems we facilitate multimodal analysis by recording the social interplay using four static surveillance cameras and sociometric badges worn by each participant, comprising microphone, accelerometer, Bluetooth and infrared sensors. In addition to raw data, we also provide annotations concerning individuals' personality as well as their position, head and body orientation, and F-formation information over the entire event duration. Through extensive experiments with state-of-the-art approaches, we show (a) the limitations of current methods and (b) how the recorded multiple cues synergetically aid automatic analysis of social interactions. SALSA is available at http://tev.fbk.eu/salsa. (Comment: 14 pages, 11 figures)
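
    The position, head/body orientation, and F-formation annotations lend themselves to simple geometric baselines. Below is a minimal, hypothetical sketch (not the SALSA annotation format or any published detector) that groups participants into F-formations by clustering the o-space centres their facing directions point to; positions in metres and orientations in radians are assumed conventions.

```python
import numpy as np
from sklearn.cluster import DBSCAN

def group_f_formations(positions, orientations, stride=0.75, eps=0.5):
    """Toy F-formation grouping: each person 'votes' for an o-space centre
    located `stride` metres along their facing direction; votes that fall
    within `eps` metres of each other are treated as one conversational
    group. `positions` is Nx2 (metres); `orientations` holds N head/body
    angles in radians -- assumed conventions, not the SALSA file format."""
    positions = np.asarray(positions, dtype=float)
    headings = np.stack([np.cos(orientations), np.sin(orientations)], axis=1)
    o_space_votes = positions + stride * headings
    labels = DBSCAN(eps=eps, min_samples=2).fit_predict(o_space_votes)
    # Collect participant indices per detected group (-1 marks ungrouped people).
    return [np.where(labels == g)[0].tolist() for g in set(labels) if g != -1]
```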

    Innovative Solutions for Navigation and Mission Management of Unmanned Aircraft Systems

    The last decades have witnessed a significant increase in Unmanned Aircraft Systems (UAS) of all shapes and sizes. UAS are finding many new applications in supporting several human activities, offering solutions to many dirty, dull, and dangerous missions carried out by military and civilian users. However, limited access to the airspace is the principal barrier to the realization of the full potential that can be derived from UAS capabilities. The aim of this thesis is to support the safe integration of UAS operations, taking into account both the user's requirements and flight regulations. The main technical and operational issues, considered among the principal inhibitors to the integration and widespread acceptance of UAS, are identified and two solutions for safe UAS operations are proposed.
    A. Improving navigation performance of UAS by exploiting low-cost sensors. To enhance the performance of the low-cost and lightweight integrated navigation system based on Global Navigation Satellite System (GNSS) and Micro Electro-Mechanical Systems (MEMS) inertial sensors, an efficient calibration method for MEMS inertial sensors is required. Two solutions are proposed: 1) the innovative Thermal Compensated Zero Velocity Update (TCZUPT) filter, which embeds the compensation of thermal effects on bias in the filter itself and uses back-propagation neural networks to build the calibration function. Experimental results show that the TCZUPT filter is faster than the traditional ZUPT filter in mapping significant bias variations and presents better performance over the whole testing period; moreover, no calibration pre-processing stage is required to keep measurement drift under control, improving the accuracy, reliability, and maintainability of the processing software. 2) A redundant configuration of consumer-grade inertial sensors to obtain self-calibration of typical inertial sensor biases; the result is a significant reduction of uncertainty in attitude determination. In conclusion, both methods improve dead-reckoning performance for handling intermittent GNSS coverage.
    B. Proposing novel solutions for mission management to support the Unmanned Traffic Management (UTM) system in monitoring and coordinating the operations of a large number of UAS. Two solutions are proposed: 1) a trajectory prediction tool for small UAS, based on Learning Vector Quantization (LVQ) neural networks. By exploiting flight data collected when the UAS executes a pre-assigned flight path, the tool is able to predict the time taken to fly generic trajectory elements; moreover, being self-adaptive in constructing a mathematical model, LVQ neural networks allow creating different models for different UAS types in several environmental conditions. 2) A software tool aimed at supporting standardized decision-making procedures to identify UAS/payload configurations suitable for any type of mission that can be authorized under standing flight regulations. The proposed methods improve the management and safe operation of large-scale UAS missions, speeding up the flight authorization process by the UTM system and supporting the increasing level of autonomy in UAS operations.
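
    To make the thermal-calibration idea in point A.1 concrete, here is a minimal sketch, not the thesis' TCZUPT filter: during stationary periods the mean gyro output approximates its bias, so a small back-propagation network can learn bias as a function of sensor temperature and the learned bias can be subtracted at run time. All variable names and data shapes are assumptions.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def fit_thermal_bias_model(temps_c, stationary_gyro_z):
    """Fit bias = f(temperature) with a small back-propagation network,
    using gyro samples collected while the sensor is known to be static."""
    model = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000,
                         random_state=0)
    model.fit(np.asarray(temps_c).reshape(-1, 1),
              np.asarray(stationary_gyro_z))
    return model

def compensate(model, temps_c, raw_gyro_z):
    """Subtract the temperature-dependent bias from raw gyro measurements."""
    bias = model.predict(np.asarray(temps_c).reshape(-1, 1))
    return np.asarray(raw_gyro_z) - bias
```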

    A Line-Of-Sight Sensor Network for Wide Area Video Surveillance: Simulation and Evaluation

    Substantial performance improvement of a wide-area video surveillance network can be obtained with the addition of a Line-of-Sight sensor. The research described in this thesis shows that while the Line-of-Sight sensor cannot monitor areas with the ubiquity of video cameras alone, the combined network produces substantially fewer false alarms and superior location precision for numerous moving people than video alone. Recent progress in the fabrication of inexpensive, robust CMOS-based video cameras has triggered a new approach to wide-area surveillance of busy areas, such as modeling an airport corridor as a distributed sensor network problem. Wireless communication between these cameras and other sensors makes it more practical to deploy them in an arbitrary spatial configuration to unobtrusively monitor cooperative and non-cooperative people. The computation and communication needed to establish image registration between the cameras grow rapidly as the number of cameras increases. Computation is required to detect people in each image, establish a correspondence between people in two or more images, compute exact 3-D positions from each corresponding pair, temporally track targets in space and time, and assimilate the resultant data until thresholds have been reached to either raise an alarm or abandon further monitoring of that person. Substantial improvement can be obtained by adding a Line-of-Sight sensor as a location detection system, decoupling the detection, localization, and identification subtasks. That is, if the "where" can be answered by a location detection system, the "what" can be addressed most effectively by the video.
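
    The 3-D position computation mentioned above is, at its core, two-view triangulation from corresponding detections in calibrated cameras. The snippet below is a generic, hedged illustration of that step using OpenCV projection matrices, not the simulator developed in the thesis.

```python
import cv2
import numpy as np

def triangulate_person(P1, P2, pixel_1, pixel_2):
    """Recover a 3-D position from one corresponding detection in two
    calibrated cameras. P1 and P2 are 3x4 projection matrices; pixel_1 and
    pixel_2 are (u, v) image coordinates of the same person in each view."""
    pts1 = np.asarray(pixel_1, dtype=float).reshape(2, 1)
    pts2 = np.asarray(pixel_2, dtype=float).reshape(2, 1)
    X_h = cv2.triangulatePoints(P1, P2, pts1, pts2)  # 4x1 homogeneous point
    return (X_h[:3] / X_h[3]).ravel()                # (x, y, z) in world units
```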

    Proceedings of the 2011 Joint Workshop of Fraunhofer IOSB and Institute for Anthropomatics, Vision and Fusion Laboratory

    This book is a collection of 15 reviewed technical reports summarizing the presentations at the 2011 Joint Workshop of Fraunhofer IOSB and Institute for Anthropomatics, Vision and Fusion Laboratory. The covered topics include image processing, optical signal processing, visual inspection, pattern recognition and classification, human-machine interaction, world and situation modeling, autonomous system localization and mapping, information fusion, and trust propagation in sensor networks

    Exploring space situational awareness using neuromorphic event-based cameras

    The orbits around Earth are a limited natural resource, one that hosts a vast range of vital space-based systems used by commercial industry, civil organisations, and national defence. The availability of this space resource is rapidly depleting due to the ever-growing presence of space debris and rampant overcrowding, especially in the limited and highly desirable slots in geosynchronous orbit. The field of Space Situational Awareness encompasses tasks aimed at mitigating these hazards to on-orbit systems through the monitoring of satellite traffic. Essential to this task is the collection of accurate and timely observation data. This thesis explores the use of a novel sensor paradigm to optically collect and process sensor data to enhance and improve space situational awareness tasks. Solving this issue is critical to ensure that we can continue to utilise the space environment in a sustainable way. However, these tasks pose significant engineering challenges that involve the detection and characterisation of faint, highly distant, and high-speed targets. Recent advances in neuromorphic engineering have led to the availability of high-quality neuromorphic event-based cameras that provide a promising alternative to the conventional cameras used in space imaging. These cameras offer the potential to improve the capabilities of existing space tracking systems and have been shown to detect and track satellites, or 'Resident Space Objects', at low data rates, high temporal resolutions, and in conditions typically unsuitable for conventional optical cameras. This thesis presents a thorough exploration of neuromorphic event-based cameras for space situational awareness tasks and establishes a rigorous foundation for event-based space imaging. The work conducted in this project demonstrates how to enable event-based space imaging systems that serve the goals of space situational awareness by providing accurate and timely information on the space domain. By developing and implementing event-based processing techniques, the asynchronous operation, high temporal resolution, and dynamic range of these novel sensors are leveraged to provide low-latency target acquisition and rapid reaction to challenging satellite tracking scenarios. The algorithms and experiments developed in this thesis study the properties and trade-offs of event-based space imaging and provide comparisons with traditional observing methods and conventional frame-based sensors. The outcomes of this thesis demonstrate the viability of event-based cameras for use in tracking and space imaging tasks, and thereby contribute to the growing efforts of the international space situational awareness community and the development of event-based technology in astronomy and space science applications.
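
    As a purely illustrative example of the kind of low-level event processing involved, the sketch below bins events from a short time window onto the pixel grid and flags unusually active pixels as candidate point targets. The event layout, sensor resolution, and thresholds are assumptions; the detection and tracking methods developed in the thesis are considerably richer.

```python
import numpy as np

def detect_candidates(events, sensor_shape=(260, 346), window_us=20000,
                      min_count=5):
    """Bin events from the most recent `window_us` microseconds onto the
    pixel grid and report pixels with an unusually high event count as
    candidate point targets. `events` is an array of (t_us, x, y, polarity)
    rows -- an assumed layout, not a specific camera driver format; the
    polarity column is ignored here."""
    events = np.asarray(events)
    t_end = events[:, 0].max()
    recent = events[events[:, 0] >= t_end - window_us]

    counts = np.zeros(sensor_shape, dtype=np.int32)
    np.add.at(counts, (recent[:, 2].astype(int), recent[:, 1].astype(int)), 1)

    ys, xs = np.nonzero(counts >= min_count)
    return list(zip(xs.tolist(), ys.tolist()))  # candidate (x, y) pixels
```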