New Generation of Instrumented Ranges: Enabling Automated Performance Analysis
Military training conducted on physical ranges that match a unit’s future operational environment provides
an invaluable experience. Today, to conduct a training exercise while ensuring a unit’s performance is
closely observed, evaluated, and reported on in an After Action Review, the unit requires a number of
instructors to accompany the different elements. Training organized on ranges for urban warfighting brings
an additional level of complexity—the high level of occlusion typical for these environments multiplies the
number of evaluators needed. While the units have great need for such training opportunities, they may not
have the necessary human resources to conduct them successfully. In this paper we report on our US
Navy/ONR-sponsored project aimed at a new generation of instrumented ranges, and the early results we
have achieved. We suggest a radically different concept: instead of recording multiple video streams that
need to be reviewed and evaluated by a number of instructors, our system will focus on capturing dynamic
individual warfighter pose data and performing automated performance evaluation. We will use an in situ
network of automatically-controlled pan-tilt-zoom video cameras and personal position and orientation
sensing devices. Our system will record video, reconstruct dynamic 3D individual poses, analyze,
recognize events, evaluate performances, generate reports, provide real-time free exploration of recorded
data, and even allow the user to generate ‘what-if’ scenarios that were never recorded. The most direct
benefit for an individual unit will be the ability to conduct training with fewer human resources, while
having a more quantitative account of their performance (dispersion across the terrain, ‘weapon flagging’
incidents, number of patrols conducted). The instructors will have immediate feedback on some elements
of the unit’s performance. Having data sets for multiple units will enable historical trend analysis, thus
providing new insights and benefits for the entire service.
Office of Naval Research
Past, Present, and Future of Simultaneous Localization And Mapping: Towards the Robust-Perception Age
Simultaneous Localization and Mapping (SLAM) consists of the concurrent
construction of a model of the environment (the map), and the estimation of the
state of the robot moving within it. The SLAM community has made astonishing
progress over the last 30 years, enabling large-scale real-world applications,
and witnessing a steady transition of this technology to industry. We survey
the current state of SLAM. We start by presenting what is now the de-facto
standard formulation for SLAM. We then review related work, covering a broad
set of topics including robustness and scalability in long-term mapping, metric
and semantic representations for mapping, theoretical performance guarantees,
active SLAM and exploration, and other new frontiers. This paper simultaneously
serves as a position paper and a tutorial for those who use SLAM. By
looking at the published research with a critical eye, we delineate open
challenges and new research issues that still deserve careful scientific
investigation. The paper also contains the authors' take on two questions that
often animate discussions during robotics conferences: Do robots need SLAM? and
Is SLAM solved?
Proceedings of the 2009 Joint Workshop of Fraunhofer IOSB and Institute for Anthropomatics, Vision and Fusion Laboratory
The joint workshop of the Fraunhofer Institute of Optronics, System Technologies and Image Exploitation IOSB, Karlsruhe, and the Vision and Fusion Laboratory (Institute for Anthropomatics, Karlsruhe Institute of Technology (KIT)) has been organized annually since 2005 with the aim of reporting on the latest research and development findings of the doctoral students of both institutions. This book provides a collection of 16 technical reports on the research results presented at the 2009 workshop.
Proceedings of the 2011 Joint Workshop of Fraunhofer IOSB and Institute for Anthropomatics, Vision and Fusion Laboratory
This book is a collection of 15 reviewed technical reports summarizing the presentations at the 2011 Joint Workshop of Fraunhofer IOSB and Institute for Anthropomatics, Vision and Fusion Laboratory. The covered topics include image processing, optical signal processing, visual inspection, pattern recognition and classification, human-machine interaction, world and situation modeling, autonomous system localization and mapping, information fusion, and trust propagation in sensor networks.