Dynamic Motion Modelling for Legged Robots
An accurate motion model is an important component in modern-day robotic
systems, but building such a model for a complex system often requires an
appreciable amount of manual effort. In this paper we present a motion model
representation, the Dynamic Gaussian Mixture Model (DGMM), that alleviates the
need to manually design the form of a motion model, and provides a direct means
of incorporating auxiliary sensory data into the model. This representation and
its accompanying algorithms are validated experimentally using an 8-legged
kinematically complex robot, as well as a standard benchmark dataset. The
presented method not only learns the robot's motion model, but also improves
the model's accuracy by incorporating information about the terrain surrounding
the robot.
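The abstract does not give the DGMM's equations, but the underlying idea of predicting motion by conditioning a Gaussian mixture over the joint (command, terrain, displacement) space can be sketched as below. All variable names, the two-component mixture, and its parameter values are illustrative assumptions, not the paper's:

```python
import numpy as np

def conditional_mean(x, weights, means, covs, d_in):
    """Condition each Gaussian component on the observed inputs x (the
    first d_in dimensions) and return the mixture's expected output."""
    num = np.zeros(means.shape[1] - d_in)
    den = 0.0
    for w, mu, S in zip(weights, means, covs):
        mu_i, mu_o = mu[:d_in], mu[d_in:]
        S_ii, S_oi = S[:d_in, :d_in], S[d_in:, :d_in]
        diff = x - mu_i
        # responsibility of this component for the observed inputs
        r = w * np.exp(-0.5 * diff @ np.linalg.solve(S_ii, diff))
        r /= np.sqrt((2 * np.pi) ** d_in * np.linalg.det(S_ii))
        # conditional mean of the output dimensions given the inputs
        num += r * (mu_o + S_oi @ np.linalg.solve(S_ii, diff))
        den += r
    return num / den

# Two illustrative components: flat ground vs. rough terrain, each over
# the joint space (command, terrain feature, resulting displacement).
weights = np.array([0.5, 0.5])
means = np.array([[1.0, 0.0, 0.9],
                  [1.0, 1.0, 0.5]])
covs = np.array([np.eye(3) * 0.05, np.eye(3) * 0.05])

# Observed command 1.0 on rough terrain (feature 1.0): the prediction is
# dominated by the second component, so displacement comes out near 0.5.
pred = conditional_mean(np.array([1.0, 1.0]), weights, means, covs, d_in=2)
```

Conditioning the joint mixture on whatever inputs happen to be available is also how auxiliary sensory data (here, the terrain feature) enters the prediction without changing the model's form.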
Look No Further: Adapting the Localization Sensory Window to the Temporal Characteristics of the Environment
Many localization algorithms use a spatiotemporal window of sensory
information in order to recognize spatial locations, and the length of this
window is often a sensitive parameter that must be tuned to the specifics of
the application. This letter presents a general method for environment-driven
variation of the length of the spatiotemporal window based on searching for the
most significant localization hypothesis, to use as much context as is
appropriate but not more. We evaluate this approach on benchmark datasets using
visual and Wi-Fi sensor modalities and a variety of sensory comparison
front-ends under in-order and out-of-order traversals of the environment. Our
results show that the system greatly reduces the maximum distance traveled
without localization compared to a fixed-length approach while achieving
competitive localization accuracy, and our proposed method achieves this
performance without deployment-time tuning.
Comment: Pre-print of article appearing in 2017 IEEE Robotics and Automation Letters. v2: incorporated reviewer feedback.
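The search over window lengths for the most significant localization hypothesis might look roughly like the sketch below. The significance measure (gap between the best and second-best match scores) and the toy scorer are assumptions for illustration, not the letter's actual front-end:

```python
import numpy as np

def best_window(score_fn, lengths):
    """For each candidate window length, score all location hypotheses and
    keep the length whose best hypothesis is most significant, measured
    here as the gap between the best and second-best scores."""
    best = None
    for n in lengths:
        scores = score_fn(n)                 # one score per map location
        top2 = np.sort(scores)[-2:]
        significance = top2[1] - top2[0]     # best minus runner-up
        if best is None or significance > best[0]:
            best = (significance, n, int(np.argmax(scores)))
    _, length, location = best
    return length, location

# Toy scorer: adding context sharpens the peak at location 3, but the
# benefit saturates at a window of length 4.
def toy_scores(n):
    base = np.array([0.2, 0.3, 0.25, 0.6, 0.35])
    return base * min(n, 4)

# The search settles on length 4: as much context as helps, but no more.
length, location = best_window(toy_scores, lengths=[1, 2, 3, 4, 5])
```

Because the window length is chosen per query from the scores themselves, no fixed length has to be tuned to the deployment environment.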
Transfer Learning for Device Fingerprinting with Application to Cognitive Radio Networks
Primary user emulation (PUE) attacks are an emerging threat to cognitive
radio (CR) networks in which malicious users imitate the signals of primary
users (PUs) to limit the access of secondary users (SUs). Ascertaining the identity
of the devices is a key technical challenge that must be overcome to thwart the
threat of PUE attacks. Typically, detection of PUE attacks is done by
inspecting the signals coming from all the devices in the system, and then
using these signals to form unique fingerprints for each device. Current
detection and fingerprinting approaches require certain conditions to hold in
order to effectively detect attackers. Such conditions include the need for a
sufficient amount of fingerprint data for users or the existence of both the
attacker and the victim PU within the same time frame. These conditions are
necessary because current methods lack the ability to learn the behavior of
both SUs and PUs with time. In this paper, a novel transfer learning (TL)
approach is proposed, in which abstract knowledge about PUs and SUs is
transferred from past time frames to improve the detection process at future
time frames. The proposed approach extracts a high-level representation of the
environment at every time frame. This high-level information is accumulated to
form an abstract knowledge database. The CR system then utilizes this database
to accurately detect PUE attacks even if an insufficient amount of fingerprint
data is available at the current time frame. The dynamic structure of the
proposed approach uses the final detection decisions to update the abstract
knowledge database for future runs. Simulation results show that the proposed
method improves detection performance by an average of 3.5% even when only 10%
of the information in the past knowledge base is relevant to the current environment
signals.
Comment: 6 pages, 3 figures, in Proceedings of IEEE 26th International Symposium on Personal, Indoor and Mobile Radio Communications (PIMRC), Hong Kong, P.R. China, Aug. 2015
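One plausible reading of the accumulated abstract knowledge database is a set of per-device fingerprint prototypes updated across time frames and consulted when current-frame data is scarce. The class below is a minimal sketch under that assumption; its names, the running-mean blending rule, and the nearest-prototype classifier are not from the paper:

```python
import numpy as np

class KnowledgeBase:
    """Illustrative abstract knowledge database: one running-mean
    fingerprint prototype per device, carried across time frames."""

    def __init__(self):
        self.prototypes = {}   # device id -> prototype fingerprint

    def update(self, device, fingerprint, rate=0.2):
        """Fold a confirmed fingerprint from the current frame's final
        detection decision into the stored prototype."""
        old = self.prototypes.get(device)
        self.prototypes[device] = (fingerprint if old is None
                                   else (1 - rate) * old + rate * fingerprint)

    def classify(self, signal):
        """Label a new signal by its nearest stored prototype; a large
        distance to the claimed PU prototype would flag a PUE attack."""
        return min(self.prototypes,
                   key=lambda d: np.linalg.norm(signal - self.prototypes[d]))

# Prototypes learned in past frames stand in for scarce current data.
kb = KnowledgeBase()
kb.update("PU", np.array([1.0, 0.0]))
kb.update("SU1", np.array([0.0, 1.0]))
label = kb.classify(np.array([0.9, 0.1]))   # close to the PU prototype
```

Feeding final detection decisions back through `update` mirrors the dynamic structure the abstract describes, where each frame's outcome refines the knowledge used in future runs.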