Towards modeling complex robot training tasks through system identification
Previous research has shown that sensor-motor tasks in mobile robotics applications can be modelled automatically, using NARMAX system identification, where the sensory perception of the robot is mapped to the desired motor commands using non-linear polynomial functions, resulting in a tight coupling between sensing and acting: the robot responds directly to the sensor stimuli, without internal states or memory. However, competences such as sequences of actions, where actions depend on each other, require memory and thus a representation of state. In these cases a simple direct link between sensory perception and motor commands may not be enough to accomplish the desired tasks. The contribution to knowledge of this paper is to show how fundamental, simple NARMAX
models of behaviour can be used in a bootstrapping process to generate complex behaviours that were so far beyond reach. We argue that as the complexity of the task increases, it is important to estimate the current state of the robot and integrate this information into the system identification process. To achieve this we propose a novel method which relates distinctive locations in
the environment to the state of the robot, using an unsupervised clustering algorithm. Once we estimate the current state of the robot accurately, we combine the state information with the perception of the robot through a bootstrapping method to generate more complex robot tasks: we obtain a polynomial model which models the complex task as a function of predefined low-level sensor-motor controllers and raw sensory data. The proposed method has been used to teach Scitos G5 mobile robots a number of complex tasks, such as advanced obstacle avoidance or complex route learning.
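The state-estimation step described above can be sketched as unsupervised clustering of logged sensor snapshots, with each cluster centre standing in for a distinctive location. The abstract does not name the clustering algorithm, so this minimal sketch assumes plain k-means; the 2-D "sensor space" and all numbers are illustrative only.

```python
import numpy as np

def cluster_states(sensor_logs, k=3, iters=50, seed=0):
    """Toy k-means over logged sensor snapshots: each cluster centre
    stands for a distinctive location, i.e. a candidate robot state."""
    rng = np.random.default_rng(seed)
    X = np.asarray(sensor_logs, dtype=float)
    centres = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # assign every snapshot to its nearest centre
        d = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # move each centre to the mean of its members
        for j in range(k):
            if np.any(labels == j):
                centres[j] = X[labels == j].mean(axis=0)
    return labels, centres

# Three well-separated hypothetical "locations" in a toy 2-D sensor space
rng = np.random.default_rng(1)
logs = np.vstack([rng.normal(loc, 0.1, size=(20, 2))
                  for loc in ([0, 0], [5, 0], [0, 5])])
labels, centres = cluster_states(logs, k=3)
```

The cluster label of the most recent snapshot would then be fed, alongside the raw sensor readings, into the bootstrapped polynomial model.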
An application of lyapunov stability analysis to improve the performance of NARMAX models
Previously we presented a novel approach to program a robot controller based on system identification and robot training techniques. The proposed method works in two stages: first, the programmer demonstrates the desired behaviour to the robot by driving it manually in the target environment. During this run, the sensory perception and the desired velocity commands of the robot are logged. Having thus obtained training data, we model the relationship between sensory readings and the motor commands of the robot using ARMAX/NARMAX models and system identification techniques. These produce linear or non-linear polynomials which can be formally analysed, as well as used in place of “traditional” robot control code.
In this paper we focus our attention on how the mathematical analysis of NARMAX models can be used to understand the robot’s control actions, to formulate hypotheses and to improve the robot’s behaviour. One main objective behind this approach is to avoid trial-and-error refinement of robot code. Instead, we seek to obtain a reliable design process, where program design decisions are
based on the mathematical analysis of the model describing how the robot interacts with its environment to achieve the desired behaviour. We demonstrate this procedure through the analysis of a particular task in mobile robotics: door traversal.
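The second stage of the method (fitting a polynomial model to the logged data) can be illustrated with a toy least-squares fit of a linear ARMAX-style structure. This is a deliberately simplified sketch: the paper's models are non-linear polynomials and include noise terms, and the lag orders, data, and system below are invented for illustration.

```python
import numpy as np

def fit_armax(u, y, na=2, nb=2):
    """Least-squares fit of a linear ARMAX-style model
        y[t] = sum_i a[i]*y[t-1-i] + sum_j b[j]*u[t-1-j]
    from logged sensor readings u and motor commands y.
    (The noise terms of a full ARMAX model are omitted here.)"""
    n = max(na, nb)
    rows, targets = [], []
    for t in range(n, len(y)):
        past_y = [y[t - 1 - i] for i in range(na)]
        past_u = [u[t - 1 - j] for j in range(nb)]
        rows.append(past_y + past_u)
        targets.append(y[t])
    theta, *_ = np.linalg.lstsq(np.array(rows), np.array(targets), rcond=None)
    return theta  # [a_0 .. a_{na-1}, b_0 .. b_{nb-1}]

# Toy run: a system that truly follows y[t] = 0.5*y[t-1] + 0.3*u[t-1]
rng = np.random.default_rng(0)
u = rng.normal(size=200)
y = np.zeros(200)
for t in range(1, 200):
    y[t] = 0.5 * y[t - 1] + 0.3 * u[t - 1]
theta = fit_armax(u, y, na=1, nb=1)  # recovers [0.5, 0.3]
```

Because the fitted model is an explicit polynomial, its coefficients can be inspected and analysed formally, which is what enables the Lyapunov-style analysis the paper describes.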
Gaia Eclipsing Binary and Multiple Systems. A study of detectability and classification of eclipsing binaries with Gaia
In the new era of large-scale astronomical surveys, automated methods of
analysis and classification of bulk data are a fundamental tool for fast and
efficient production of deliverables. This becomes ever more pressing as we
enter the Gaia era. We investigate the potential detectability of eclipsing
binaries with Gaia using a data set of all Kepler eclipsing binaries sampled
with Gaia cadence and folded with the Kepler period. The performance of fitting
methods is evaluated with comparison to real Kepler data parameters and a
classification scheme is proposed for the potentially detectable sources based
on the geometry of the light curve fits. The polynomial chain (polyfit) and
two-Gaussian models are used for light curve fitting of the data set.
Classification is performed with a combination of the t-SNE (t-distributed
Stochastic Neighbor Embedding) and DBSCAN (Density-Based Spatial Clustering of
Applications with Noise) algorithms. We find that approximately 68% of Kepler
Eclipsing Binary sources are potentially detectable by Gaia when folded with
the Kepler period and propose a classification scheme of the detectable sources
based on the morphological type indicative of the light curve, with subclasses
that reflect the properties of the fitted model (presence and visibility of
eclipses, their width, depth, etc.).
Comment: 9 pages, 18 figures, accepted for publication in Astronomy & Astrophysics
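The folding step the abstract relies on (sampling a Kepler light curve at Gaia cadence and folding it on the Kepler period) can be sketched in a few lines. The period, epochs, eclipse shape, and the low-order polynomial standing in for the paper's polyfit/two-Gaussian models are all made up for illustration.

```python
import numpy as np

def phase_fold(times, period, t0=0.0):
    """Fold observation times on a known period into phases in [0, 1)."""
    return ((np.asarray(times) - t0) / period) % 1.0

# Toy light curve: a box-shaped eclipse of depth 0.2 near phase 0.5,
# sampled at sparse, irregular (Gaia-like) epochs
rng = np.random.default_rng(2)
period = 2.47  # days; hypothetical "Kepler period"
times = np.sort(rng.uniform(0, 100, size=80))
phase = phase_fold(times, period)
flux = np.where(np.abs(phase - 0.5) < 0.05, 0.8, 1.0)

# Crude stand-in for the polyfit / two-Gaussian light-curve models:
# a single low-order polynomial fitted to the folded curve
coeffs = np.polyfit(phase, flux, deg=6)
model = np.polyval(coeffs, phase)
```

In the paper's pipeline, geometric properties of fits like this (eclipse presence, width, depth) feed the t-SNE + DBSCAN classification stage.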
NEMESYS: Enhanced Network Security for Seamless Service Provisioning in the Smart Mobile Ecosystem
As a consequence of the growing popularity of smart mobile devices, mobile
malware is clearly on the rise, with attackers targeting valuable user
information and exploiting vulnerabilities of the mobile ecosystems. With the
emergence of large-scale mobile botnets, smartphones can also be used to launch
attacks on mobile networks. The NEMESYS project will develop novel security
technologies for seamless service provisioning in the smart mobile ecosystem,
and improve mobile network security through better understanding of the threat
landscape. NEMESYS will gather and analyze information about the nature of
cyber-attacks targeting mobile users and the mobile network so that appropriate
counter-measures can be taken. We will develop a data collection infrastructure
that incorporates virtualized mobile honeypots and a honeyclient, to gather,
detect and provide early warning of mobile attacks and better understand the
modus operandi of cyber-criminals that target mobile devices. By correlating
the extracted information with the known patterns of attacks from wireline
networks, we will reveal and identify trends in the way that cyber-criminals
launch attacks against mobile devices.
Comment: Accepted for publication in Proceedings of the 28th International Symposium on Computer and Information Sciences (ISCIS'13); 9 pages; 1 figure
Craquelure as a Graph: Application of Image Processing and Graph Neural Networks to the Description of Fracture Patterns
Cracks on a painting are not a defect but an inimitable signature of an
artwork which can be used for origin examination, aging monitoring, damage
identification, and even forgery detection. This work presents the development
of a new methodology and corresponding toolbox for the extraction and
characterization of information from an image of a craquelure pattern.
The proposed approach processes the craquelure network as a graph. The graph
representation captures the network structure via mutual organization of
junctions and fractures. Furthermore, it is invariant to any geometrical
distortions. At the same time, our tool extracts the properties of each node
and edge individually, which allows us to characterize the pattern statistically.
We illustrate the benefits of the graph representation and of the statistical
features individually, using a novel Graph Neural Network and hand-crafted
descriptors, respectively. However, we also show that the best performance is achieved
when both techniques are merged into one framework. We perform experiments on
the dataset for paintings' origin classification and demonstrate that our
approach outperforms existing techniques by a large margin.
Comment: Published in ICCV 2019 Workshop
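The graph representation the abstract describes (junctions as nodes, fractures as edges, plus per-node and per-edge properties) can be sketched with plain Python containers. The junction coordinates and fracture list below are hypothetical, standing in for the output of a real crack-extraction step.

```python
import math
from collections import defaultdict

# Hypothetical extracted junctions (node -> image coordinates) and
# fractures (edges between junctions); in a real pipeline these would
# come from skeletonising the crack image.
junctions = {0: (0, 0), 1: (10, 0), 2: (10, 8), 3: (0, 8), 4: (5, 4)}
fractures = [(0, 4), (1, 4), (2, 4), (3, 4), (0, 1)]

# Graph representation: mutual organisation of junctions and fractures
adj = defaultdict(set)
for a, b in fractures:
    adj[a].add(b)
    adj[b].add(a)

# Hand-crafted statistical descriptors of the pattern; degrees are
# invariant to geometric distortion, lengths are not
degrees = [len(adj[n]) for n in junctions]
lengths = [math.dist(junctions[a], junctions[b]) for a, b in fractures]
features = {
    "n_junctions": len(junctions),
    "n_fractures": len(fractures),
    "mean_degree": sum(degrees) / len(degrees),
    "mean_length": sum(lengths) / len(lengths),
}
```

A feature vector like `features` could feed a conventional classifier, while the adjacency structure itself is what a Graph Neural Network would consume.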
Bayesian changepoint analysis for atomic force microscopy and soft material indentation
Material indentation studies, in which a probe is brought into controlled
physical contact with an experimental sample, have long been a primary means by
which scientists characterize the mechanical properties of materials. More
recently, the advent of atomic force microscopy, which operates on the same
fundamental principle, has in turn revolutionized the nanoscale analysis of
soft biomaterials such as cells and tissues. This paper addresses the
inferential problems associated with material indentation and atomic force
microscopy, through a framework for the changepoint analysis of pre- and
post-contact data that is applicable to experiments across a variety of
physical scales. A hierarchical Bayesian model is proposed to account for
experimentally observed changepoint smoothness constraints and measurement
error variability, with efficient Monte Carlo methods developed and employed to
realize inference via posterior sampling for parameters such as Young's
modulus, a key quantifier of material stiffness. These results are the first to
provide the materials science community with rigorous inference procedures and
uncertainty quantification, via optimized and fully automated high-throughput
algorithms, implemented as the publicly available software package BayesCP. To
demonstrate the consistent accuracy and wide applicability of this approach,
results are shown for a variety of data sets from both macro- and
micro-materials experiments--including silicone, neurons, and red blood
cells--conducted by the authors and others.
Comment: 20 pages, 6 figures; submitted for publication
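The core changepoint idea can be illustrated with a non-Bayesian toy: a least-squares scan for a single contact point between a flat pre-contact baseline and a linear post-contact response. This is only a point-estimate sketch under invented data, not the paper's hierarchical Bayesian model, its smoothness constraints, or the BayesCP package.

```python
import numpy as np

def fit_changepoint(x, y):
    """Scan all candidate changepoints and keep the one minimising the
    total squared error of a flat-then-linear piecewise fit."""
    best_c, best_sse = None, np.inf
    for c in range(2, len(x) - 2):
        pre = y[:c] - y[:c].mean()                     # flat baseline
        slope, intercept = np.polyfit(x[c:], y[c:], 1)  # post-contact line
        post = y[c:] - (slope * x[c:] + intercept)
        sse = np.sum(pre**2) + np.sum(post**2)
        if sse < best_sse:
            best_c, best_sse = c, sse
    return best_c

# Toy indentation trace: zero force until contact at x = 5, then a ramp
rng = np.random.default_rng(3)
x = np.linspace(0, 10, 101)
y = np.where(x < 5, 0.0, 2.0 * (x - 5)) + rng.normal(0, 0.01, size=x.size)
c = fit_changepoint(x, y)  # lands near the true contact point x = 5
```

In the Bayesian treatment, this single estimate is replaced by a posterior distribution over the contact point and over quantities derived from the post-contact response, such as Young's modulus.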