The POOL Data Storage, Cache and Conversion Mechanism
The POOL data storage mechanism is intended to satisfy the needs of the LHC
experiments to store and analyze the data from the detector response to
particle collisions at the LHC proton-proton collider. Both the data rate and
the data volumes will differ greatly from past experience. The POOL data
storage mechanism is designed to cope with the experiments' requirements by
applying a flexible, multi-technology data persistency mechanism. The
technology-independent approach is flexible enough to adopt new technologies,
takes advantage of existing schema evolution mechanisms, and allows users to
access data in a technology-independent way. The framework consists of several
components, which can be individually adopted and integrated into existing
experiment frameworks.

Comment: Talk from the 2003 Computing in High Energy and Nuclear Physics
(CHEP03), La Jolla, CA, USA, March 2003, 5 pages, PDF, 6 figures. PSN MOKT00
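The multi-technology persistency idea can be sketched as a thin service that routes all reads and writes through a technology-neutral backend interface, so client code never names a storage technology and new backends can be plugged in later. The class and method names below are invented for illustration and are not POOL's actual API.

```python
from abc import ABC, abstractmethod

class StorageBackend(ABC):
    """Technology-specific storage behind a common interface (hypothetical)."""
    @abstractmethod
    def write(self, key, obj): ...
    @abstractmethod
    def read(self, key): ...

class InMemoryBackend(StorageBackend):
    """Trivial dict-backed backend, standing in for e.g. a file- or
    database-based technology."""
    def __init__(self):
        self._store = {}
    def write(self, key, obj):
        self._store[key] = obj
    def read(self, key):
        return self._store[key]

class PersistencyService:
    """Routes I/O through whichever registered backend is active, so
    user code stays technology-independent."""
    def __init__(self):
        self._backends = {}
        self._active = None
    def register(self, name, backend):
        self._backends[name] = backend
    def use(self, name):
        self._active = self._backends[name]
    def store(self, key, obj):
        self._active.write(key, obj)
    def load(self, key):
        return self._active.read(key)

svc = PersistencyService()
svc.register("memory", InMemoryBackend())
svc.use("memory")
svc.store("event/1", {"tracks": 42})
```

Swapping technologies then reduces to registering another `StorageBackend` implementation and calling `use()`, leaving the calling code untouched.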
Redundant neural vision systems: competing for collision recognition roles
The ability to detect collisions is vital for future robots that interact with humans in complex visual environments. Lobula giant movement detectors (LGMD) and directional selective neurons (DSNs) are two types of identified neurons found in the visual pathways of insects such as locusts. Recent modelling studies have shown that either the LGMD or grouped DSNs can be tuned for collision recognition. In both biological and artificial vision systems, however, it is not clear which of the two should play the collision recognition role, or how the two types of specialized visual neurons could function together. In this modelling study, we compared the competence of the LGMD and the DSNs, and also investigated the cooperation of the two neural vision systems for collision recognition via artificial evolution. We implemented three types of collision recognition neural subsystems in each individual agent: the LGMD, the DSNs, and a hybrid system combining the two. A switch gene determines which of the three redundant neural subsystems plays the collision recognition role. We found that, in both robotic and driving environments, the LGMD built up its ability for collision recognition quickly and robustly, thereby reducing the chance of the other types of neural network playing the same role. The results suggest that the LGMD neural network could be the ideal model to realize in hardware for collision recognition.
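The switch-gene mechanism can be sketched as follows. The three detector functions are toy placeholders, not the actual spiking LGMD/DSN models; all names and thresholds are invented for illustration.

```python
# Toy stand-ins for the three collision-recognition subsystems.
def lgmd(frame_expansion):
    """LGMD placeholder: responds to rapid image expansion (looming)."""
    return frame_expansion > 0.5

def dsn(direction_votes):
    """Grouped-DSN placeholder: integrates directional-motion votes."""
    return sum(direction_votes) > 2

def hybrid(frame_expansion, direction_votes):
    """Hybrid subsystem: combines the LGMD and DSN outputs."""
    return lgmd(frame_expansion) or dsn(direction_votes)

def decode(genome, frame_expansion, direction_votes):
    """The first gene is the 'switch gene': it selects which of the
    three redundant subsystems plays the collision-recognition role.
    Under artificial evolution, agents whose active subsystem avoids
    collisions reproduce, so the switch value itself is selected."""
    switch = genome[0] % 3
    if switch == 0:
        return lgmd(frame_expansion)
    elif switch == 1:
        return dsn(direction_votes)
    return hybrid(frame_expansion, direction_votes)
```

In the study's terms, the finding is that genomes carrying the LGMD value of the switch gene come to dominate the population in both test environments.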
The Astrophysical Multipurpose Software Environment
We present the open source Astrophysical Multi-purpose Software Environment
(AMUSE, www.amusecode.org), a component library for performing astrophysical
simulations involving different physical domains and scales. It couples
existing codes within a Python framework based on a communication layer using
MPI. The interfaces are standardized for each domain and their implementation
based on MPI guarantees that the whole framework is well-suited for distributed
computation. It includes facilities for unit handling and data storage.
Currently it includes codes for gravitational dynamics, stellar evolution,
hydrodynamics and radiative transfer. Within each domain the interfaces to the
codes are as similar as possible. We describe the design and implementation of
AMUSE, as well as the main components and community codes currently supported
and we discuss the code interactions facilitated by the framework.
Additionally, we demonstrate how AMUSE can be used to resolve complex
astrophysical problems by presenting example applications.

Comment: 23 pages, 25 figures, accepted for publication in A&A
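Two framework facilities mentioned above, unit handling and standardized per-domain interfaces, can be illustrated with a minimal sketch. The classes below mimic the spirit of AMUSE but are not its real API (see www.amusecode.org for that).

```python
class Quantity:
    """Minimal unit-tagged value, in the spirit of AMUSE's unit handling
    (the real framework converts between unit systems automatically)."""
    def __init__(self, value, unit):
        self.value, self.unit = value, unit
    def __add__(self, other):
        # Refuse to silently mix incompatible units.
        assert self.unit == other.unit, "unit mismatch"
        return Quantity(self.value + other.value, self.unit)
    def __repr__(self):
        return f"{self.value} {self.unit}"

class GravitationalDynamics:
    """Hypothetical standardized interface for one physical domain.
    Each community code in that domain would implement the same
    methods, making the codes interchangeable within the framework."""
    def add_particle(self, mass, position):
        raise NotImplementedError
    def evolve_model(self, t_end):
        raise NotImplementedError

total = Quantity(1.0, "MSun") + Quantity(0.5, "MSun")
```

In the real framework, each such interface is backed by an MPI channel to the legacy code's process, which is what makes distributed execution natural.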
Two-Timescale Learning Using Idiotypic Behaviour Mediation For A Navigating Mobile Robot
A combined Short-Term Learning (STL) and Long-Term Learning (LTL) approach to
solving mobile-robot navigation problems is presented and tested in both the
real and virtual domains. The LTL phase consists of rapid simulations that use
a Genetic Algorithm to derive diverse sets of behaviours, encoded as variable
sets of attributes, and the STL phase is an idiotypic Artificial Immune System.
Results from the LTL phase show that sets of behaviours develop very rapidly,
and significantly greater diversity is obtained when multiple autonomous
populations are used, rather than a single one. The architecture is assessed
under various scenarios, including removal of the LTL phase and switching off
the idiotypic mechanism in the STL phase. The comparisons provide substantial
evidence that the best option is the inclusion of both the LTL phase and the
idiotypic system. In addition, this paper shows that structurally different
environments can be used for the two phases without compromising
transferability.

Comment: 40 pages, 12 tables, Journal of Applied Soft Computing
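The idiotypic selection used in the STL phase can be sketched loosely after Farmer's idiotypic-network model: each behaviour's concentration is stimulated by how well it matches the current environmental state and suppressed by similar rival behaviours, so the winner is not necessarily the behaviour with the best raw match. All coefficients below are invented for illustration.

```python
def idiotypic_select(match, similarity, conc, k_stim=0.6, k_supp=0.4):
    """One toy idiotypic update step.

    match[i]         -- how well behaviour i matches the current state
    similarity[i][j] -- how alike behaviours i and j are
    conc[i]          -- current concentration of behaviour i

    Returns the index of the winning behaviour and the updated
    concentrations. Coefficients are illustrative, not from the paper.
    """
    n = len(conc)
    new = []
    for i in range(n):
        # Suppression from similar, highly concentrated rivals.
        supp = sum(similarity[i][j] * conc[j] for j in range(n) if j != i)
        new.append(conc[i] + k_stim * match[i] - k_supp * supp)
    winner = max(range(n), key=lambda i: new[i])
    return winner, new
```

The point of the mediation is that a behaviour surrounded by close rivals is damped, which encourages the diverse behaviour sets produced by the LTL phase to take turns rather than one dominating blindly.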
Evidence for Hydrodynamic Evolution in Proton-Proton Scattering at LHC Energies
In proton-proton scattering at LHC energies, large numbers of elementary
scatterings will contribute significantly, and the corresponding high
multiplicity events will be of particular interest. Elementary scatterings are
parton ladders, identified with color flux tubes. In high multiplicity events,
many of these flux tubes are produced in the same space region, creating high
energy densities. We argue that there are good reasons to employ the
successful procedure used for heavy-ion collisions: matter is assumed to
thermalize quickly, such that the energy from the flux tubes can be taken as
the initial condition for a hydrodynamic expansion. This scenario receives
spectacular support from very recent results on Bose-Einstein correlations in
proton-proton scattering at 900 GeV at the LHC.

Comment: 11 pages, 20 figures
Star cluster ecology IVa: Dissection of an open star cluster---photometry
The evolution of star clusters is studied using N-body simulations in which
the evolution of single stars and binaries is taken self-consistently into
account. Initial conditions are chosen to represent relatively young Galactic
open clusters, such as the Pleiades, Praesepe and the Hyades. The calculations
include a realistic mass function, primordial binaries and the external
potential of the parent Galaxy. Our model clusters are generally significantly
flattened in the Galactic tidal field, and dissolve before deep core collapse
occurs. The binary fraction decreases initially due to the destruction of soft
binaries, but increases later because lower mass single stars escape more
easily than the more massive binaries. At late times, the cluster core is quite
rich in giants and white dwarfs. There is no evidence for preferential
evaporation of old white dwarfs; on the contrary, the white dwarfs that form
are likely to remain in the cluster. Stars tend to escape from the cluster through
the first and second Lagrange points, in the direction of and away from the
Galactic center. Mass segregation manifests itself in our models well within an
initial relaxation time. As expected, giants and white dwarfs are much more
strongly affected by mass segregation than main-sequence stars. Open clusters
are dynamically rather inactive. However, the combined effect of stellar mass
loss and evaporation of stars from the cluster potential drives its dissolution
on a much shorter timescale than if these effects are neglected. The often-used
argument that a star cluster is barely older than its relaxation time and
therefore cannot be dynamically evolved is clearly in error for the majority of
star clusters.

Comment: reduced abstract, 33 pages (three separate color .jpg figures),
submitted to MNRAS
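The claim that stellar mass loss and evaporation jointly accelerate dissolution can be illustrated with a toy mass-budget model; the rates and the 5% dissolution threshold below are invented for illustration and are not fits to the simulations.

```python
def dissolution_time(m0, stellar_loss_rate, evap_rate, dt=0.1):
    """Toy estimate: evolve the cluster mass under a constant fractional
    stellar mass-loss rate plus an evaporation rate, both proportional
    to the remaining mass, and return the time at which only 5% of the
    initial mass is left (taken here as 'dissolved')."""
    m, t = m0, 0.0
    while m > 0.05 * m0:
        m -= (stellar_loss_rate + evap_rate) * m * dt
        t += dt
    return t

# With both effects switched on, dissolution is much faster than with
# evaporation alone, which is the qualitative point made in the abstract.
t_both = dissolution_time(1.0, 0.02, 0.02)
t_evap_only = dissolution_time(1.0, 0.0, 0.02)
```

Because the two fractional rates add in the exponent of the decay, the combined timescale is roughly half that of either effect acting alone at these (arbitrary) equal rates.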
SlowFuzz: Automated Domain-Independent Detection of Algorithmic Complexity Vulnerabilities
Algorithmic complexity vulnerabilities occur when the worst-case time/space
complexity of an application is significantly higher than the respective
average case for particular user-controlled inputs. When such conditions are
met, an attacker can launch Denial-of-Service attacks against a vulnerable
application by providing inputs that trigger the worst-case behavior. Such
attacks have been known to have serious effects on production systems, take
down entire websites, or lead to bypasses of Web Application Firewalls.
Unfortunately, existing detection mechanisms for algorithmic complexity
vulnerabilities are domain-specific and often require significant manual
effort. In this paper, we design, implement, and evaluate SlowFuzz, a
domain-independent framework for automatically finding algorithmic complexity
vulnerabilities. SlowFuzz automatically finds inputs that trigger worst-case
algorithmic behavior in the tested binary. SlowFuzz uses resource-usage-guided
evolutionary search techniques to automatically find inputs that maximize
computational resource utilization for a given application.

Comment: ACM CCS '17, October 30-November 3, 2017, Dallas, TX, USA
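The resource-usage-guided evolutionary search can be sketched on a toy target. The real SlowFuzz mutates raw byte inputs and uses instruction counts from instrumented binaries; this sketch instead counts comparisons in an insertion sort and hill-climbs toward inputs that maximize them.

```python
import random

def insertion_sort_steps(a):
    """Instrumented toy target: returns the number of comparisons made,
    serving as the 'resource usage' signal the search maximizes."""
    a = list(a)
    steps = 0
    for i in range(1, len(a)):
        j = i
        while j > 0 and a[j - 1] > a[j]:
            a[j - 1], a[j] = a[j], a[j - 1]
            j -= 1
            steps += 1
        steps += 1
    return steps

def evolve_slow_input(n=20, generations=300, seed=0):
    """Minimal resource-usage-guided evolutionary search: mutate the
    current best input and keep the mutant whenever its resource usage
    does not decrease. Parameters are illustrative only."""
    rng = random.Random(seed)
    best = [rng.randrange(n) for _ in range(n)]
    best_cost = insertion_sort_steps(best)
    for _ in range(generations):
        cand = list(best)
        i, j = rng.randrange(n), rng.randrange(n)
        cand[i], cand[j] = cand[j], cand[i]  # mutation: swap two elements
        cost = insertion_sort_steps(cand)
        if cost >= best_cost:                # keep if usage grows (or ties)
            best, best_cost = cand, cost
    return best, best_cost

best_input, cost = evolve_slow_input()
```

For insertion sort, the search drifts toward near-reverse-sorted inputs, the quadratic worst case; against a real application the same loop would surface inputs approaching the worst-case complexity the attacker needs.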
Evolution of Prehension Ability in an Anthropomorphic Neurorobotic Arm
In this paper we show how a simulated anthropomorphic robotic arm controlled by an artificial neural network can develop effective reaching and grasping behaviour through a trial-and-error process. The free parameters encode the control rules that regulate the fine-grained interaction between the robot and the environment, and variations of these parameters are retained or discarded on the basis of their effects on the global behaviour exhibited by the robot situated in the environment. The results demonstrate how the proposed methodology allows the robot to produce effective behaviours thanks to its ability to exploit the morphological properties of its body (i.e. its anthropomorphic shape, the elastic properties of its muscle-like actuators, and the compliance of its actuated joints) and the properties that arise from the physical interaction between the robot and the environment, mediated by appropriate control rules.
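The retain-or-discard parameter search can be sketched as a (1+1)-style hill climber on a planar two-link arm. The kinematics, target, and fitness below are simplified stand-ins for the paper's full sensorimotor simulation, and all constants are invented for illustration.

```python
import math
import random

def forward_kinematics(theta1, theta2, l1=1.0, l2=1.0):
    """Planar two-link arm: joint angles -> end-effector position."""
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y

def fitness(params, target=(1.2, 0.8)):
    """Global-behaviour score: negative distance of the hand from a
    reach target. In the paper the evaluation is a full interaction
    between robot and environment; here it collapses to reach accuracy."""
    x, y = forward_kinematics(params[0], params[1])
    return -math.hypot(x - target[0], y - target[1])

def evolve(generations=2000, sigma=0.1, seed=1):
    """Trial and error: perturb the free parameters, then retain the
    variation only if the behaviour of the whole system does not get
    worse -- discarding it otherwise."""
    rng = random.Random(seed)
    params = [0.0, 0.0]
    best = fitness(params)
    for _ in range(generations):
        cand = [p + rng.gauss(0.0, sigma) for p in params]
        f = fitness(cand)
        if f >= best:
            params, best = cand, f
    return params, best

params, best = evolve()
```

The key property mirrored here is that selection acts only on the global outcome: nothing inspects the control rules themselves, so exploitable properties of the body and environment are discovered implicitly.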