High Energy Cosmic Neutrinos
While the general principles of high-energy neutrino detection have been
understood for many years, the deep, remote geographical locations of suitable
detector sites have challenged the ingenuity of experimentalists, who have
confronted unusual deployment, calibration, and robustness issues. Two high
energy neutrino programs are now operating (Baikal and AMANDA), with the
expectation of ushering in an era of multi-messenger astronomy, and two
Mediterranean programs have made impressive progress. The detectors are
optimized to detect neutrinos with energies of the order of 1-10 TeV, although
they are capable of detecting neutrinos with energies of tens of MeV to greater
than PeV. This paper outlines the interdisciplinary scientific agenda, which
span the fields of astronomy, particle physics, and cosmic ray physics, and
describes ongoing worldwide experimental programs to realize these goals.Comment: 15 pages, 9 figures, talk presented at the Nobel Symposium on
Particle Physics and the Universe, Sweden, August 199
Performance and analysis of feature tracking approaches in laser speckle instrumentation
This paper investigates the application of feature tracking algorithms as an alternative data processing method for laser speckle instrumentation. The approach is capable of determining both the speckle pattern translation and rotation and can therefore be used to detect the in-plane rotation and translation of an object simultaneously. A performance assessment of widely used feature detection and matching algorithms from the computer vision field, for both translation and rotation measurements from laser speckle patterns, is presented. The accuracy of translation measurements using the feature tracking approach was found to be similar to that of correlation-based processing, with accuracies of 0.025–0.04 pixels and a typical precision of 0.02–0.09 pixels depending upon the method and image size used. The performance for in-plane rotation measurements is also presented: rotation measurement accuracies of <0.01° were found to be achievable over an angle range of ±10°, and of <0.1° over a range of ±25°, with a typical precision between 0.02° and 0.08° depending upon method and image size. The measurement range is found to be limited by the failure to match sufficient speckles at larger rotation angles. An analysis of each stage of the process was conducted to identify the most suitable approaches for use with laser speckle images and the areas requiring further improvement. A quantitative approach to assessing different feature tracking methods is described, and reference data sets of experimentally translated and rotated speckle patterns from a range of surface finishes and roughnesses are presented.
As a result, three failure modes of the matching process are identified for future investigation: the inability to detect the same features in partially decorrelated images, leading to unmatchable features; the variance of computed feature orientation between frames, leading to different descriptors being calculated for the same feature; and the failure of the matching process itself due to the inability to discriminate between different features in speckle images.
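Once features have been matched between two speckle images, the in-plane translation and rotation are recovered by fitting a rigid transform to the matched coordinate pairs. The sketch below shows only that final estimation step, as a least-squares (SVD-based) rigid fit on synthetic matched points; the point coordinates, the 5° rotation, and the translation vector are illustrative assumptions, not data from the paper.

```python
import numpy as np

def estimate_rigid_transform(src, dst):
    """Least-squares rigid fit (rotation R, translation t) so that dst ≈ src @ R.T + t.

    src, dst: (N, 2) arrays of matched 2-D feature coordinates.
    """
    # Centre both point sets, then solve the orthogonal Procrustes problem via SVD.
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))        # guard against a reflection solution
    R = Vt.T @ np.diag([1.0, d]) @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t

# Hypothetical matched speckle features on a 512 x 512 image (noiseless for illustration).
rng = np.random.default_rng(0)
pts = rng.uniform(0, 512, size=(200, 2))
theta = np.deg2rad(5.0)                           # assumed in-plane rotation
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
moved = pts @ R_true.T + np.array([3.2, -1.7])    # assumed translation in pixels

R_est, t_est = estimate_rigid_transform(pts, moved)
angle = np.degrees(np.arctan2(R_est[1, 0], R_est[0, 0]))
```

With real speckle data the matched pairs would come from a detector/descriptor pipeline and contain outliers, so a robust wrapper (e.g. RANSAC) around this fit would be the natural next step.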
Beam-Induced Damage Mechanisms and their Calculation
The rapid interaction of highly energetic particle beams with matter induces
dynamic responses in the impacted component. If the beam pulse is sufficiently
intense, extreme conditions can be reached, such as very high pressures,
changes of material density, phase transitions, intense stress waves, material
fragmentation and explosions. Even at lower intensities and longer time-scales,
significant effects may be induced, such as vibrations, large oscillations, and
permanent deformation of the impacted components. These lectures provide an
introduction to the mechanisms that govern the thermomechanical phenomena
induced by the interaction between particle beams and solids and to the
analytical and numerical methods that are available for assessing the response
of impacted components. An overview of the design principles of such devices is
also provided, along with descriptions of material selection guidelines and the
experimental tests that are required to validate materials and components
exposed to interactions with energetic particle beams.
Comment: 69 pages, contribution to the 2014 Joint International Accelerator
School: Beam Loss and Accelerator Protection, Newport Beach, CA, USA, 5-14
Nov 201
Data-Driven Architecture to Increase Resilience In Multi-Agent Coordinated Missions
The rise in the use of Multi-Agent Systems (MASs) in unpredictable and changing environments has created the need for intelligent algorithms to increase their autonomy, safety and performance in the event of disturbances and threats. MASs are attractive for their flexibility, which also makes them prone to threats that may result from hardware failures (actuators, sensors, onboard computer, power source) and abnormal operational conditions (weather, GPS-denied location, cyber-attacks). This dissertation presents research on a bio-inspired approach for resilience augmentation in MASs in the presence of disturbances and threats such as communication-link and stealthy zero-dynamics attacks. An adaptive bio-inspired architecture is developed for distributed consensus algorithms to increase fault tolerance in a network of multiple high-order nonlinear systems under directed fixed topologies. In analogy with natural organisms' ability to recognize and remember specific pathogens to build their immunity, the immunity-based architecture consists of a Distributed Model-Reference Adaptive Control (DMRAC) with an Artificial Immune System (AIS) adaptation law integrated within a consensus protocol. Feedback linearization is used to transform the high-order nonlinear model into four decoupled linear subsystems. A stability proof of the adaptation law is conducted using Lyapunov methods and Jordan decomposition. The DMRAC is proven to be stable in the presence of external time-varying bounded disturbances, and the tracking error trajectories are shown to be bounded. The effectiveness of the proposed architecture is examined through numerical simulations. The proposed controller successfully ensures that consensus is achieved among all agents while the adaptive law simultaneously rejects the disturbances in each agent and its neighbors. The architecture also includes a health management system to detect faulty agents within the global network.
Further numerical simulations show that the Global Health Monitoring (GHM) system effectively detects faults within the network.
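The distributed consensus protocol that underpins the architecture above can be illustrated in its simplest form: each agent repeatedly moves its state toward the states it receives from its neighbors over a directed, fixed topology. The sketch below is a minimal discrete-time linear consensus iteration on an assumed four-agent directed cycle; it is not the dissertation's DMRAC/AIS law, and the graph, step size, and initial states are illustrative assumptions.

```python
import numpy as np

# Assumed directed, fixed topology: A[i, j] = 1 means agent i receives agent j's state.
# This particular A is a four-agent directed cycle (strongly connected).
A = np.array([[0, 1, 0, 0],
              [0, 0, 1, 0],
              [0, 0, 0, 1],
              [1, 0, 0, 0]], dtype=float)

eps = 0.3                               # step size; must satisfy eps < 1 / max in-degree
x = np.array([1.0, 4.0, -2.0, 7.0])     # illustrative initial agent states

for _ in range(200):
    # x_i <- x_i + eps * sum_j a_ij * (x_j - x_i)  for every agent simultaneously
    x = x + eps * (A @ x - A.sum(axis=1) * x)

spread = x.max() - x.min()              # disagreement across the network
```

On a strongly connected directed graph this iteration drives all states to a common value; the adaptive and immune layers described in the abstract are added on top of such a protocol to keep it converging under disturbances and faulty agents.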
The Dark Energy Survey Data Management System
The Dark Energy Survey collaboration will study cosmic acceleration with a
5000 deg² grizY survey in the southern sky over 525 nights from 2011-2016. The
DES data management (DESDM) system will be used to process and archive these
data and the resulting science ready data products. The DESDM system consists
of an integrated archive, a processing framework, an ensemble of astronomy
codes and a data access framework. We are developing the DESDM system for
operation in the high performance computing (HPC) environments at NCSA and
Fermilab. Operating the DESDM system in an HPC environment offers both speed
and flexibility. We will employ it for our regular nightly processing needs,
and for more compute-intensive tasks such as large scale image coaddition
campaigns, extraction of weak lensing shear from the full survey dataset, and
massive seasonal reprocessing of the DES data. Data products will be available
to the Collaboration and later to the public through a virtual-observatory
compatible web portal. Our approach leverages investments in publicly available
HPC systems, greatly reducing hardware and maintenance costs to the project,
which must deploy and maintain only the storage, database platforms and
orchestration and web portal nodes that are specific to DESDM. In Fall 2007, we
tested the current DESDM system on both simulated and real survey data. We used
Teragrid to process 10 simulated DES nights (3TB of raw data), ingesting and
calibrating approximately 250 million objects into the DES Archive database. We
also used DESDM to process and calibrate over 50 nights of survey data acquired
with the Mosaic2 camera. Comparison to truth tables in the case of the
simulated data and internal crosschecks in the case of the real data indicate
that astrometric and photometric data quality is excellent.
Comment: To be published in the proceedings of the SPIE conference on
Astronomical Instrumentation (held in Marseille in June 2008). This preprint
is made available with the permission of SPIE. Further information together
with preprint containing full quality images is available at
http://desweb.cosmology.uiuc.edu/wik