278 research outputs found
A new integration algorithm for ordinary differential equations based on continued fraction approximations
A new integration algorithm is presented, and an implementation is compared with other programmed algorithms. The new algorithm is a step-by-step procedure for solving the initial value problem in ordinary differential equations. It is designed to approximate poles of small integer order in the solutions of the differential equations by continued fractions obtained by manipulating the sums of truncated Taylor series expansions. The new method is compared with the Gragg-Bulirsch-Stoer method and the Taylor series method. The Taylor series method and the new method are shown to be superior in speed and accuracy, with the new method performing best when the solution is required near a singularity. The new method can also pass automatically through singularities where all the other methods discussed fail.
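The paper's exact rearrangement is not reproduced in this abstract; a minimal sketch of the underlying idea, rearranging a truncated Taylor sum into a low-order continued fraction so that a simple pole is represented rather than merely approximated polynomially, might look like the following (function names and the test function 1/(1-x) are illustrative):

```python
def truncated_taylor(c, x):
    """Plain truncated Taylor sum: c[0] + c[1]*x + c[2]*x**2 + ..."""
    return sum(ck * x**k for k, ck in enumerate(c))

def cf_approximant(c, x):
    """Continued-fraction rearrangement of the first three Taylor
    coefficients: c0 + c1*x / (1 - (c2/c1)*x).

    Unlike the polynomial sum, this form has a pole at x = c1/c2,
    so it can track a nearby simple pole of the solution."""
    c0, c1, c2 = c[0], c[1], c[2]
    return c0 + c1 * x / (1.0 - (c2 / c1) * x)

# For f(x) = 1/(1 - x), every Taylor coefficient is 1; the continued
# fraction reproduces f exactly, while the truncated sum fails near x = 1.
coeffs = [1.0, 1.0, 1.0]
```

Near the singularity at x = 1 the three-term Taylor sum is off by orders of magnitude, while the continued-fraction form recovers the exact value, which is the behaviour the abstract claims for the new method.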
Database independent Migration of Objects into an Object-Relational Database
This paper reports on the CERN-based WISDOM project, which is studying the
serialisation and deserialisation of data to/from an object database
(Objectivity) and Oracle 9i.
Comment: 26 pages, 18 figures; CMS CERN Conference Report cr02_01
Design Patterns for Description-Driven Systems
In data modelling, product information has most often been handled separately
from process information. The integration of product and process models in a
unified data model could provide the means by which information could be shared
across an enterprise throughout the system lifecycle from design through to
production. Recently attempts have been made to integrate these two separate
views of systems through identifying common data models. This paper relates
description-driven systems to multi-layer architectures and reveals where
existing design patterns facilitate the integration of product and process
models and where patterns are missing or where existing patterns require
enrichment for this integration. It reports on the construction of a so-called
description-driven system which integrates Product Data Management (PDM) and
Workflow Management (WfM) data models through a common meta-model.
Comment: 14 pages, 13 figures. Presented at the 3rd Enterprise Distributed
Object Computing (EDOC'99) conference, Mannheim, Germany, September 199
Mobile Computing in Physics Analysis - An Indicator for eScience
This paper presents the design and implementation of a Grid-enabled physics
analysis environment for handheld and other resource-limited computing devices
as one example of the use of mobile devices in eScience. Handheld devices offer
great potential because they provide ubiquitous access to data and
round-the-clock connectivity over wireless links. Our solution aims to provide
users of handheld devices the capability to launch heavy computational tasks on
computational and data Grids, monitor the jobs status during execution, and
retrieve results after job completion. Users carry their jobs on their handheld
devices in the form of executables (and associated libraries). Users can
transparently view the status of their jobs and get back their outputs without
having to know where they are being executed. In this way, our system is able
to act as a high-throughput computing environment where devices ranging from
powerful desktop machines to small handhelds can employ the power of the Grid.
The results shown in this paper are readily applicable to the wider eScience
community.
Comment: 8 pages, 7 figures. Presented at the 3rd Int Conf on Mobile Computing
& Ubiquitous Networking (ICMU06), London, October 200
Pion form factor in the Kroll-Lee-Zumino model
The renormalizable Abelian quantum field theory model of Kroll, Lee, and
Zumino is used to compute the one-loop vertex corrections to the tree-level,
Vector Meson Dominance (VMD) pion form factor. These corrections, together with
the known one-loop vacuum polarization contribution, lead to a substantial
improvement over VMD. The resulting pion form factor in the space-like region
is in excellent agreement with data in the whole range of accessible momentum
transfers. The time-like form factor, known to reproduce the Gounaris-Sakurai
formula at and near the rho-meson peak, is unaffected by the vertex correction
at order g_{\rho\pi\pi}^2.
Comment: Revised version corrects a misprint in Eq.(1
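For context, the tree-level VMD form factor that the one-loop terms correct is the standard rho-dominance monopole (a textbook expression, not taken from this paper's equations):

```latex
% Tree-level (uncorrected) Vector Meson Dominance pion form factor,
% a single rho-meson pole normalized to F(0) = 1:
F_\pi^{\mathrm{VMD}}(q^2) = \frac{M_\rho^2}{M_\rho^2 - q^2}
```

The abstract's claim is that the Kroll-Lee-Zumino vertex and vacuum-polarization corrections substantially improve on this simple monopole in the space-like region q^2 < 0.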
Status and perspective of detector databases in the CMS experiment at the LHC
This note gives a high-level conceptual overview of the various databases that capture information concerning the CMS detector. The detector domain has been split into four partly overlapping parts that cover phases in the detector life cycle: construction, integration, configuration and condition, plus a geometry part that is common to all phases. The discussion addresses the specific content and usage of each part, as well as further requirements, dependencies and interfaces.
DIANA Scheduling Hierarchies for Optimizing Bulk Job Scheduling
The use of meta-schedulers for resource management in large-scale distributed
systems often leads to a hierarchy of schedulers. In this paper, we discuss why
existing meta-scheduling hierarchies are sometimes not sufficient for Grid
systems due to their inability to re-organise jobs already scheduled locally.
Such a job re-organisation is required to adapt to evolving loads which are
common in heavily used Grid infrastructures. We propose a peer-to-peer
scheduling model and evaluate it using case studies and mathematical modelling.
We detail the DIANA (Data Intensive and Network Aware) scheduling algorithm and
its queue management system for coping with the load distribution and for
supporting bulk job scheduling. We demonstrate that such a system is beneficial
for dynamic, distributed and self-organizing resource management and can assist
in optimizing load or job distribution in complex Grid infrastructures.
Comment: 8 pages, 9 figures. Presented at the 2nd IEEE Int Conference on
eScience & Grid Computing, Amsterdam, Netherlands, December 200
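The DIANA algorithm itself is not specified in this abstract; the following is only a toy sketch of the capability the abstract highlights, a site-level queue that can re-prioritise jobs already scheduled locally as load evolves (the cost model, class, and method names are all illustrative, not DIANA's):

```python
import heapq

class SiteQueue:
    """Toy per-site scheduling queue. Jobs are prioritised by a combined
    data/network/compute cost, and already-queued jobs can be re-weighted
    when the site's load changes -- the re-organisation that the paper
    argues conventional meta-scheduling hierarchies lack."""

    def __init__(self):
        self._heap = []      # (priority, tiebreak, job_id); lower runs first
        self._counter = 0    # insertion counter breaks priority ties fairly

    def submit(self, job_id, data_cost, network_cost, compute_cost):
        priority = data_cost + network_cost + compute_cost
        heapq.heappush(self._heap, (priority, self._counter, job_id))
        self._counter += 1

    def reorganise(self, load_factor):
        # Re-weight every queued job in place when load changes
        jobs = [(p * load_factor, c, j) for p, c, j in self._heap]
        heapq.heapify(jobs)
        self._heap = jobs

    def next_job(self):
        return heapq.heappop(self._heap)[2] if self._heap else None
```

A real data-intensive scheduler would of course derive the cost terms from file locations, transfer bandwidth, and queue depth rather than take them as arguments.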
Object Serialization and Deserialization Using XML
Interoperability of potentially heterogeneous databases has been an ongoing
research issue for a number of years in the database community. With the trend
towards globalization of data location and data access and the consequent
requirement for the coexistence of new data stores with legacy systems, the
cooperation and data interchange between data repositories has become
increasingly important. The emergence of the eXtensible Markup Language (XML)
as a database independent representation for data offers a suitable mechanism
for transporting data between repositories. This paper describes a research
activity within the CMS group at CERN towards identifying and
implementing database serialization and deserialization methods that can be
used to replicate or migrate objects across the network between CERN and
worldwide centres, using XML to serialize the contents of multiple objects
resident in object-oriented databases.
Comment: 14 pages, 7 figures
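The paper's actual serialization scheme is not given in the abstract; a minimal sketch of the general technique, flattening an object's fields into typed XML elements and reconstructing the object from them, might look like this (the `Event` class and its fields are hypothetical, not the CMS object model):

```python
import xml.etree.ElementTree as ET

class Event:
    """Toy persistent object standing in for a database-resident object."""
    def __init__(self, run, energy):
        self.run = run
        self.energy = energy

def to_xml(obj):
    # Serialise each attribute as a child element carrying a type hint,
    # so the receiving side can rebuild the original field types
    root = ET.Element(type(obj).__name__)
    for name, value in vars(obj).items():
        child = ET.SubElement(root, name, attrib={"type": type(value).__name__})
        child.text = str(value)
    return ET.tostring(root, encoding="unicode")

def from_xml(text, cls):
    # Deserialise: cast each element's text back to its declared type
    root = ET.fromstring(text)
    casts = {"int": int, "float": float, "str": str}
    fields = {el.tag: casts[el.get("type")](el.text) for el in root}
    return cls(**fields)
```

Because the XML carries only element names, type hints, and text, it is independent of whether the endpoint is an object database or a relational store, which is the database-independence argument the abstract makes for XML.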
Surface characterization of p-type point contact germanium detectors
P-type point contact (PPC) germanium detectors are used in rare event and
low-background searches, including neutrinoless double beta (0vbb) decay,
low-energy nuclear recoils, and coherent elastic neutrino-nucleus scattering.
The detectors feature an excellent energy resolution, low detection thresholds
down to the sub-keV range, and enhanced background rejection capabilities.
However, due to their large passivated surface, separating the signal readout
contact from the bias voltage electrode, PPC detectors are susceptible to
surface effects such as charge build-up. A profound understanding of their
response to surface events is essential. In this work, the response of a PPC
detector to alpha and beta particles hitting the passivated surface was
investigated in a multi-purpose scanning test stand. It is shown that the
passivated surface can accumulate charges resulting in a radial-dependent
degradation of the observed event energy. In addition, it is demonstrated that
the pulse shapes of surface alpha events show characteristic features that can
be used to discriminate against these events.
Results from 730 kg days of the CRESST-II Dark Matter Search
The CRESST-II cryogenic Dark Matter search, aiming at detection of WIMPs via
elastic scattering off nuclei in CaWO4 crystals, completed 730 kg days of
data taking in 2011. We present the data collected with eight detector modules,
each with a two-channel readout; one for a phonon signal and the other for
coincidently produced scintillation light. The former provides a precise
measure of the energy deposited by an interaction, and the ratio of
scintillation light to deposited energy can be used to discriminate different
types of interacting particles and thus to distinguish possible signal events
from the dominant backgrounds. Sixty-seven events are found in the acceptance
region where a WIMP signal in the form of low energy nuclear recoils would be
expected. We estimate background contributions to this observation from four
sources: 1) "leakage" from the e/\gamma-band, 2) "leakage" from the
\alpha-particle band, 3) neutrons, and 4) Pb-206 recoils from Po-210 decay. Using
a maximum likelihood analysis, we find, at a high statistical significance,
that these sources alone are not sufficient to explain the data. The addition
of a signal due to scattering of relatively light WIMPs could account for this
discrepancy, and we determine the associated WIMP parameters.
Comment: 17 pages, 13 figures
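The actual analysis fits multiple background components across bands; as a much-reduced sketch of the logic, a single-bin counting version of the maximum-likelihood comparison (background-only versus background-plus-signal) can be written as follows (the numbers in the usage note are illustrative, not CRESST's fitted backgrounds):

```python
import math

def poisson_logl(n, mu):
    """Log of the Poisson probability of observing n counts given
    expectation mu, up to the n! constant (which cancels in ratios)."""
    return n * math.log(mu) - mu

def best_signal(n, b):
    """In a single counting bin with fixed background expectation b,
    the maximum-likelihood signal estimate is simply max(n - b, 0)."""
    return max(float(n) - b, 0.0)
```

For example, with 67 observed events and a background expectation well below 67, the log-likelihood at mu = background + best-fit signal exceeds the background-only value, which is the single-bin analogue of "these sources alone are not sufficient to explain the data."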