Time resolution below 100 ps for the SciTil detector of PANDA employing SiPM
The barrel time-of-flight (TOF) detector for the PANDA experiment at FAIR in
Darmstadt is planned as a scintillator tile hodoscope (SciTil) using 8000 small
scintillator tiles. It will provide fast event timing for a software trigger in
the otherwise trigger-less data acquisition scheme of PANDA, relative timing in
a multiple-track event topology, as well as additional particle identification
in the low-momentum region. The goal is to achieve a time resolution of sigma ~
100 ps. We have conducted measurements using organic scintillators coupled to
Silicon Photomultipliers (SiPM). The results are encouraging, and we are
confident that the required time resolution can be reached.
Comment: 10 pages, 7 figures
Observing biogeochemical cycles at global scales with profiling floats and gliders: prospects for a global array
Chemical and biological sensor technologies have advanced rapidly in the past five years. Sensors that require low power and operate for multiple years are now available for oxygen, nitrate, and a variety of bio-optical properties that serve as proxies for important components of the carbon cycle (e.g., particulate organic carbon). These sensors have all been deployed successfully for long periods, in some cases more than three years, on platforms such as profiling floats or gliders. Technologies for pH, pCO2, and particulate inorganic carbon are maturing rapidly as well. These sensors could serve as the enabling technology for a global biogeochemical observing system that might operate on a scale comparable to the current Argo array. Here, we review the scientific motivation and the prospects for a global observing system for ocean biogeochemistry.
General duality for abelian-group-valued statistical-mechanics models
We introduce a general class of statistical-mechanics models, taking values
in an abelian group, which includes examples of both spin and gauge models,
both ordered and disordered. The model is described by a set of ``variables''
and a set of ``interactions''. A Gibbs factor is associated to each variable
and to each interaction. We introduce a duality transformation for systems in
this class. The duality exchanges the abelian group with its dual, the Gibbs
factors with their Fourier transforms, and the interactions with the variables.
High (low) couplings in the interaction terms are mapped into low (high)
couplings in the one-body terms. The idea is that our class of systems extends
the one for which the classical procedure à la Kramers and Wannier holds, so as
to include randomness in the pattern of interactions. We introduce and study
some physical examples: a random Gaussian Model, a random Potts-like model, and
a random variant of discrete scalar QED. We briefly describe the consequences
of duality for each example.
Comment: 26 pages, 2 Postscript figures
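As a concrete instance of such a duality (the simplest special case, not the paper's general abelian-group construction), the classical Kramers-Wannier relation for the square-lattice Ising model maps a coupling K to a dual coupling K* via tanh(K*) = exp(-2K); high couplings map to low ones, the map is an involution, and its fixed point K_c = ln(1 + sqrt(2))/2 locates the phase transition. A minimal numerical check:

```python
import math

def kw_dual(K: float) -> float:
    """Kramers-Wannier dual coupling K*, defined by tanh(K*) = exp(-2K)."""
    return math.atanh(math.exp(-2.0 * K))

# The duality is an involution: applying it twice returns the original coupling.
K = 0.7
assert abs(kw_dual(kw_dual(K)) - K) < 1e-9

# The self-dual (critical) point: K_c = ln(1 + sqrt(2)) / 2 ~ 0.4407.
Kc = 0.5 * math.log(1.0 + math.sqrt(2.0))
assert abs(kw_dual(Kc) - Kc) < 1e-9

print(f"K = {K} maps to K* = {kw_dual(K):.4f}; self-dual point Kc = {Kc:.4f}")
```

Note how a strong coupling (K = 0.7, ordered phase) maps to a weak dual coupling, mirroring the high/low exchange between interaction and one-body terms described in the abstract.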
On the Second Law of thermodynamics and the piston problem
The piston problem is investigated in the case where the length of the
cylinder is infinite (on both sides) and the ratio m/M is a very small
parameter, where m is the mass of one particle of the gas and M is the mass
of the piston. Introducing initial conditions such that the stochastic motion
of the piston remains in the average at the origin (no drift), it is shown that
the time evolution of the fluids, analytically derived from Liouville equation,
agrees with the Second Law of thermodynamics.
We thus have a non-equilibrium microscopic model whose evolution can be
explicitly shown to obey the two laws of thermodynamics.
Comment: 29 pages, 9 figures; submitted to Journal of Statistical Physics
(2003)
Harmonizing Software Standards with a Semantic Model
The application of standards in the software development process supports interoperability between systems. Maintenance of standards must be guaranteed on the organisational and technical level. The use of semantic technologies can contribute to the standard maintenance process by providing a harmonizing bridge between standards of different knowledge domains and languages, and by providing a single point of administration for standard domain concepts. This paper describes a case study of the creation of a semantic layer between software standards for water management systems in The Netherlands.
Phase Rotation, Cooling And Acceleration Of Muon Beams: A Comparison Of Different Approaches
Experimental and theoretical activities are underway at CERN with the aim of
examining the feasibility of a very-high-flux neutrino source. In the present
scheme, a high-power proton beam (some 4 MW) bombards a target where pions are
produced. The pions are collected and decay to muons under controlled optical
conditions. The muons are cooled and accelerated to a final energy of 50 GeV
before being injected into a decay ring where they decay under well-defined
conditions of energy and emittance.
We present the most challenging parts of the whole scenario, the muon
capture, the ionisation-cooling and the first stage of the muon acceleration.
Different schemes, their performance and the technical challenges are compared.
Comment: LINAC 2000 Conference, paper ID No. THC1
Expert chess memory: Revisiting the chunking hypothesis
After reviewing the relevant theory on chess expertise, this paper re-examines experimentally the finding of Chase and Simon (1973a) that the differences in ability of chess players at different skill levels to copy and to recall positions are attributable to the experts' storage of thousands of chunks (patterned clusters of pieces) in long-term memory. Despite important differences in the experimental apparatus, the data of the present experiments regarding latencies and chess relations between successively placed pieces are highly correlated with those of Chase and Simon. We conclude that the 2-second inter-chunk interval used to define chunk boundaries is robust, and that chunks have psychological reality. We discuss the possible reasons why Masters in our new study used substantially larger chunks than the Master of the 1973 study, and extend the chunking theory to take account of the evidence for large retrieval structures (templates) in long-term memory.
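The 2-second inter-chunk criterion above amounts to a simple segmentation rule: a pause longer than 2 seconds before placing a piece marks a chunk boundary. The sketch below is an illustrative reconstruction of that rule (not the authors' code; the piece labels and latencies are made up for the example):

```python
def segment_chunks(pieces, latencies, threshold=2.0):
    """Split a placement sequence into chunks at long inter-piece pauses.

    pieces    : placed items in order (e.g. piece-square labels)
    latencies : latencies[i] is the pause in seconds before placing pieces[i];
                latencies[0] is ignored (start of recall)
    A pause longer than `threshold` seconds starts a new chunk.
    """
    chunks, current = [], [pieces[0]]
    for piece, pause in zip(pieces[1:], latencies[1:]):
        if pause > threshold:
            chunks.append(current)   # close the current chunk
            current = [piece]        # start a new one
        else:
            current.append(piece)
    chunks.append(current)
    return chunks

# Hypothetical recall of six pieces with two pauses above 2 s:
placements = ["Ke1", "Qd1", "Ra1", "Nf3", "Bg2", "Kg8"]
pauses     = [0.0,   0.8,   1.1,   3.5,   0.9,   2.6]
print(segment_chunks(placements, pauses))
# [['Ke1', 'Qd1', 'Ra1'], ['Nf3', 'Bg2'], ['Kg8']]
```

Under this rule, chunk size is simply the number of pieces placed between long pauses, which is the quantity compared between the 1973 Master and the Masters of the new study.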
Mod/Resc Parsimony Inference
We address in this paper a new computational biology problem that aims at
understanding a mechanism that could potentially be used to genetically
manipulate natural insect populations infected by inherited, intra-cellular
parasitic bacteria. In this problem, which we denote by \textsc{Mod/Resc
Parsimony Inference}, we are given a boolean matrix and the goal is to find two
other boolean matrices with a minimum number of columns such that an
appropriately defined operation on these matrices gives back the input. We show
that this is formally equivalent to the \textsc{Bipartite Biclique Edge Cover}
problem and derive some complexity results for our problem using this
equivalence. We provide a new fixed-parameter tractability approach for
solving both problems, which slightly improves upon a previously published
algorithm for \textsc{Bipartite Biclique Edge Cover}. Finally, we present
experimental results where we applied some of our techniques to a real-life
data set.
Comment: 11 pages, 3 figures
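The stated equivalence can be made concrete if the "appropriately defined operation" is read as the boolean matrix product (an assumption for illustration; the paper's exact operation may differ): decomposing a boolean matrix M into A (n x k) and B (k x m) with M = A o B corresponds to covering the 1-entries of M, viewed as edges of a bipartite graph, with k bicliques, where factor t is the all-ones submatrix on rows {i : A[i][t] = 1} and columns {j : B[t][j] = 1}. A small sketch:

```python
def bool_product(A, B):
    """Boolean matrix product: (A o B)[i][j] = OR over t of (A[i][t] AND B[t][j])."""
    n, k, m = len(A), len(B), len(B[0])
    return [[int(any(A[i][t] and B[t][j] for t in range(k)))
             for j in range(m)] for i in range(n)]

# Input matrix M: adjacency of a small bipartite graph (rows x columns).
M = [[1, 1, 0],
     [1, 1, 1],
     [0, 0, 1]]

# A decomposition with k = 2 factors; each factor is one biclique of the cover.
A = [[1, 0],
     [1, 1],
     [0, 1]]
B = [[1, 1, 0],
     [0, 0, 1]]

assert bool_product(A, B) == M  # the two bicliques cover exactly the edges of M

for t in range(2):
    rows = [i for i in range(3) if A[i][t]]
    cols = [j for j in range(3) if B[t][j]]
    print(f"biclique {t}: rows {rows} x cols {cols}")
```

Minimizing the shared dimension k of A and B is then exactly minimizing the number of bicliques needed to cover every edge, which is the \textsc{Bipartite Biclique Edge Cover} problem.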