Scintillation and charge extraction from the tracks of energetic electrons in superfluid helium-4
An energetic electron passing through liquid helium causes ionization along
its track. The ionized electrons quickly recombine with the resulting positive
ions, which leads to the production of prompt scintillation light. By applying
appropriate electric fields, some of the ionized electrons can be separated
from their parent ions. The fraction of the ionized electrons extracted in a
given applied field depends on the separation distance between the electrons
and the ions. We report the determination of the mean electron-ion separation
distance for charge pairs produced along the tracks of beta particles in
superfluid helium at 1.5 K by studying the quenching of the scintillation light
under applied electric fields. Knowledge of this mean separation parameter will
aid in the design of particle detectors that use superfluid helium as a target
material.

Comment: 10 pages, 8 figures
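A quick order-of-magnitude check of why recombination dominates at zero field: the Onsager radius (the separation at which the electron-ion Coulomb attraction equals the thermal energy) in liquid helium at 1.5 K is on the order of ten microns, far larger than the separations produced along ionization tracks. A minimal sketch, assuming a relative permittivity of about 1.057 for liquid helium:

```python
from math import pi

# Physical constants (SI units)
e = 1.602176634e-19      # elementary charge, C
eps0 = 8.8541878128e-12  # vacuum permittivity, F/m
kB = 1.380649e-23        # Boltzmann constant, J/K

def onsager_radius(temperature_K, eps_r):
    """Distance at which the electron-ion Coulomb attraction equals
    the thermal energy k_B*T; pairs separated by much less than this
    are overwhelmingly likely to recombine at zero applied field."""
    return e**2 / (4 * pi * eps0 * eps_r * kB * temperature_K)

# Liquid helium at 1.5 K; eps_r ~ 1.057 is an assumed literature value.
r_c = onsager_radius(1.5, 1.057)
print(f"Onsager radius: {r_c * 1e6:.1f} um")  # ~10.5 um
# This vastly exceeds the tens-of-nm scale of track separations,
# which is why strong recombination (prompt scintillation) dominates
# and substantial electric fields are needed to extract charge.
```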
Determination of the high-twist contribution to the structure function
We extract the high-twist contribution to the neutrino-nucleon structure
function from the analysis of the data collected by
the IHEP-JINR Neutrino Detector in the runs with the focused neutrino beams at
the IHEP 70 GeV proton synchrotron. The analysis is performed within the
infrared renormalon (IRR) model of high twists in order to extract the
normalization parameter of the model. From the NLO QCD fit to our data we
obtain the value of the IRR model normalization parameter; a similar fit to
the CCFR data yields a second determination, and we also quote the average of
the two results.

Comment: preprint IHEP-01-18, 7 pages, LaTeX, 1 figure (EPS)
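Since the IRR model fixes the shape of the twist-4 term up to an overall normalization, the fit reduces to extracting one linear parameter. A toy sketch of such an extraction (the structure-function and twist-4 shapes below are invented placeholders, not the NLO QCD or infrared-renormalon expressions, and no real data are used):

```python
import numpy as np

# Toy leading-twist structure function and an assumed twist-4 shape.
# Both functional forms are illustrative placeholders only.
def F_LT(x, Q2):
    return x**0.5 * (1 - x)**3 * (1 + 0.1 * np.log(Q2))

def h_shape(x):
    return x * (1 - x)   # assumed x-dependence of the twist-4 term

# Model: F(x,Q2) = F_LT(x,Q2) * (1 + A * h(x)/Q2), A = normalization.
rng = np.random.default_rng(0)
x = rng.uniform(0.05, 0.8, 200)
Q2 = rng.uniform(2.0, 30.0, 200)   # GeV^2
A_true = 0.35                      # GeV^2, toy value
data = F_LT(x, Q2) * (1 + A_true * h_shape(x) / Q2)

# A enters linearly, so the chi^2 minimum has a closed form:
# minimize sum (data - F_LT - A*d)^2 with d = F_LT * h / Q2.
d = F_LT(x, Q2) * h_shape(x) / Q2
A_fit = np.sum((data - F_LT(x, Q2)) * d) / np.sum(d**2)
print(f"fitted normalization A = {A_fit:.3f}")  # recovers 0.350
```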
A Bayesian Nonparametric Regression Model With Normalized Weights - A Study of Hippocampal Atrophy in Alzheimer’s Disease
Hippocampal volume is one of the best established biomarkers for Alzheimer’s disease. However, for appropriate use in clinical trials research, the evolution of hippocampal volume needs to be well understood. Recent theoretical models propose a sigmoidal pattern for its evolution. To support this theory, the use of Bayesian nonparametric regression mixture models seems particularly suitable due to the flexibility that models of this type can achieve and the unsatisfactory predictive properties of semiparametric methods. In this article, our aim is to develop an interpretable Bayesian nonparametric regression model which allows inference with combinations of both continuous and discrete covariates, as required for a full analysis of the dataset. Simple arguments regarding the interpretation of Bayesian nonparametric regression mixtures lead naturally to regression weights based on normalized sums. Difficulty in working with the intractable normalizing constant is overcome thanks to recent advances in MCMC methods and the development of a novel auxiliary variable scheme. We apply the new model and MCMC method to study the dynamics of hippocampal volume, and our results provide statistical evidence in support of the theoretical hypothesis.
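The "normalized sums" idea can be sketched as a mixture regression whose weights are kernel terms divided by their sum, so they form a partition of unity at every covariate value. A minimal illustration, assuming Gaussian kernels and linear local means (the paper's actual specification and MCMC scheme are not reproduced here):

```python
import numpy as np

def predict(x, centers, scales, alphas, local_means):
    """Mixture regression with normalized weights:
    w_j(x) = alpha_j * K_j(x) / sum_k alpha_k * K_k(x),
    E[y|x] = sum_j w_j(x) * m_j(x)."""
    K = np.exp(-0.5 * ((x[:, None] - centers[None, :]) / scales) ** 2)
    unnorm = alphas * K                              # numerator terms
    w = unnorm / unnorm.sum(axis=1, keepdims=True)   # normalized weights
    m = np.array([m_j(x) for m_j in local_means]).T  # local means m_j(x)
    return w, (w * m).sum(axis=1)

# Two local regimes: early growth, late plateau (a toy stand-in for a
# sigmoidal trajectory; all numbers below are illustrative).
centers = np.array([0.0, 5.0])
scales = np.array([2.0, 2.0])
alphas = np.array([1.0, 1.0])
local_means = [lambda x: 1.0 + 0.5 * x, lambda x: 4.0 + 0.0 * x]
x = np.linspace(-2, 8, 50)
w, yhat = predict(x, centers, scales, alphas, local_means)
assert np.allclose(w.sum(axis=1), 1.0)  # weights are a partition of unity
print(yhat[0], yhat[-1])  # smooth blend between the two local regimes
```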
Control of Rayleigh-Taylor instability by vertical vibration in large aspect ratio containers
We consider a horizontal heavy fluid layer supported by a light, immiscible one in a wide (as compared to depth) container, which is vertically vibrated in order to counterbalance the Rayleigh-Taylor instability of the flat, rigid-body vibrating state. In the simplest case, when the density and viscosity of the lighter fluid are small compared to their counterparts in the heavier fluid, we apply a long-wave, weakly nonlinear analysis that yields a generalized Cahn-Hilliard equation for the evolution of the fluid interface. This equation shows that the stabilizing effect of vibration is like that of surface tension, and it is used to analyze the linear stability of the flat state, the local bifurcation at the instability threshold, and some global existence and stability properties concerning the steady states without dry spots. The analysis is extended to two cases of practical interest: (a) the viscosity of one of the fluids is much smaller than that of the other, and (b) the densities and viscosities of both fluids are quite close to each other.
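Since vibration enters the long-wave description like an added surface tension, the practical stability question in a finite container is whether the whole unstable band of wavenumbers falls below the smallest admissible wavenumber pi/L. A toy check with a generic dispersion relation sigma(k) = a k^2 - Gamma k^4 (the coefficients are illustrative placeholders, not those of the derived Cahn-Hilliard equation):

```python
import numpy as np

def growth_rate(k, a, Gamma):
    """Toy long-wave dispersion relation for the flat interface:
    a > 0 encodes the Rayleigh-Taylor drive (heavy fluid on top);
    Gamma lumps surface tension plus the vibration-induced
    effective tension."""
    return a * k**2 - Gamma * k**4

def flat_state_stable(a, Gamma, L):
    """In a container of width L the admissible wavenumbers start at
    k_min = pi/L. Since a*k^2 - Gamma*k^4 <= 0 at k_min implies the
    same for all larger k, checking k_min suffices."""
    k_min = np.pi / L
    return growth_rate(k_min, a, Gamma) <= 0

# Increasing vibration (larger effective Gamma) shrinks the unstable band:
a, L = 1.0, 10.0
print(flat_state_stable(a, Gamma=1.0, L=L))    # False: band too wide
print(flat_state_stable(a, Gamma=200.0, L=L))  # True: band below pi/L
```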
Governance of microfinance institutions (MFIs) in Cameroon: What lessons can we learn?
The aim of this paper is to assess the effects of the COBAC regulations governing the microfinance industry on the governance of microfinance institutions (MFIs) in Cameroon. The paper is based on 35 in-depth interviews carried out from May to June 2011 and from June to July 2012 with managers and accountants from MFIs in Cameroon, MFI clients and non-clients, regulatory authorities in the Ministry of Finance, and accounting professionals. The findings show that the regulations have undermined governance within Cameroonian MFIs, turning them into hybrid organizations whose managers strive to meet their shareholders' interests.
Towards Machine Wald
The past century has seen a steady increase in the need to estimate and
predict complex systems and to make (possibly critical) decisions with
limited information. Although computers have made possible the numerical
evaluation of sophisticated statistical models, these models are still designed
\emph{by humans} because there is currently no known recipe or algorithm for
dividing the design of a statistical model into a sequence of arithmetic
operations. Indeed, enabling computers to \emph{think} as \emph{humans} do when
faced with uncertainty is challenging in several major ways:
(1) finding optimal statistical models remains to be formulated as a well-posed
problem when information on the system of interest is incomplete and comes in
the form of a complex combination of sample data, partial knowledge of
constitutive relations, and a limited description of the distribution of input
random variables; (2) the space of admissible scenarios, along with the space of
relevant information, assumptions, and/or beliefs, tends to be infinite
dimensional, whereas calculus on a computer is necessarily discrete and finite.
To this end, this paper explores the foundations of a rigorous framework
for the scientific computation of optimal statistical estimators/models and
reviews their connections with Decision Theory, Machine Learning, Bayesian
Inference, Stochastic Optimization, Robust Optimization, Optimal Uncertainty
Quantification, and Information Based Complexity.

Comment: 37 pages
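The decision-theoretic framing behind "Machine Wald" can be illustrated in miniature: given a finite set of admissible scenarios and candidate estimators, the optimal (minimax) choice minimizes the worst-case risk. A sketch with an invented risk matrix:

```python
import numpy as np

# Rows: candidate decisions/estimators d; columns: admissible scenarios t.
# R[d, t] = risk of decision d if scenario t is the truth.
# All numbers are illustrative, not derived from any real model.
R = np.array([
    [0.2, 0.9, 0.4],   # decision 0: excellent for scenario 0, poor for 1
    [0.5, 0.5, 0.5],   # decision 1: hedged, uniform risk
    [0.8, 0.3, 0.6],   # decision 2
])

worst_case = R.max(axis=1)          # sup over scenarios for each decision
d_star = int(worst_case.argmin())   # Wald's minimax choice
print(d_star, worst_case[d_star])   # the hedged decision wins: 1, 0.5
```

With incomplete information the "scenario" axis becomes infinite dimensional, which is exactly the computational difficulty the paper addresses.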
Formation of dense partonic matter in relativistic nucleus-nucleus collisions at RHIC: Experimental evaluation by the PHENIX collaboration
Extensive experimental data from high-energy nucleus-nucleus collisions were
recorded using the PHENIX detector at the Relativistic Heavy Ion Collider
(RHIC). The comprehensive set of measurements from the first three years of
RHIC operation includes charged particle multiplicities, transverse energy,
yield ratios and spectra of identified hadrons in a wide range of transverse
momenta (p_T), elliptic flow, two-particle correlations, non-statistical
fluctuations, and suppression of particle production at high p_T. The results
are examined with an emphasis on implications for the formation of a new state
of dense matter. We find that the state of matter created at RHIC cannot be
described in terms of ordinary color neutral hadrons.

Comment: 510 authors, 127 pages of text, 56 figures, 1 table, LaTeX. Submitted
to Nuclear Physics A as a regular article; v3 has minor changes in response
to referee comments. Plain text data tables for the points plotted in figures
for this and previous PHENIX publications are (or will be) publicly available
at http://www.phenix.bnl.gov/papers.htm
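The high-p_T suppression referred to above is conventionally quantified by the nuclear modification factor R_AA, the AA yield divided by the binary-collision-scaled pp yield; R_AA close to 1 would indicate a simple superposition of independent nucleon-nucleon collisions. A sketch with invented yields (not PHENIX data):

```python
import numpy as np

def r_aa(yield_AA, n_coll, yield_pp):
    """Nuclear modification factor: AA yield divided by the
    binary-collision-scaled pp yield. R_AA < 1 signals suppression."""
    return yield_AA / (n_coll * yield_pp)

# Toy per-event yields in a few high-p_T bins (arbitrary units,
# illustrative only) and an assumed <N_coll> for central collisions.
pT = np.array([4.0, 6.0, 8.0])       # GeV/c
yield_pp = np.array([1.0e-3, 2.0e-4, 5.0e-5])
yield_AA = np.array([0.25, 0.04, 0.01])
n_coll = 1000.0
suppression = r_aa(yield_AA, n_coll, yield_pp)
print(suppression)  # ~[0.25, 0.2, 0.2]: well below 1, i.e. suppressed
```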
Evidence of Color Coherence Effects in W+jets Events from ppbar Collisions at sqrt(s) = 1.8 TeV
We report the results of a study of color coherence effects in ppbar
collisions based on data collected by the D0 detector during the 1994-1995 run
of the Fermilab Tevatron Collider, at a center of mass energy sqrt(s) = 1.8
TeV. Initial-to-final state color interference effects are studied by examining
particle distribution patterns in events with a W boson and at least one jet.
The data are compared to Monte Carlo simulations with different color coherence
implementations and to an analytic modified-leading-logarithm perturbative
calculation based on the local parton-hadron duality hypothesis.

Comment: 13 pages, 6 figures. Submitted to Physics Letters
Measurement of the p-pbar -> Wgamma + X cross section at sqrt(s) = 1.96 TeV and WWgamma anomalous coupling limits
The WWgamma triple gauge boson coupling parameters are studied using p-pbar
-> l nu gamma + X (l = e,mu) events at sqrt(s) = 1.96 TeV. The data were
collected with the D0 detector from an integrated luminosity of 162 pb^{-1}
delivered by the Fermilab Tevatron Collider. The cross section times branching
fraction for p-pbar -> W(gamma) + X -> l nu gamma + X with E_T^{gamma} > 8 GeV
and Delta R_{l gamma} > 0.7 is 14.8 +/- 1.6 (stat) +/- 1.0 (syst) +/- 1.0 (lum)
pb. The one-dimensional 95% confidence level limits on anomalous couplings are
-0.88 < Delta kappa_{gamma} < 0.96 and -0.20 < lambda_{gamma} < 0.20.

Comment: Submitted to Phys. Rev. D Rapid Communications
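For quick comparisons, the quoted statistical, systematic, and luminosity uncertainties are often combined in quadrature under an independence assumption; for the 14.8 pb result this gives a total of about 2.1 pb:

```python
from math import sqrt

def combine_in_quadrature(*uncertainties):
    """Total uncertainty assuming the components are independent."""
    return sqrt(sum(u**2 for u in uncertainties))

# sigma x BR = 14.8 +/- 1.6 (stat) +/- 1.0 (syst) +/- 1.0 (lum) pb
total = combine_in_quadrature(1.6, 1.0, 1.0)
print(f"14.8 +/- {total:.1f} pb")  # 14.8 +/- 2.1 pb
```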
Measurement of the ttbar Production Cross Section in ppbar Collisions at sqrt{s} = 1.96 TeV using Kinematic Characteristics of Lepton + Jets Events
We present a measurement of the top quark pair ttbar production cross section
in ppbar collisions at a center-of-mass energy of 1.96 TeV using 230 pb**{-1}
of data collected by the D0 detector at the Fermilab Tevatron Collider. We
select events with one charged lepton (electron or muon), large missing
transverse energy, and at least four jets, and extract the ttbar content of the
sample based on the kinematic characteristics of the events. For a top quark
mass of 175 GeV, we measure sigma(ttbar) = 6.7 +1.4/-1.3 (stat) +1.6/-1.1
(syst) +/- 0.4 (lumi) pb, in good agreement with the standard model prediction.

Comment: submitted to Phys. Rev. Lett.
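Schematically, a counting-experiment cross section is extracted as sigma = (N_obs - N_bkg) / (efficiency x integrated luminosity). A sketch with invented event counts and selection efficiency (only the 230 pb^-1 luminosity comes from the abstract), chosen to land near the quoted central value:

```python
def cross_section_pb(n_obs, n_bkg, efficiency, lumi_pb_inv):
    """Counting-experiment cross section:
    sigma = (N_obs - N_bkg) / (efficiency * integrated luminosity)."""
    return (n_obs - n_bkg) / (efficiency * lumi_pb_inv)

# Invented counts and efficiency for illustration (not the D0 numbers);
# 230 pb^-1 is the integrated luminosity quoted in the abstract.
sigma = cross_section_pb(n_obs=250.0, n_bkg=96.0, efficiency=0.1,
                         lumi_pb_inv=230.0)
print(f"sigma(ttbar) ~ {sigma:.1f} pb")  # ~6.7 pb
```

In practice the ttbar content is extracted from a kinematic fit rather than a simple cut-and-count, but the luminosity and efficiency normalization enters the same way.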