Final results of Borexino Phase-I on low energy solar neutrino spectroscopy
Borexino has been running since May 2007 at the LNGS with the primary goal of
detecting solar neutrinos. The detector, a large, unsegmented liquid
scintillator calorimeter characterized by unprecedentedly low levels of
intrinsic radioactivity, is optimized for the study of the lower-energy part
of the spectrum. During Phase-I (2007-2010), Borexino first detected and then
precisely measured the flux of 7Be solar neutrinos, ruled out any
significant day-night asymmetry of their interaction rate, made the first
direct observation of the pep neutrinos, and set the tightest upper limit on
the flux of CNO neutrinos. In this paper we discuss the signal signature and
provide a comprehensive description of the backgrounds, quantify their event
rates, describe the methods for their identification, selection, or
subtraction, and describe the data analysis. Key features are an extensive in situ calibration
program using radioactive sources, the detailed modeling of the detector
response, the ability to define an innermost fiducial volume with extremely low
background via software cuts, and the excellent pulse-shape discrimination
capability of the scintillator that allows particle identification. We report a
measurement of the annual modulation of the 7Be neutrino interaction rate. The
period, the amplitude, and the phase of the observed modulation are consistent
with the solar origin of these events, and the absence of annual modulation is
rejected at more than 99% C.L. The physics implications of the Phase-I results
in the context of neutrino oscillation physics and solar models are presented.
Event impact analysis for time series
Time series arise in a variety of application domains: they appear whenever data points are recorded over time and stored for subsequent analysis. A critical question is whether the occurrence of events like natural disasters, technical faults, or political interventions leads to changes in a time series, for example, temporary deviations from its typical behavior. The vast majority of existing research on this topic focuses on the specific impact of a single event on a time series, while methods to generically capture the impact of a recurring event are scarce. In this thesis, we fill this gap by introducing a novel framework for event impact analysis in the case of randomly recurring events. We develop a statistical perspective on the problem and provide a generic notion of event impacts based on a statistical independence relation. The main problem we address is that of establishing the presence of event impacts in stationary time series using statistical independence tests. Tests for event impacts should be generic, powerful, and computationally efficient. We develop two algorithmic test strategies for event impacts that satisfy these properties. The first is based on coincidences between events and peaks in the time series, while the second is based on multiple marginal associations. We also discuss a selection of follow-up questions, including ways to measure, model, and visualize event impacts, and the relationship between event impact analysis and anomaly detection in time series. Finally, we provide a first method to study event impacts in nonstationary time series. We evaluate our methodological contributions on several real-world datasets and study their performance in large-scale simulation studies.
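The coincidence-based test strategy mentioned in the abstract can be illustrated with a small permutation test: count how often an event is followed, within a short window, by a peak of the series, and compare against events placed uniformly at random. This is only a hedged sketch of the general idea, not the thesis's actual test statistic; the window length, quantile threshold, and toy data are invented for illustration.

```python
import numpy as np

def coincidence_test(series, event_times, window=3, q=0.95,
                     n_perm=2000, rng=None):
    """Permutation test: do events coincide with subsequent peaks more
    often than randomly placed events would?

    A 'peak' is any point above the q-quantile of the series; an event
    'coincides' with a peak if one occurs within `window` steps after it.
    """
    rng = rng if rng is not None else np.random.default_rng(0)
    n = len(series)
    peaks = series > np.quantile(series, q)

    def stat(times):
        # Number of events with at least one peak in their window.
        return sum(peaks[t:min(t + window + 1, n)].any() for t in times)

    observed = stat(event_times)
    # Null distribution: event times drawn uniformly at random.
    null = [stat(rng.integers(0, n, size=len(event_times)))
            for _ in range(n_perm)]
    p_value = (1 + sum(s >= observed for s in null)) / (1 + n_perm)
    return observed, p_value

# Toy series: Gaussian noise with a bump one step after each event.
rng = np.random.default_rng(0)
events = [100, 300, 500, 700]
x = rng.normal(size=1000)
for t in events:
    x[t + 1] += 4.0
observed, p = coincidence_test(x, events, rng=rng)
```

Here the null hypothesis of "no event impact" is rejected at small p, because every event is deliberately followed by an outlier.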
Detection of dependence patterns with delay
The Unitary Events (UE) method is a popular and efficient method, used over
the last decade to detect dependence patterns of joint spiking activity among
simultaneously recorded neurons. The method as first introduced is based on a
binned coincidence count \citep{Grun1996} and can be applied to two or more
simultaneously recorded neurons. Among the improvements of the method, a
transposition to the continuous framework has recently been proposed in
\citep{muino2014frequent} and fully investigated in \citep{MTGAUE} for two
neurons. The goal of the present paper is to extend this study to more than two
neurons. The main result is the determination of the limit distribution of the
coincidence count, which leads to the construction of an independence test
between neurons. Finally, we propose a multiple testing procedure via the
Benjamini-Hochberg approach \citep{Benjamini1995}. All theoretical results are
illustrated by a simulation study and compared to the UE method proposed in
\citep{Grun2002}. Furthermore, our method is applied to real data.
Detector-Agnostic Phase-Space Distributions
The representation of quantum states via phase-space functions constitutes an
intuitive technique to characterize light. However, the reconstruction of such
distributions is challenging as it demands specific types of detectors and
detailed models thereof to account for their particular properties and
imperfections. To overcome these obstacles, we derive and implement a
measurement scheme that enables the reconstruction of phase-space
distributions for arbitrary states and whose functionality does not depend on
knowledge of the detectors, thus defining the notion of detector-agnostic phase-space
distributions. Our theory presents a generalization of well-known phase-space
quasiprobability distributions, such as the Wigner function. We implement our
measurement protocol, using state-of-the-art transition-edge sensors without
performing a detector characterization. Based on our approach, we reveal the
characteristic features of heralded single- and two-photon states in phase
space and certify their nonclassicality with high statistical significance.
Three-dimensional track reconstruction for directional Dark Matter detection
Directional detection of Dark Matter is a promising search strategy. However,
to perform such detection, a given set of parameters has to be retrieved from
the recoiling tracks: direction, sense, and position in the detector volume. In
order to optimize the track reconstruction and to fully exploit the data of
forthcoming directional detectors, we present a likelihood method dedicated to
3D track reconstruction. This new analysis method is applied to the MIMAC
detector. It requires a full simulation of track measurements in order to
compare real tracks to simulated ones. We conclude that a good spatial
resolution can be achieved, i.e. sub-mm in the anode plane and cm along the
drift axis. This opens the possibility to perform a fiducialization of
directional detectors. The angular resolution is shown to range between
20° and 80°, depending on the recoil energy, which is nonetheless
enough to achieve a high-significance discovery of Dark Matter. By contrast,
we show that the sense recognition capability of directional detectors
depends strongly on the recoil energy and the drift distance, with small
efficiency values (50%-70%). We suggest not considering this information,
either for exclusion or for discovery of Dark Matter, for recoils below
100 keV, and instead focusing on axial directional data.
Comment: 27 pages, 20 figures
Non-stationary Variance and Volatility Causality
This paper aims to describe estimation bias when non-stationary variance is not detected. We first present a theoretical multivariate GARCH model with structural changes in variance. We then characterize non-stationary variance and volatility causality in the case of the US and three developed Asian stock markets: Japan, Hong Kong, and Singapore. Daily data are used for the period May 30th, 2002 to June 29th, 2010.
Keywords: Multivariate GARCH, Non-linear VAR, Mean spillover, Volatility spillover, Structural break in variance, Market co-movement.
Estimating the mutual information between two discrete, asymmetric variables with limited samples
Determining the strength of nonlinear, statistical dependencies between two variables is a crucial matter in many research fields. The established measure for quantifying such relations is the mutual information. However, estimating mutual information from limited samples is a challenging task. Since the mutual information is the difference of two entropies, the existing Bayesian estimators of entropy may be used to estimate information. This procedure, however, is still biased in the severely under-sampled regime. Here, we propose an alternative estimator that is applicable to those cases in which the marginal distribution of one of the two variables (the one with minimal entropy) is well sampled. The other variable, as well as the joint and conditional distributions, can be severely undersampled. We obtain a consistent estimator that presents very low bias, outperforming previous methods even when the sampled data contain few coincidences. As with other Bayesian estimators, our proposal focuses on the strength of the interaction between the two variables, without seeking to model the specific way in which they are related. A distinctive property of our method is that the main data statistic determining the amount of mutual information is the inhomogeneity of the conditional distribution of the low-entropy variable in those states in which the large-entropy variable registers coincidences.
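The "difference of two entropies" route mentioned above is easiest to state with the naive plug-in estimator, which is precisely the baseline that becomes biased in the undersampled regime. A minimal sketch (the toy variables are invented; the paper's Bayesian estimator is not reproduced here):

```python
import numpy as np
from collections import Counter

def plugin_entropy(samples):
    """Naive (maximum-likelihood) entropy estimate in bits."""
    counts = np.array(list(Counter(samples).values()), dtype=float)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def plugin_mi(xs, ys):
    """Plug-in estimate of I(X;Y) = H(X) + H(Y) - H(X,Y)."""
    return (plugin_entropy(xs) + plugin_entropy(ys)
            - plugin_entropy(list(zip(xs, ys))))

# X fully determines a binary Y, so I(X;Y) = H(Y) = 1 bit.
xs = list(range(8)) * 4
ys = [x % 2 for x in xs]
mi = plugin_mi(xs, ys)  # → 1.0
```

With far fewer samples than states of X, the plug-in joint entropy is systematically underestimated and the MI overestimated, which is the failure mode the paper's estimator is designed to avoid when the low-entropy marginal (Y here) is well sampled.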
A Semiparametric Bayesian Model for Detecting Synchrony Among Multiple Neurons
We propose a scalable semiparametric Bayesian model to capture dependencies
among multiple neurons by detecting their co-firing (possibly with some lag
time) patterns over time. After discretizing time so there is at most one spike
at each interval, the resulting sequence of 1's (spike) and 0's (silence) for
each neuron is modeled using the logistic function of a continuous latent
variable with a Gaussian process prior. For multiple neurons, the corresponding
marginal distributions are coupled to their joint probability distribution
using a parametric copula model. The advantages of our approach are as follows:
the nonparametric component (i.e., the Gaussian process model) provides a
flexible framework for modeling the underlying firing rates; the parametric
component (i.e., the copula model) allows us to make inference regarding both
contemporaneous and lagged relationships among neurons; using the copula model,
we construct multivariate probabilistic models by separating the modeling of
univariate marginal distributions from the modeling of dependence structure
among variables; our method is easy to implement using a computationally
efficient sampling algorithm that can be easily extended to high dimensional
problems. Using simulated data, we show that our approach could correctly
capture temporal dependencies in firing rates and identify synchronous neurons.
We also apply our model to spike train data obtained from prefrontal cortical
areas of the rat brain.
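The preprocessing step described above, discretizing each train so there is at most one spike per bin, plus a crude lagged co-firing score, can be sketched as follows. This is only an illustrative stand-in: the paper's copula and Gaussian-process machinery is not reproduced, and the bin width, lag range, and toy trains are invented.

```python
import numpy as np

def binarize(spike_times, t_max, dt):
    """Discretize a spike train into 0/1 bins (at most one spike per bin)."""
    bins = np.zeros(int(round(t_max / dt)), dtype=int)
    idx = np.minimum((np.asarray(spike_times) / dt).astype(int), len(bins) - 1)
    bins[idx] = 1
    return bins

def best_lag(a, b, max_lag=5):
    """Lag of train b relative to a with the highest co-firing count.
    A crude stand-in for the copula model's lagged-dependence inference."""
    scores = {lag: int(np.sum(a[:len(a) - lag] & b[lag:]))
              for lag in range(max_lag + 1)}
    return max(scores, key=scores.get), scores

rng = np.random.default_rng(2)
a = binarize(np.sort(rng.uniform(0.0, 1.0, 30)), t_max=1.0, dt=0.01)
b = np.roll(a, 2)             # neuron B echoes neuron A two bins later
lag, scores = best_lag(a, b)  # → lag == 2
```

The semiparametric model goes well beyond this raw count: the Gaussian-process prior models each neuron's firing rate, and the copula couples the marginals so that both contemporaneous and lagged dependence can be inferred jointly.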
Multi-resolution two-sample comparison through the divide-merge Markov tree
We introduce a probabilistic framework for two-sample comparison based on a
nonparametric process taking the form of a Markov model that transitions
between a "divide" and a "merge" state on a multi-resolution partition tree of
the sample space. Multi-scale two-sample comparison is achieved through
inferring the underlying state of the process along the partition tree. The
Markov design allows the process to incorporate spatial clustering of
differential structures, which is commonly observed in two-sample problems but
ignored by existing methods. Inference is carried out under the Bayesian
paradigm through recursive propagation algorithms. We demonstrate our method
on simulated data and a real flow cytometry data set, and show that it
substantially outperforms other state-of-the-art two-sample tests in several
settings.
Comment: Corrected typos. Added Software section.
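The idea of comparing two samples at multiple resolutions along a partition tree can be illustrated with a bare-bones recursion: halve the interval repeatedly and, at each node, compare the fraction of each sample falling in the left child. This frequentist z-statistic sketch is an invented stand-in; the paper's Bayesian divide-merge Markov model, which also couples neighboring nodes, is not reproduced.

```python
import numpy as np

def multires_compare(x, y, lo=0.0, hi=1.0, depth=0, max_depth=4, results=None):
    """Recursively halve [lo, hi) and, at each node, compare the fraction of
    x vs. y samples falling in the left child via a two-proportion
    z-statistic. Returns a list of ((lo, hi), z) pairs, one per visited node."""
    if results is None:
        results = []
    nx, ny = len(x), len(y)
    if depth >= max_depth or nx < 10 or ny < 10:
        return results
    mid = (lo + hi) / 2
    px, py = np.mean(x < mid), np.mean(y < mid)
    p = (px * nx + py * ny) / (nx + ny)          # pooled proportion
    se = np.sqrt(p * (1 - p) * (1 / nx + 1 / ny)) + 1e-12
    results.append(((lo, hi), float((px - py) / se)))
    multires_compare(x[x < mid], y[y < mid], lo, mid, depth + 1, max_depth, results)
    multires_compare(x[x >= mid], y[y >= mid], mid, hi, depth + 1, max_depth, results)
    return results

# Two samples that agree on [0, 0.5] but differ on (0.5, 1].
rng = np.random.default_rng(3)
x = rng.uniform(0.0, 1.0, 2000)
y = rng.uniform(0.0, 1.0, 2000)
y[y > 0.5] = 0.5 + 2.0 * (y[y > 0.5] - 0.5) ** 2   # distort upper half only
res = multires_compare(x, y)
top = max(res, key=lambda node: abs(node[1]))       # most discrepant node
```

Because the distortion is confined to the upper half, the largest discrepancies concentrate in nodes with lo >= 0.5, which is the kind of spatially clustered differential structure the Markov design is built to exploit.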