Effect of limited statistics on higher order cumulants measurement in heavy-ion collision experiments
We have studied the effect of limited event statistics on the measurement of different orders of cumulants of the net-proton distribution, assuming that the proton and antiproton distributions follow Poissonian and binomial distributions with initial parameters determined from experimental results for the two top center-of-mass energies ( and GeV) in the most central ( %) Au+Au collisions at the Relativistic Heavy Ion Collider (RHIC). In this simulation we observe that the central values of the higher-order cumulants depend strongly on the event sample size and, owing to statistical randomness, can even become negative. We also present a study of the determination of the statistical error on cumulants using the delta theorem, bootstrap, and sub-group methods, and verify their suitability with a Monte Carlo procedure. Based on our study, we find that the bootstrap method provides a robust way to estimate the statistical error on higher-order cumulants. We also present exclusion limits on the minimum event statistics needed to determine the cumulants if the signal strength (phase transition or critical point) is at a level of % and % above the statistical level. This study will help experiments arrive at the minimum required event statistics and choose a proper method for statistical error estimation in higher-order cumulant measurements.
Comment: 14 pages, 16 figures
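The cumulant and bootstrap machinery described above can be sketched in a few lines. This is an illustrative toy, not the paper's analysis code: the Poisson parameters, sample size, and bootstrap count below are arbitrary choices, and net-proton events are drawn as the difference of two Poissonians (a Skellam distribution, for which the fourth cumulant is the sum of the two means).

```python
import numpy as np

rng = np.random.default_rng(0)

def cumulants(x):
    """First four cumulants from central moments:
    C1 = mean, C2 = m2, C3 = m3, C4 = m4 - 3*m2^2."""
    m = x.mean()
    d = x - m
    m2, m3, m4 = (d**2).mean(), (d**3).mean(), (d**4).mean()
    return m, m2, m3, m4 - 3.0 * m2**2

def bootstrap_error(x, order, n_boot=200, rng=rng):
    """Statistical error of one cumulant as the spread over
    bootstrap resamples of the event sample."""
    vals = [cumulants(rng.choice(x, size=x.size, replace=True))[order - 1]
            for _ in range(n_boot)]
    return float(np.std(vals))

# Toy net-proton events: protons and antiprotons Poissonian,
# so net-proton is Skellam with C4 = mu_p + mu_pbar.
mu_p, mu_pbar, n_events = 5.0, 4.0, 20_000
net = rng.poisson(mu_p, n_events) - rng.poisson(mu_pbar, n_events)

c1, c2, c3, c4 = cumulants(net)
err4 = bootstrap_error(net, order=4)
print(f"C4 = {c4:.2f} +/- {err4:.2f} (Skellam expectation {mu_p + mu_pbar})")
```

Shrinking `n_events` in this sketch reproduces the qualitative effect the abstract reports: the central value of C4 fluctuates strongly and can turn negative for small samples.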
Expansions of GMM statistics that indicate their properties under weak and/or many instruments and the bootstrap
We construct higher-order expressions for Wald and Lagrange multiplier (LM) GMM statistics that are based on two-step and continuous updating estimators (CUE). We show that the sensitivity of the limit distribution to weak and many instruments results from superfluous elements in the higher-order expansion. When the instruments are strong and their number is small, these elements are of higher order and result in higher-order biases. When instruments are weak and/or their number is large, however, they are of zeroth order and influence the limiting distributions. Edgeworth approximations do not remove the superfluous elements. The expansion of the LM-CUE statistic, which is Kleibergen's (2003) K-statistic, does not contain the superfluous higher-order elements, so it is robust to weak or many instruments. An Edgeworth approximation of its finite-sample distribution shows that the bootstrap reduces the size distortion. We compute power curves for tests on the autocorrelation parameter in a panel autoregressive model to illustrate the consequences of the higher-order terms and the improvement that results from applying the bootstrap.
Keywords: GMM, weak instruments, bootstrap, panel AR(1)
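The bootstrap's role here, reducing the size distortion of a test whose asymptotic approximation is poor in finite samples, can be illustrated with a deliberately simple stand-in: a t-type statistic on a skewed sample, with the null distribution bootstrapped by recentring. This is a generic sketch of the bootstrap principle, not the paper's K-statistic or its panel AR(1) setup.

```python
import numpy as np

rng = np.random.default_rng(1)

def t_stat(x, mu0):
    """t-type statistic for H0: mean = mu0."""
    return np.sqrt(x.size) * (x.mean() - mu0) / x.std(ddof=1)

# Small, skewed sample: asymptotic N(0,1) critical values are distorted.
x = rng.exponential(1.0, size=30)

# Bootstrap the null distribution by recentring at the sample mean,
# so every resampled statistic is computed under its own null.
boot = np.array([t_stat(rng.choice(x, x.size, replace=True), x.mean())
                 for _ in range(2000)])
crit_lo, crit_hi = np.quantile(boot, [0.025, 0.975])
print(f"asymptotic +/-1.96 vs bootstrap ({crit_lo:.2f}, {crit_hi:.2f})")
```

The bootstrap critical values come out asymmetric for skewed data, which is exactly the finite-sample correction an Edgeworth analysis formalizes.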
A robust method for diagnosis of morphological arrhythmias based on Hermitian model of higher-order statistics
Abstract
Background: The electrocardiography (ECG) signal is a primary criterion for medical practitioners to diagnose heart diseases. The development of a reliable, accurate, non-invasive, and robust method for arrhythmia detection could assist cardiologists in the study of patients with heart diseases. This paper provides a method for detecting morphological heart arrhythmias, which may have different shapes within one category and different morphologies across patients. The distinctive property of this method, in addition to its accuracy, is its robustness in the presence of Gaussian noise, time shift, and amplitude shift.
Methods: In this work, the 2nd-, 3rd-, and 4th-order cumulants of each ECG beat are calculated and modeled by linear combinations of Hermitian basis functions. The parameters of each cumulant model are then used as feature vectors to classify five different ECG beats, namely Normal, PVC, APC, RBBB, and LBBB, using a 1-Nearest Neighbor (1-NN) classifier. Finally, after classifying each model, a decision rule is applied to these classes and the type of ECG beat is determined.
Results: The experiment was applied to a set of ECG beats consisting of 9367 samples in 5 categories from the MIT/BIH heart arrhythmia database. A specificity of 99.67% and a sensitivity of 98.66% in arrhythmia detection are achieved, which indicates the power of the algorithm. The accuracy of the system also remained almost intact in the presence of Gaussian noise, time shift, and amplitude shift of the ECG signals.
Conclusions: This paper presents a novel and robust methodology for morphological heart arrhythmia detection, based on the Hermite model of higher-order statistics (HOS). The ability of HOS to suppress morphological variations of different class-specific arrhythmias, and to reduce the effects of Gaussian noise, makes it suitable for detecting morphological heart arrhythmias. The proposed method exploits these properties in conjunction with the Hermitian model to perform efficient and reliable classification of five morphological heart arrhythmias, and the processing time per beat is less than the period of a normal beat.
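The core pipeline step, modeling a cumulant sequence of a beat by a linear combination of Hermite basis functions and keeping the coefficients as features, can be sketched as follows. This is a minimal illustration under simplifying assumptions: only the 2nd-order cumulant sequence (the autocovariance) is modeled, the time support of the basis is fixed arbitrarily, and beat extraction from the raw ECG is not shown.

```python
import numpy as np
from math import factorial, pi, sqrt
from numpy.polynomial.hermite import hermval

def hermite_basis(t, n_funcs):
    """Orthonormal Hermite functions phi_0..phi_{n_funcs-1} sampled at t:
    phi_n(t) = (2^n n! sqrt(pi))^(-1/2) exp(-t^2/2) H_n(t)."""
    B = np.empty((t.size, n_funcs))
    for n in range(n_funcs):
        coeffs = np.zeros(n + 1)
        coeffs[n] = 1.0                      # select physicists' H_n
        norm = 1.0 / sqrt(2.0**n * factorial(n) * sqrt(pi))
        B[:, n] = norm * np.exp(-t**2 / 2.0) * hermval(t, coeffs)
    return B

def hermite_features(beat, n_funcs=6):
    """Least-squares Hermite coefficients of a beat's 2nd-order
    cumulant sequence (autocovariance), used as a feature vector."""
    x = beat - beat.mean()
    c2 = np.correlate(x, x, mode="full") / x.size   # autocovariance sequence
    t = np.linspace(-4.0, 4.0, c2.size)             # assumed time support
    B = hermite_basis(t, n_funcs)
    coef, *_ = np.linalg.lstsq(B, c2, rcond=None)
    return coef

def nn1_classify(feat, train_feats, train_labels):
    """1-NN decision in the Hermite-coefficient feature space."""
    d = np.linalg.norm(train_feats - feat, axis=1)
    return train_labels[int(np.argmin(d))]
```

Because the Hermite functions form an orthonormal family, a handful of coefficients gives a compact, shift-tolerant summary of the cumulant shape, which is what the 1-NN classifier compares.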
An improved cosmological parameter inference scheme motivated by deep learning
Dark matter cannot be observed directly, but its weak gravitational lensing
slightly distorts the apparent shapes of background galaxies, making weak
lensing one of the most promising probes of cosmology. Several observational
studies have measured the effect, and there are currently running, and planned
efforts to provide even larger, and higher resolution weak lensing maps. Due to
nonlinearities on small scales, the traditional analysis with two-point
statistics does not fully capture all the underlying information. Multiple
inference methods were proposed to extract more details based on higher order
statistics, peak statistics, Minkowski functionals and recently convolutional
neural networks (CNN). Here we present an improved convolutional neural network that gives significantly better estimates of cosmological parameters from simulated convergence maps than state-of-the-art methods, and is also free of systematic bias. We show that the network exploits
information in the gradients around peaks, and with this insight, we construct
a new, easy-to-understand, and robust peak counting algorithm based on the
'steepness' of peaks, instead of their heights. The proposed scheme is even
more accurate than the neural network on high-resolution noiseless maps. With
shape noise and lower resolution, its relative advantage deteriorates, but it remains more accurate than peak counting.
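A steepness-based peak statistic of the kind described above can be sketched directly: find strict local maxima of a 2-D convergence map and score each by the gradient magnitude in its neighbourhood rather than by its height. The window size and the "mean gradient over a 3x3 neighbourhood" definition below are illustrative choices, not the paper's exact algorithm.

```python
import numpy as np

def peak_steepness(kappa):
    """Return a 'steepness' score for every strict local maximum of a
    2-D convergence map: the mean gradient magnitude over the 3x3
    neighbourhood of the peak (map boundary pixels are excluded)."""
    gy, gx = np.gradient(kappa)
    grad = np.hypot(gx, gy)
    k = kappa[1:-1, 1:-1]
    is_peak = np.ones_like(k, dtype=bool)
    for dy in (-1, 0, 1):                    # compare with all 8 neighbours
        for dx in (-1, 0, 1):
            if dy == 0 and dx == 0:
                continue
            is_peak &= k > kappa[1 + dy:kappa.shape[0] - 1 + dy,
                                 1 + dx:kappa.shape[1] - 1 + dx]
    ys, xs = np.nonzero(is_peak)             # interior peak coordinates
    return np.array([grad[y:y + 3, x:x + 3].mean() for y, x in zip(ys, xs)])

# Usage: histogram peak steepness instead of peak height.
rng = np.random.default_rng(2)
kappa = rng.normal(size=(128, 128))          # stand-in for a convergence map
counts, edges = np.histogram(peak_steepness(kappa), bins=10)
```

The resulting histogram of steepness values plays the role that the histogram of peak heights plays in conventional peak counting.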
Second order statistics based blind source separation for artifact correction of short ERP epochs
An ERP is commonly obtained by averaging over segmented EEG epochs. If artifacts are present in the raw EEG measurement, pre-processing is required to prevent the averaged ERP waveform from being contaminated by artifacts. The simplest pre-processing approach is to reject trials in which an artifact is detected. Alternatively, artifact correction instead of rejection can be performed by blind source separation, so that ERP trials are not wasted. In this paper, we propose a second-order-statistics-based blind source separation approach to ERP artifact correction. Compared with blind separation using independent component analysis, the second-order-statistics-based method does not rely on higher-order statistics or signal entropy, and therefore leads to more robust separation even when only short epochs are available.
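Second-order-statistics blind source separation can be illustrated with an AMUSE-style procedure: whiten the multichannel data with the zero-lag covariance, then diagonalize a time-lagged covariance to resolve the remaining rotation. This is a generic sketch of the family of methods, not necessarily the specific algorithm the paper proposes; the lag and the toy two-channel mixture are arbitrary.

```python
import numpy as np

def amuse(X, lag=1):
    """AMUSE-style second-order BSS. X has shape (n_channels, n_samples).
    Sources are recoverable when their lag-`lag` autocorrelations differ."""
    X = X - X.mean(axis=1, keepdims=True)
    C0 = X @ X.T / X.shape[1]                # zero-lag covariance
    d, E = np.linalg.eigh(C0)
    W = E @ np.diag(d ** -0.5) @ E.T         # whitening matrix
    Z = W @ X
    C1 = Z[:, lag:] @ Z[:, :-lag].T / (Z.shape[1] - lag)
    C1 = (C1 + C1.T) / 2.0                   # symmetrized lagged covariance
    _, V = np.linalg.eigh(C1)                # its eigenvectors unmix Z
    return V.T @ Z                           # estimated sources

# Toy demo: two sources with distinct autocorrelation, linearly mixed.
t = np.arange(2000.0)
S = np.vstack([np.sin(0.05 * t), np.sin(0.7 * t)])
X = np.array([[1.0, 0.6], [0.4, 1.0]]) @ S   # mixed "EEG channels"
recovered = amuse(X)
```

For artifact correction, the separated components identified as artifacts (e.g. ocular activity) would be zeroed before projecting back to the channel space.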
Higher order feature extraction and selection for robust human gesture recognition using CSI of COTS Wi-Fi devices
Device-free human gesture recognition (HGR) using commercial off-the-shelf (COTS) Wi-Fi
devices has gained attention with recent advances in wireless technology. HGR recognizes the human
activity performed by capturing the reflections of Wi-Fi signals from moving humans and storing
them as raw channel state information (CSI) traces. Existing work on HGR applies noise reduction
and transformation to pre-process the raw CSI traces. However, these methods fail to capture
the non-Gaussian information in the raw CSI data because they deal only with linear signal
representations. The proposed higher order statistics-based recognition (HOS-Re) model extracts
higher order statistical (HOS) features from raw CSI traces and selects a robust feature subset for the
recognition task. HOS-Re addresses the limitations of the existing methods by extracting third-order
cumulant features that maximize the recognition accuracy. Subsequently, feature selection methods
derived from information theory construct a robust and highly informative feature subset, fed as
input to the multilevel support vector machine (SVM) classifier in order to measure the performance.
The proposed methodology is validated using a public database SignFi, consisting of 276 gestures
with 8280 gesture instances, out of which 5520 are from the laboratory and 2760 from the home
environment using a 10 × 5 cross-validation. HOS-Re achieved an average recognition accuracy of
97.84%, 98.26% and 96.34% for the lab, home and lab + home environments, respectively. The average recognition accuracy for 150 sign gestures with 7500 instances, collected from five different users, was 96.23% in the laboratory environment.
Funding: Taylor's University through its TAYLOR'S PhD SCHOLARSHIP Programme.
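The third-order-cumulant feature extraction at the heart of HOS-Re can be sketched for a single 1-D trace. This is an illustrative estimator, not the paper's implementation: the lag range is an arbitrary choice, and a real pipeline would apply it per CSI subcarrier before feature selection and the SVM. The key HOS property it demonstrates is that third-order cumulants of Gaussian noise vanish, which is why such features suppress Gaussian channel noise.

```python
import numpy as np

def third_order_cumulant(x, max_lag=5):
    """Estimate C3(i, j) = E[x(n) x(n+i) x(n+j)] for a zero-mean 1-D
    signal over lags 0..max_lag. For Gaussian input, C3 -> 0."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    n = x.size
    C = np.zeros((max_lag + 1, max_lag + 1))
    for i in range(max_lag + 1):
        for j in range(max_lag + 1):
            m = n - max(i, j)                # overlap length for this lag pair
            C[i, j] = np.mean(x[:m] * x[i:i + m] * x[j:j + m])
    return C

def hos_features(csi_trace, max_lag=5):
    """Flattened third-order cumulant matrix as a HOS feature vector
    for one CSI amplitude trace."""
    return third_order_cumulant(csi_trace, max_lag).ravel()
```

Feeding `hos_features` outputs (after information-theoretic feature selection) into a multilevel SVM would complete the recognition pipeline the abstract describes.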