Statistical Agent Based Modelization of the Phenomenon of Drug Abuse
We introduce a statistical agent-based model to describe the phenomenon of drug abuse and its dynamical evolution at the individual and global level. The agents are heterogeneous with respect to their intrinsic inclination toward drugs, their budget attitude, and their social environment. The various levels of drug use were inspired by the professional description of the phenomenon, which permits a direct comparison with all available data. We show that certain elements are highly important in starting drug use, for example rare events in personal experience that allow the barrier to occasional drug use to be overcome. Analyzing how the system reacts to perturbations is very important for understanding its key elements, and it provides strategies for effective policy making. The present model represents the first step toward a realistic description of this phenomenon and can be easily generalized in various directions. Comment: 12 pages, 5 figures
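As a rough illustration of the kind of model the abstract describes, the sketch below sets up heterogeneous agents with a hypothetical propensity rule and rare triggering events; the attribute names, weights, and update rule are illustrative assumptions, not the authors' actual specification.

```python
import random

class Agent:
    """Hypothetical agent with heterogeneous traits, loosely following the abstract."""
    def __init__(self, rng):
        self.inclination = rng.random()   # intrinsic inclination toward drugs
        self.budget = rng.random()        # budget attitude
        self.social = rng.random()        # social-environment pressure
        self.state = 0                    # 0 = non-user, 1 = occasional user

    def step(self, rng, rare_event_prob=0.01):
        # A rare personal event can push an agent over the barrier to occasional use.
        trigger = rng.random() < rare_event_prob
        propensity = 0.5 * self.inclination + 0.3 * self.social + 0.2 * self.budget
        if self.state == 0 and trigger and rng.random() < propensity:
            self.state = 1

rng = random.Random(0)
agents = [Agent(rng) for _ in range(10_000)]
for _ in range(100):                      # 100 time steps
    for a in agents:
        a.step(rng)
print("occasional users:", sum(a.state for a in agents))
```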
Statistical Properties of Height of Japanese Schoolchildren
We study the height distributions of Japanese schoolchildren based on statistical data obtained from the school health survey conducted by the Ministry of Education, Culture, Sports, Science and Technology, Japan. Our analysis clarifies that the height distribution changes from a lognormal distribution to a normal distribution during puberty. Comment: 2 pages, 2 figures, submitted to J. Phys. Soc. Jpn.; resubmitted to J. Phys. Soc. Jpn. after some revision
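To make the lognormal-versus-normal comparison concrete, here is a minimal sketch that fits both distributions to a sample of heights and compares log-likelihoods; the data are synthetic and the function name is hypothetical, since the survey data themselves are not reproduced here.

```python
import numpy as np
from scipy import stats

def better_fit(heights):
    """Compare normal vs. lognormal fits by log-likelihood (illustrative only)."""
    mu, sigma = stats.norm.fit(heights)
    ll_norm = stats.norm.logpdf(heights, mu, sigma).sum()
    shape, loc, scale = stats.lognorm.fit(heights, floc=0)
    ll_lognorm = stats.lognorm.logpdf(heights, shape, loc, scale).sum()
    return ("lognormal" if ll_lognorm > ll_norm else "normal", ll_lognorm - ll_norm)

# Synthetic example: heights (cm) of a pre-puberty-like cohort drawn from a lognormal.
rng = np.random.default_rng(1)
heights = rng.lognormal(mean=np.log(135.0), sigma=0.05, size=5000)
print(better_fit(heights))
```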
Carrier-envelope offset stable, coherently combined ytterbium-doped fiber CPA delivering 1 kW of average power
We present a carrier-envelope offset (CEO) stable ytterbium-doped fiber chirped-pulse amplification system employing the technology of coherent beam combining and delivering more than 1 kW of average power at a pulse repetition rate of 80 MHz. The CEO stability of the system is 220 mrad rms, characterized out-of-loop with an f-to-2f interferometer in a frequency offset range of 10 Hz to 20 MHz. The high-power amplification system boosts the average power of the CEO stable oscillator by five orders of magnitude while increasing the phase noise by only 100 mrad. No evidence of CEO noise deterioration due to coherent beam combining is found. Low-frequency CEO fluctuations at the chirped-pulse amplifier are suppressed by a "slow loop" feedback. To the best of our knowledge, this is the first demonstration of a coherently combined laser system delivering an outstanding average power and high CEO stability at the same time. © 2020 Optical Society of America
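The quoted 220 mrad rms figure corresponds to integrating the phase-noise power spectral density over the stated 10 Hz to 20 MHz offset range. The sketch below shows that bookkeeping with a placeholder PSD; the spectrum values are made up and do not represent the measured system.

```python
import numpy as np

def rms_phase_noise(freqs_hz, psd_rad2_per_hz):
    """Integrate a one-sided phase-noise PSD S_phi(f) [rad^2/Hz] to an rms phase [rad]."""
    df = np.diff(freqs_hz)
    mid = 0.5 * (psd_rad2_per_hz[1:] + psd_rad2_per_hz[:-1])   # trapezoidal rule
    return np.sqrt(np.sum(mid * df))

# Placeholder 1/f-like spectrum sampled from 10 Hz to 20 MHz (values are illustrative only).
f = np.logspace(1, np.log10(20e6), 2000)
psd = 1e-3 / f + 1e-12
print(f"integrated phase noise: {rms_phase_noise(f, psd) * 1e3:.0f} mrad rms")
```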
Problems with Using the Normal Distribution – and Ways to Improve Quality and Efficiency of Data Analysis
Background: The Gaussian or normal distribution is the most established model to characterize quantitative variation of original data. Accordingly, data are summarized using the arithmetic mean and the standard deviation, by x̄ ± SD, or with the standard error of the mean, x̄ ± SEM. This, together with corresponding bars in graphical displays, has become the standard for characterizing variation.

Methodology/Principal Findings: Here we question the adequacy of this characterization, and of the model. The published literature provides numerous examples for which such descriptions appear inappropriate because, based on the "95% range check", their distributions are obviously skewed. In these cases, the symmetric characterization is a poor description and may trigger wrong conclusions. To solve the problem, it is enlightening to consider the causes of variation. Multiplicative causes are in general far more important than additive ones and benefit from a multiplicative (or log-) normal approach. Fortunately, quite similar to the normal, the log-normal distribution can now be handled easily and characterized at the level of the original data with the help of a new sign, ×/ ("times-divide"), and notation. Analogous to x̄ ± SD, it connects the multiplicative (or geometric) mean x* and the multiplicative standard deviation s* in the form x* ×/ s*, which is advantageous and recommended.

Conclusions/Significance: The corresponding shift from the symmetric to the asymmetric view will substantially increase the quality and efficiency of data analysis.
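A minimal sketch of how the x* ×/ s* summary can be computed for a positive-valued sample, assuming only that the geometric mean and the multiplicative standard deviation are taken as the exponential of the mean and standard deviation of the logarithms; the sample here is synthetic.

```python
import numpy as np

def multiplicative_summary(x):
    """Geometric mean x* and multiplicative standard deviation s* of positive data."""
    logs = np.log(x)
    x_star = np.exp(logs.mean())
    s_star = np.exp(logs.std(ddof=1))
    # The interval x*/s* to x*·s* covers roughly 68% of a lognormal sample.
    return x_star, s_star, (x_star / s_star, x_star * s_star)

rng = np.random.default_rng(0)
sample = rng.lognormal(mean=2.0, sigma=0.4, size=1000)   # synthetic skewed data
x_star, s_star, interval = multiplicative_summary(sample)
print(f"x* = {x_star:.2f}, s* = {s_star:.2f}, 68% range = {interval[0]:.2f} to {interval[1]:.2f}")
```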
Tiling array data analysis: a multiscale approach using wavelets
Background: Tiling array data is hard to interpret due to noise. The wavelet transformation is a widely used technique in signal processing for elucidating the true signal from noisy data. Consequently, we attempted to denoise representative tiling array datasets for ChIP-chip experiments using wavelets. In doing this, we used specific wavelet basis functions, Coiflets, since their triangular shape closely resembles the expected profiles of true ChIP-chip peaks.

Results: In our wavelet-transformed data, we observed that noise tends to be confined to small scales while the useful signal-of-interest spans multiple large scales. We were also able to show that wavelet coefficients due to non-specific cross-hybridization follow a log-normal distribution, and we used this fact in developing a thresholding procedure. In particular, wavelets allow one to set an unambiguous, absolute threshold, which has been hard to define in ChIP-chip experiments. One can set this threshold by requiring a similar confidence level at different length-scales of the transformed signal. We applied our algorithm to a number of representative ChIP-chip data sets, including those of Pol II and histone modifications, which have a diverse distribution of length-scales of biochemical activity, including some broad peaks.

Conclusions: Finally, we benchmarked our method in comparison to other approaches for scoring ChIP-chip data using spike-ins on the ENCODE Nimblegen tiling array. This comparison demonstrated excellent performance, with wavelets getting the best overall score.
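The following sketch illustrates the general idea of scale-wise thresholding with Coiflet wavelets, assuming the PyWavelets package; the wavelet choice, decomposition level, and the lognormal-quantile threshold are illustrative assumptions rather than the authors' published procedure.

```python
import numpy as np
import pywt

def wavelet_threshold(signal, wavelet="coif3", level=5):
    """Decompose a noisy track with Coiflets and zero small detail coefficients
    below a per-scale threshold estimated from a lognormal fit of |coefficients|."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    kept = [coeffs[0]]                                   # keep approximation coefficients
    for detail in coeffs[1:]:
        logs = np.log(np.abs(detail) + 1e-12)
        thr = np.exp(logs.mean() + 2.33 * logs.std())    # ~99th percentile of the fitted lognormal
        kept.append(pywt.threshold(detail, thr, mode="hard"))
    return pywt.waverec(kept, wavelet)

# Synthetic probe track: one broad peak plus noise (stand-in for real ChIP-chip data).
rng = np.random.default_rng(0)
track = np.exp(-np.linspace(-4, 4, 1024) ** 2) + 0.2 * rng.standard_normal(1024)
denoised = wavelet_threshold(track)
```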
Ground states of two-dimensional ±J Edwards-Anderson spin glasses
We present an exact algorithm for finding all the ground states of the two-dimensional ±J Edwards-Anderson spin glass and characterize its performance. We investigate how the ground states change with increasing system size and with increasing antiferromagnetic bond ratio. We find that some system properties show very large and strongly non-Gaussian variations between realizations. Comment: 15 pages, 21 figures, 2 tables, uses revtex4 macros
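For very small lattices the ground-state energy and its degeneracy can be checked by exhaustive enumeration, which is a useful sanity check even though it is not the exact algorithm the paper presents; the sketch below brute-forces a ±J lattice with open boundaries (couplings drawn at random, system size kept tiny on purpose).

```python
import itertools
import numpy as np

def ground_states_bruteforce(L, bonds_h, bonds_v):
    """Enumerate all spin configurations of an L x L +/-J lattice (open boundaries)
    and return the minimal energy and its degeneracy. Exponential cost; tiny L only."""
    best_e, count = None, 0
    for conf in itertools.product((-1, 1), repeat=L * L):
        s = np.array(conf).reshape(L, L)
        e = -(np.sum(bonds_h * s[:, :-1] * s[:, 1:]) +
              np.sum(bonds_v * s[:-1, :] * s[1:, :]))
        if best_e is None or e < best_e:
            best_e, count = e, 1
        elif e == best_e:
            count += 1
    return best_e, count

rng = np.random.default_rng(0)
L = 4
bonds_h = rng.choice([-1, 1], size=(L, L - 1))   # horizontal +/-J couplings
bonds_v = rng.choice([-1, 1], size=(L - 1, L))   # vertical +/-J couplings
print(ground_states_bruteforce(L, bonds_h, bonds_v))
```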
Neuronal Variability during Handwriting: Lognormal Distribution
We examined the time-dependent statistical properties of electromyographic (EMG) signals recorded from intrinsic hand muscles during handwriting. Our analysis showed that the trial-to-trial neuronal variability of EMG signals is well described by the lognormal distribution and is clearly distinguished from the Gaussian (normal) distribution. This finding indicates that EMG formation cannot be described by a conventional model in which the signal is normally distributed because it is composed of the summation of many random sources. We found that the variability of temporal parameters of handwriting (handwriting duration and response time) is also well described by a lognormal distribution. Although the exact mechanism of the lognormal statistics remains an open question, the results obtained should significantly impact experimental research, theoretical modeling, and bioengineering applications of motor networks. In particular, our results suggest that accounting for the lognormal distribution of EMGs can improve biomimetic systems that strive to reproduce EMG signals in artificial actuators.
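One simple way to see the reported effect in one's own recordings is to test normality before and after a log transform; the snippet below does this with the Shapiro-Wilk test on synthetic amplitude data (the values are placeholders, not the EMG recordings from the study).

```python
import numpy as np
from scipy import stats

# Hypothetical trial-to-trial EMG amplitudes (arbitrary units, synthetic).
rng = np.random.default_rng(2)
amplitudes = rng.lognormal(mean=0.0, sigma=0.5, size=200)

# If the data are lognormal, the raw values should fail a normality test
# while their logarithms should pass it.
p_raw = stats.shapiro(amplitudes).pvalue
p_log = stats.shapiro(np.log(amplitudes)).pvalue
print(f"Shapiro-Wilk p (raw) = {p_raw:.3g}, p (log) = {p_log:.3g}")
```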