Facing Non-Stationary Conditions with a New Indicator of Entropy Increase: The Cassandra Algorithm
We address the problem of detecting non-stationary effects in time series (in
particular fractal time series) by means of the Diffusion Entropy Method (DEM),
in which the experimental sequence under study is explored with a moving
window. The DEM makes wise use of the available statistical information and,
consequently, in spite of the modest size of the window used, succeeds in
revealing local statistical properties and shows how they change as the window
moves along the experimental sequence. The method is also expected to predict
catastrophic events before their occurrence.
Comment: FRACTAL 2002 (Spain)
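The abstract describes the DEM only qualitatively. The sketch below is illustrative, not the authors' implementation (the window-sum diffusion step and the histogram entropy estimator with bin width set to a tenth of the spread are assumptions): the Shannon entropy of the diffusion displacements grows as δ ln t, with δ = 0.5 for ordinary (uncorrelated Gaussian) diffusion, and departures of the slope from 0.5 flag anomalous scaling.

```python
import numpy as np

def diffusion_entropy(xi, t):
    """Shannon entropy S(t) of the 'diffusion' displacements obtained
    by summing the series over all overlapping windows of length t."""
    x = np.convolve(xi, np.ones(t), mode="valid")   # window sums
    h = x.std() / 10                                # histogram bin width
    bins = np.arange(x.min(), x.max() + h, h)
    p, _ = np.histogram(x, bins=bins, density=True)
    p = p[p > 0]
    return -np.sum(h * p * np.log(p))               # ~ -integral p ln p dx

# For uncorrelated Gaussian noise S(t) = const + 0.5 ln t, so the slope
# estimated between two window sizes should be near 0.5.
rng = np.random.default_rng(0)
xi = rng.standard_normal(200_000)
delta = (diffusion_entropy(xi, 64) - diffusion_entropy(xi, 16)) / np.log(4)
```

In the Cassandra setting, a slope of this kind would be tracked on windows moved along the sequence, so that local changes in δ signal non-stationarity.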
State Aggregation-based Model of Asynchronous Multi-Fiber Optical Switching with Shared Wavelength Converters
This paper proposes new analytical models to study optical packet switching architectures
with multi-fiber interfaces and shared wavelength converters. The multi-fiber
extension of the recently proposed Shared-Per-Input-Wavelength (SPIW) scheme is
compared against the multi-fiber Shared-Per-Node (SPN) scheme in terms of cost and
performance for asynchronous traffic. In addition to using Markov chains and
fixed-point iterations to model the mono-fiber case, a novel state aggregation
technique is proposed to evaluate the packet loss in the asynchronous
multi-fiber scenario. The accuracy of the performance models is validated by
comparison with simulations in a wide variety of scenarios with both balanced
and imbalanced input traffic. The proposed analytical models are shown to
accurately capture the actual system behavior in all scenarios tested. The
adoption of multi-fiber interfaces is shown to achieve remarkable savings in
the number of wavelength converters employed and in their range. In addition,
under particular conditions the SPIW solution saves a significant number of
optical gates compared to the SPN solution. Indeed, if properly dimensioned,
SPIW offers potential complexity and cost reductions compared to SPN while
providing similar performance.
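The paper's models are not reproduced in the abstract. As a minimal, hypothetical stand-in for converter contention, a shared pool of c wavelength converters with Poisson arrivals and exponential holding times forms an M/M/c/c birth-death chain, whose stationary blocking probability is the Erlang-B formula; this is the kind of single-chain baseline that a state aggregation technique generalizes.

```python
import math

def erlang_b(a, c):
    """Erlang-B blocking probability for offered load a and c servers,
    via the standard numerically stable recursion."""
    b = 1.0
    for m in range(1, c + 1):
        b = a * b / (m + a * b)
    return b

def mmcc_blocking(lam, mu, c):
    """Blocking probability from the stationary distribution of the
    M/M/c/c birth-death chain: pi_n proportional to (lam/mu)^n / n!."""
    a = lam / mu
    w = [a**n / math.factorial(n) for n in range(c + 1)]
    return w[c] / sum(w)

loss = mmcc_blocking(lam=5.0, mu=1.0, c=8)  # pool of 8 shared converters
ref = erlang_b(5.0, 8)                      # closed-form cross-check
```

The two routes agree exactly, since the recursion is just a stable evaluation of the same stationary distribution; the multi-fiber, shared-converter case of the paper requires the richer aggregated state space.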
Compression and diffusion: a joint approach to detect complexity
The adoption of the Kolmogorov-Sinai (KS) entropy is becoming a popular
research tool among physicists, especially when applied to a dynamical system
fitting the conditions of validity of the Pesin theorem. The study of time
series that are a manifestation of system dynamics whose rules are either
unknown or too complex for a mathematical treatment, is still a challenge since
the KS entropy is not, in general, computable in that case. Here we present an
approach based on the joint use of two procedures, both related to the KS
entropy but compatible with computer implementation through fast and efficient
programs. The former procedure, called Compression Algorithm Sensitive To
Regularity (CASToRe), establishes the amount of order by numerically evaluating
algorithmic compressibility. The latter, called Complex Analysis of Sequences
via Scaling AND Randomness Assessment (CASSANDRA), establishes the degree of
complexity by numerically evaluating the strength of an anomalous effect: the
departure of the diffusion process generated by the observed fluctuations from
ordinary Brownian motion.
The CASSANDRA algorithm shares with CASToRe a connection with the Kolmogorov
complexity. This makes both algorithms especially suitable to study the
transition from dynamics to thermodynamics, and the case of non-stationary time
series as well. The benefit of the joint action of these two methods is proven
by the analysis of artificial sequences with the same main properties as the
real time series to which the joint use of these two methods will be applied in
future research work.
Comment: 27 pages, 9 figures
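CASToRe itself is not described in detail here. A common, simpler proxy for algorithmic compressibility uses a general-purpose compressor (zlib below is an assumption for illustration, not the authors' algorithm): ordered sequences compress strongly, while random ones barely compress at all.

```python
import random
import zlib

def compression_ratio(seq):
    """Compressed size over raw size: near 0 for highly ordered data,
    near (or slightly above) 1 for incompressible data."""
    raw = bytes(seq)
    return len(zlib.compress(raw, 9)) / len(raw)

ordered = [0, 1] * 5_000                                # periodic sequence
random.seed(0)
noisy = [random.randrange(256) for _ in range(10_000)]  # random bytes
```

Tracking such a ratio over windows of a time series gives a crude, computable stand-in for the "amount of order" that CASToRe quantifies.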
Guidelines for physical weed control research: flame weeding, weed harrowing and intra-row cultivation
A prerequisite for good research is the use of appropriate methodology. In order to promote sound research methodology, this paper presents some tentative guidelines for physical weed control research in general, and flame weeding, weed harrowing and intra-row cultivation in particular. Issues include the adjustment and use of mechanical weeders and other equipment, the recording of impact factors that affect weeding performance, methods to assess effectiveness, the layout of treatment plots, and the conceptual models underlying the experimental designs (e.g. factorial comparison, dose response).
First of all, the research aims need to be clearly defined, an appropriate experimental design produced and statistical methods chosen accordingly. Suggestions on how to do this are given. For assessments, quantitative measures would be ideal, but as they require more resources, visual classification may in some cases be more feasible. The timing of assessment affects the results and their interpretation.
When describing the weeds and crops, one should list the crops and the most abundantly present weed species involved, giving their density and growth stages at the time of treatment. The location of the experimental field, soil type, soil moisture and amount of fertilization should be given, as well as weather conditions at the time of treatment.
The researcher should describe the weed control equipment and its adjustment accurately, preferably according to the prevailing practice within the discipline. Parameters to record include, for example, gas pressure, burner properties, burner cover dimensions and LPG consumption in flame weeding; and speed, tine angle, number of passes and driving direction in weed harrowing.
The authors hope this paper will increase comparability among experiments, help less experienced scientists to avoid mistakes and essential omissions, and foster the advance of knowledge on non-chemical weed management.
Commissioning of the MEG II tracker system
The MEG experiment at the Paul Scherrer Institut (PSI) represents the state
of the art in the search for the charged Lepton Flavour Violating (cLFV) decay
\mu^+ \rightarrow e^+ \gamma. With phase 1, MEG set the new world-best
upper limit \mbox{BR}(\mu^+ \rightarrow e^+ \gamma) < 4.2 \times
10^{-13} (90% C.L.). With phase 2, MEG II, the experiment aims at reaching
a sensitivity enhancement of about one order of magnitude compared to the
previous MEG result. The new Cylindrical Drift CHamber (CDCH) is a key detector
for MEG II. CDCH is a low-mass, single-volume detector with high granularity:
9 layers of 192 drift cells, a few mm wide, defined by wires in a stereo
configuration for longitudinal hit localization. The filling gas mixture is
Helium:Isobutane (90:10). The total radiation length amounts to a small
fraction of X_0, minimizing the Multiple Coulomb Scattering (MCS) contribution
and allowing for excellent single-hit resolution and for angular and momentum
resolutions of 6 mrad and 90 keV/c respectively. This
article presents the CDCH commissioning activities at PSI after the wiring
phase at INFN Lecce and the assembly phase at INFN Pisa. The endcaps
preparation, HV tests and conditioning of the chamber are described, aiming at
reaching the final stable working point. The integration into the MEG II
experimental apparatus is described, in view of the first data taking with
cosmic rays and beam during the 2018 and 2019 engineering runs. The
first gas gain results are also shown. A full engineering run with all the
upgraded detectors and the complete DAQ electronics is expected to start in
2020, followed by three years of physics data taking.
Comment: 10 pages, 12 figures, 1 table, proceedings of the INSTR'20 conference, accepted for publication in JINST
Memory beyond memory in heart beating: an efficient way to detect pathological conditions
We study the long-range correlations of heartbeat fluctuations with the
method of diffusion entropy. We show that this method of analysis yields a
scaling parameter that apparently conflicts with the direct evaluation of the
distribution of the times of sojourn in states with a given heartbeat
frequency. The strength of the memory responsible for this discrepancy is
quantified by a second parameter derived from the real data. The distribution
of patients in the plane spanned by these two parameters yields a neat
separation of the healthy subjects from the congestive heart failure subjects.
Comment: submitted to Physical Review Letters, 5 figures
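The abstract does not define the "times of sojourn" operationally. One simple, hypothetical reading measures run lengths on either side of the median, i.e. how long the signal stays in the high- or low-frequency state before switching:

```python
import numpy as np

def sojourn_times(series):
    """Run lengths on one side of the median: the time the signal
    'sojourns' in a given state before crossing to the other side."""
    s = np.sign(series - np.median(series))
    change = np.flatnonzero(np.diff(s) != 0)       # indices of state switches
    edges = np.concatenate(([0], change + 1, [len(s)]))
    return np.diff(edges)                          # lengths of the runs

# e.g. two samples above, three below, one above the median:
times = sojourn_times(np.array([1.0, 1.0, 0.0, 0.0, 0.0, 1.0]))
```

Comparing the tail of this sojourn-time distribution with the diffusion-entropy scaling is the kind of cross-check the abstract alludes to; the exact state definition used by the authors may differ.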
Displacement power spectrum measurement of a macroscopic optomechanical system at thermal equilibrium
The relative motion of the mirrors of a suspended Fabry-Perot cavity is
studied in the frequency range 3-10 Hz. The experimental measurements
presented in this paper have been performed at the Low Frequency Facility, a
high-finesse optical cavity 1 cm long suspended from a mechanical seismic
isolation system identical to the one used in the VIRGO experiment. The
measured relative displacement power spectrum is compatible with a system at
thermal equilibrium within its environment. In the frequency region above 3
Hz, where seismic noise contamination is negligible, the measurement
distribution is stationary and Gaussian, as expected for a system at thermal
equilibrium. Through a simple mechanical model it is shown that, applying the
fluctuation-dissipation theorem, the measured power spectrum is reproduced
below 90 Hz and that noise induced by external sources lies below the
measurement.
Comment: 11 pages, 9 figures, 2 tables, to be submitted
Design, status and perspective of the Mu2e crystal calorimeter
The Mu2e experiment at Fermilab will search for the charged lepton flavor
violating process of neutrino-less coherent conversion of a muon into an
electron in the field of an aluminum nucleus. Mu2e will reach a single-event
sensitivity corresponding to a four-orders-of-magnitude improvement with
respect to the current best limit. The detector system consists of a straw
tube tracker and a crystal calorimeter made of undoped CsI coupled with
Silicon Photomultipliers. The calorimeter was designed to be operable in a
harsh environment, where about 10 krad/year will be delivered in the hottest
region, and to work in the presence of a 1 T magnetic field. The calorimeter's
role is to perform \mu/e separation to suppress cosmic muons mimicking the
signal, while providing a high-level trigger and seeding the track search in
the tracker. In this paper we present the calorimeter design and the latest
R&D results.
Comment: 4 pages, conference proceedings for a presentation held at TIPP'2017. To be published in Springer Proceedings in Physics
Quality Assurance on a custom SiPMs array for the Mu2e experiment
The Mu2e experiment at Fermilab will search for the coherent conversion of
muons into electrons on aluminum atoms. The detector system consists of a
straw tube tracker and a crystal calorimeter. A pre-production batch of 150
Silicon Photomultiplier arrays for the Mu2e calorimeter has been procured. A
detailed quality assurance campaign has been carried out on each SiPM to
determine its operating voltage, gain, dark current and PDE. The measurement
of the mean time to failure for a small random sample of the pre-production
group has also been completed, as well as the determination of the dark
current increase as a function of the ionizing and non-ionizing dose.
Comment: 4 pages, 10 figures, conference proceedings for NSS-MIC 201
Come back Marshall, all is forgiven?: Complexity, evolution, mathematics and Marshallian exceptionalism
Marshall was the great synthesiser of neoclassical economics. Yet with his qualified assumption of self-interest, his emphasis on variation in economic evolution and his cautious attitude to the use of mathematics, Marshall differs fundamentally from other leading neoclassical contemporaries. Metaphors inspire more specific analogies and ontological assumptions, and Marshall used the guiding metaphor of Spencerian evolution. But unfortunately, the further development of a Marshallian evolutionary approach was undermined in part by theoretical problems within Spencer's theory. Yet some things can be salvaged from the Marshallian evolutionary vision. They may even be placed in a more viable Darwinian framework.
