Heat transfer simulation of evacuated tube collectors (ETC): An application to a prototype
Since fossil fuel shortages are predicted for forthcoming generations, the use of renewable energy sources is playing a key role and is strongly encouraged worldwide by national and international regulations. In this scenario, solar collectors for hot water preparation, space heating and cooling are becoming an increasingly interesting alternative, especially in the building sector, given ongoing population growth. The present paper therefore numerically investigates the thermal behaviour of a prototypal evacuated tube by solving the heat transfer differential equations with the Finite Element Method, reproducing the heat transfer process occurring within the real system and helping industry improve the prototype.
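The paper's FEM model of an evacuated tube collector is of course far richer, but the core of the method it relies on, assembling element matrices and imposing boundary temperatures, can be sketched on the simplest possible case. Everything below (1D steady conduction, uniform conductivity, the function name `fem_1d_heat`) is an illustrative assumption, not the authors' model.

```python
import numpy as np

def fem_1d_heat(n_elem=10, length=1.0, k=1.0, t_left=100.0, t_right=20.0):
    """Steady-state 1D heat conduction with linear finite elements.

    Solves -k * T'' = 0 on [0, length] with fixed end temperatures,
    so the exact solution is a straight line between the two ends.
    """
    n_nodes = n_elem + 1
    h = length / n_elem                      # uniform element size
    K = np.zeros((n_nodes, n_nodes))         # global stiffness matrix
    ke = (k / h) * np.array([[1.0, -1.0],    # element stiffness matrix
                             [-1.0, 1.0]])
    for e in range(n_elem):                  # assemble element contributions
        K[e:e + 2, e:e + 2] += ke
    f = np.zeros(n_nodes)                    # no internal heat source
    # Impose Dirichlet boundary conditions by row substitution.
    for node, value in ((0, t_left), (n_nodes - 1, t_right)):
        K[node, :] = 0.0
        K[node, node] = 1.0
        f[node] = value
    return np.linalg.solve(K, f)

temps = fem_1d_heat()
print(temps[0], temps[-1])   # boundary temperatures: 100.0 and 20.0
```

A real collector model would add the vacuum gap, radiative exchange and the working fluid, but the assembly-and-solve structure stays the same.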
Scaling in Non-stationary time series I
Most data processing techniques, applied to biomedical and sociological time
series, are only valid for random fluctuations that are stationary in time.
Unfortunately, these data are often non-stationary, and the use of analysis
techniques resting on the stationarity assumption can produce wrong
information about the scaling, and hence about the complexity of the process
under study. Herein, we
test and compare two techniques for removing the non-stationary influences from
computer generated time series, consisting of the superposition of a slow
signal and a random fluctuation. The former is based on the method of wavelet
decomposition, and the latter is a proposal of this paper, denoted by us as
step detrending technique. We focus our attention on two cases, when the slow
signal is a periodic function mimicking the influence of seasons, and when it
is an aperiodic signal mimicking the influence of a population change (increase
or decrease). For the purpose of computational simplicity the random
fluctuation is taken to be uncorrelated. However, the detrending techniques
here illustrated work also in the case when the random component is correlated.
This expectation is fully confirmed by the sociological applications made in
the companion paper. We also illustrate a new procedure to assess the existence
of a genuine scaling, based on the adoption of diffusion entropy, multiscaling
analysis and the direct assessment of scaling. Using artificial sequences, we
show that the joint use of all these techniques yields the detection of the
real scaling, and that this is independent of the technique used to detrend
the original signal.

Comment: 39 pages, 13 figures
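A minimal sketch of the step detrending idea described above, subtracting the local mean over consecutive windows, applied to a synthetic seasonal trend plus uncorrelated noise. The window length and trend parameters below are arbitrary choices, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(0)

def step_detrend(series, step):
    """Step-detrending sketch: subtract the local mean computed over
    consecutive non-overlapping windows of length `step`.
    (The window length is a free parameter, not the paper's choice.)"""
    out = series.astype(float)
    for start in range(0, len(series), step):
        out[start:start + step] -= out[start:start + step].mean()
    return out

# Uncorrelated noise superposed on a slow seasonal (periodic) trend.
n = 10_000
t = np.arange(n)
trend = 5.0 * np.sin(2 * np.pi * t / 1000)   # slow periodic component
noise = rng.normal(0.0, 1.0, n)              # fast random fluctuation
detrended = step_detrend(trend + noise, step=50)

# The detrended series should track the noise far better than the raw one.
print(np.std((trend + noise) - noise), np.std(detrended - noise))
```

The same loop works unchanged for an aperiodic trend mimicking a population drift, which is the paper's second test case.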
Improving Visual Field Examination of the Macula Using Structural Information
Purpose: To investigate a novel approach for structure-function modeling in glaucoma to improve visual field testing in the macula.
Methods: We acquired data from the macular region in 20 healthy eyes and 31 with central glaucomatous damage. Optical coherence tomography (OCT) scans were used to estimate the local macular ganglion cell density. Perimetry was performed with a fundus-tracking device using a 10-2 grid. OCT scans were matched to the retinal image from the fundus perimeter to accurately map the tested locations onto the structural damage. Binary responses from the subjects to all presented stimuli were used to calculate the structure-function model used to generate prior distributions for a ZEST (Zippy Estimation by Sequential Testing) Bayesian strategy. We used simulations based on structural and functional data acquired from an independent dataset of 20 glaucoma patients to compare the performance of this new strategy, structural macular ZEST (MacS-ZEST), with a standard ZEST.
Results: Compared to the standard ZEST, MacS-ZEST reduced the number of presentations by 13% in reliable simulated subjects and by 14% in those with higher rates (≥20%) of false-positive or false-negative errors. A reduction in mean absolute error was not present for reliable subjects but became increasingly pronounced with unreliable responses (≥10% at a 30% error rate).
Conclusions: Binary responses can be modeled to incorporate detailed structural information from macular OCT into visual field testing, improving overall speed and accuracy in poor responders.
Translational Relevance: Structural information can improve speed and reliability for macular testing in glaucoma practice.
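MacS-ZEST itself depends on the OCT-based structure-function model described above; the sketch below shows only the generic ZEST core, Bayesian updating of a threshold probability density from binary responses. The psychometric function, slope, error rates, stimulus axis and all names here are illustrative assumptions, not the study's values.

```python
import numpy as np

def p_seen(stimulus, threshold, slope=1.0, fp=0.03, fn=0.03):
    """Probability of a 'seen' response. On a dB-like scale, stimuli
    below the threshold value are brighter and more likely to be seen.
    Slope and error rates are illustrative, not the study's values."""
    core = 1.0 / (1.0 + np.exp(-slope * (threshold - stimulus)))
    return fp + (1.0 - fp - fn) * core

def zest_run(true_threshold, prior, domain, n_trials=30, seed=0):
    """Minimal ZEST loop: present the stimulus at the mean of the
    current probability density, then update the density with the
    likelihood of the observed binary response."""
    rng = np.random.default_rng(seed)
    pdf = prior / prior.sum()
    for _ in range(n_trials):
        stim = float(np.sum(domain * pdf))       # current best estimate
        seen = rng.random() < p_seen(stim, true_threshold)
        like = p_seen(stim, domain)              # likelihood over domain
        pdf = pdf * (like if seen else 1.0 - like)
        pdf /= pdf.sum()
    return float(np.sum(domain * pdf))

domain = np.arange(0.0, 40.0, 0.5)               # dB-like stimulus axis
estimate = zest_run(24.0, prior=np.ones_like(domain), domain=domain)
```

In the strategy described above, the flat prior would be replaced by a prior distribution derived from the structural (OCT) measurement at each tested location.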
Facing Non-Stationary Conditions with a New Indicator of Entropy Increase: The Cassandra Algorithm
We address the problem of detecting non-stationary effects in time series (in
particular fractal time series) by means of the Diffusion Entropy Method (DEM).
This means that the experimental sequence under study is explored with a
window of comparatively small size. The DEM makes a wise use of the
statistical information available and, consequently, in spite of the modest
size of the window used, succeeds in revealing local statistical properties
and shows how they change as the window moves along the experimental
sequence. The method is also expected to help predict catastrophic events
before their occurrence.

Comment: FRACTAL 2002 (Spain)
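A toy illustration of the moving-window idea: a short window scanned along the record exposes where the local statistics change. The indicator below is a plain histogram entropy of the values in each window, an assumed stand-in for the diffusion-based indicator the DEM actually uses, and the window and bin counts are arbitrary.

```python
import numpy as np

def local_entropy(series, window, bins=32):
    """Shannon entropy of the value histogram inside consecutive
    windows, computed against bin edges fixed from the whole record.
    A crude local indicator, not the DEM itself, but it shows how a
    small moving window can expose non-stationary behaviour."""
    edges = np.linspace(series.min(), series.max(), bins + 1)
    entropies = []
    for start in range(0, len(series) - window + 1, window):
        chunk = series[start:start + window]
        hist, _ = np.histogram(chunk, bins=edges)
        p = hist[hist > 0] / hist.sum()
        entropies.append(-np.sum(p * np.log(p)))
    return np.array(entropies)

rng = np.random.default_rng(1)
# A sequence whose statistics change halfway through the record.
series = np.concatenate([rng.normal(0, 1, 5000), rng.normal(0, 10, 5000)])
ents = local_entropy(series, window=500)
print(ents[:10].mean(), ents[10:].mean())  # entropy jumps at the change
```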
Activity autocorrelation in financial markets. A comparative study between several models
We study the activity, i.e., the number of transactions per unit time, of
financial markets. Using the diffusion entropy technique we show that the
autocorrelation of the activity is caused by the presence of peaks whose time
distances are distributed following an asymptotic power law which ultimately
recovers the Poissonian behavior. We discuss these results in comparison with
ARCH models, stochastic volatility models and multi-agent models, showing
that ARCH and stochastic volatility models better describe the observed
experimental evidence.

Comment: 15 pages, 4 figures
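The mechanism described, activity peaks separated by power-law distributed times producing a correlated activity, can be illustrated with a synthetic sketch. The Pareto exponent, rates and horizon below are arbitrary choices, not values fitted to market data.

```python
import numpy as np

rng = np.random.default_rng(2)

def activity_series(waiting_times, horizon):
    """Bin event times (cumulative waiting times) into unit-time
    activity counts, i.e. transactions per unit time."""
    times = np.cumsum(waiting_times)
    times = times[times < horizon]
    counts, _ = np.histogram(times, bins=np.arange(horizon + 1))
    return counts

def autocorr(x, lag):
    """Sample autocorrelation at a given lag."""
    x = x - x.mean()
    return np.dot(x[:-lag], x[lag:]) / np.dot(x, x)

horizon = 100_000
# Heavy-tailed waiting times: classical Pareto, exponent 1.5, minimum 0.5
# (numpy's pareto() is Lomax, hence the +1 shift and rescale) ...
heavy = (rng.pareto(1.5, size=400_000) + 1.0) * 0.5
# ... versus memoryless exponential waiting times (Poisson events).
poisson = rng.exponential(1.0, size=400_000)

a_heavy = activity_series(heavy, horizon)
a_poiss = activity_series(poisson, horizon)
print(autocorr(a_heavy, 1), autocorr(a_poiss, 1))
```

The heavy-tailed waiting times yield long quiet stretches punctuated by bursts, hence a clearly positive autocorrelation of the activity, while the Poisson counts are uncorrelated.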
L\'{e}vy scaling: the Diffusion Entropy Analysis applied to DNA sequences
We address the problem of the statistical analysis of a time series generated
by complex dynamics with a new method: the Diffusion Entropy Analysis (DEA)
(Fractals, {\bf 9}, 193 (2001)). This method is based on the evaluation of the
Shannon entropy of the diffusion process generated by the time series imagined
as a physical source of fluctuations, rather than on the measurement of the
variance of this diffusion process, as done with the traditional methods. We
compare the DEA to the traditional methods of scaling detection and we prove
that the DEA is the only method that always yields the correct scaling value,
if the scaling condition applies. Furthermore, DEA detects the real scaling of
a time series without requiring any form of detrending. We show that the
joint use of the DEA and the variance method makes it possible to assess
whether a time series is
characterized by L\'{e}vy or Gauss statistics. We apply the DEA to the study of
DNA sequences, and we prove that their large-time scales are characterized by
L\'{e}vy statistics, regardless of whether they are coding or non-coding
sequences. We show that the DEA is a reliable technique and, at the same time,
we use it to confirm the validity of the dynamic approach to the DNA sequences,
proposed in earlier work.

Comment: 24 pages, 9 figures
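A sketch of the DEA procedure as described: build the diffusion process from overlapping partial sums of the series, estimate the Shannon entropy of its distribution at each diffusion time, and read the scaling from the slope against ln t. The bin count and window lengths below are illustrative choices, not the paper's.

```python
import numpy as np

def dea_scaling(series, t_values, bins=64):
    """Diffusion Entropy Analysis sketch: for each window length t,
    form all overlapping partial sums x(t), estimate the Shannon
    entropy S(t) of their distribution, and fit S(t) = A + delta*ln(t).
    For uncorrelated Gaussian noise delta should be close to 0.5."""
    csum = np.concatenate([[0.0], np.cumsum(series)])
    entropies = []
    for t in t_values:
        x = csum[t:] - csum[:-t]             # overlapping-window sums
        hist, edges = np.histogram(x, bins=bins, density=True)
        dx = edges[1] - edges[0]
        p = hist[hist > 0] * dx              # bin probabilities
        # Differential entropy estimate of the diffusion PDF p(x, t).
        entropies.append(-np.sum(p * np.log(p / dx)))
    delta, _ = np.polyfit(np.log(t_values), entropies, 1)
    return delta

rng = np.random.default_rng(3)
noise = rng.normal(0.0, 1.0, 200_000)
delta = dea_scaling(noise, t_values=[10, 20, 40, 80, 160])
print(delta)   # near 0.5 for uncorrelated Gaussian noise
```

Note that, as the abstract stresses, this reads the scaling from the entropy of the whole diffusion PDF rather than from its variance, which is what lets DEA distinguish Lévy from Gauss statistics when combined with the variance method.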
Memory beyond memory in heart beating: an efficient way to detect pathological conditions
We study the long-range correlations of heartbeat fluctuations with the
method of diffusion entropy. We show that this method of analysis yields a
scaling parameter that apparently conflicts with the direct evaluation of the
distribution of the times of sojourn in states with a given heartbeat
frequency. The strength of the memory responsible for this discrepancy is
measured by a second parameter, which is derived from the real data. The
distribution of patients in the plane defined by these two parameters yields
a neat separation of the healthy subjects from the congestive heart failure
subjects.

Comment: submitted to Physical Review Letters, 5 figures
Survey of hyperfine structure measurements in alkali atoms
The spectroscopic hyperfine constants for all the alkali atoms are reported.
For atoms from lithium to cesium, only the long-lived atomic isotopes are
examined. For francium, the measured data for nuclear ground states of all
available isotopes are listed. All results obtained since the beginning of
laser investigations are presented, while for previous works the data of
Arimondo {\it et al.}, Rev. Mod. Phys. {\bf 49}, 31 (1977) are recalled.
Global analyses based on the scaling laws and on the hyperfine anomalies are
performed.

Comment: 41 pages, 5 figures
Compression and diffusion: a joint approach to detect complexity
The adoption of the Kolmogorov-Sinai (KS) entropy is becoming a popular
research tool among physicists, especially when applied to a dynamical system
fitting the conditions of validity of the Pesin theorem. The study of time
series that are a manifestation of system dynamics whose rules are either
unknown or too complex for a mathematical treatment, is still a challenge since
the KS entropy is not computable, in general, in that case. Here we present a
plan of action based on the joint action of two procedures, both related to the
KS entropy, but compatible with computer implementation through fast and
efficient programs. The former procedure, called Compression Algorithm
Sensitive To Regularity (CASToRe), establishes the amount of order by the
numerical evaluation of algorithmic compressibility. The latter, called Complex
Analysis of Sequences via Scaling AND Randomness Assessment (CASSANDRA),
establishes the complexity degree through the numerical evaluation of the
strength of an anomalous effect, namely the departure of the diffusion
process generated by the observed fluctuations from ordinary Brownian motion.
The CASSANDRA algorithm shares with CASToRe a connection with the Kolmogorov
complexity. This makes both algorithms especially suitable to study the
transition from dynamics to thermodynamics, and the case of non-stationary time
series as well. The benefit of the joint action of these two methods is proven
by the analysis of artificial sequences with the same main properties as the
real time series to which the joint use of these two methods will be applied in
future research work.

Comment: 27 pages, 9 figures
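The CASToRe algorithm itself is not reproduced here; the sketch below uses off-the-shelf zlib compression as an assumed stand-in to make the same qualitative point, that ordered sequences compress far better than random ones, which is the basis of the compressibility-based order measure.

```python
import zlib
import numpy as np

def compressibility(symbols):
    """Compression ratio (compressed size / original size) as a crude
    stand-in for the CASToRe idea: ordered sequences compress well,
    random ones do not. zlib replaces the authors' algorithm here."""
    raw = bytes(symbols)                       # symbols must be 0..255
    return len(zlib.compress(raw, level=9)) / len(raw)

rng = np.random.default_rng(4)
periodic = list(range(10)) * 10_000            # highly ordered sequence
random_seq = rng.integers(0, 256, 100_000)     # incompressible noise
print(compressibility(periodic), compressibility(random_seq.tolist()))
```

The periodic sequence compresses to a tiny fraction of its length while the random one does not shrink at all, mirroring the order/randomness distinction the two procedures are designed to quantify jointly.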