Parallel Implementation of Lossy Data Compression for Temporal Data Sets
Many scientific data sets contain a temporal dimension: they store information
at the same spatial location but at different time stamps.
Some of the biggest temporal datasets are produced by parallel computing
applications such as simulations of climate change and fluid dynamics. Temporal
datasets can be very large and take a long time to transfer among storage
locations. With data compression, files can be transferred faster and occupy
less storage space. NUMARCK is a lossy data compression algorithm
for temporal data sets that learns the emerging distributions of element-wise
change ratios along the temporal dimension and encodes them concisely into an
index table. This paper presents a parallel implementation of
NUMARCK. Evaluated with six data sets obtained from climate and astrophysics
simulations, parallel NUMARCK achieved scalable speedups of up to 8788 when
running 12800 MPI processes on a parallel computer. We also compare the
compression ratios against two lossy data compression algorithms, ISABELA and
ZFP. The results show that NUMARCK achieved higher compression ratio than
ISABELA and ZFP.Comment: 10 pages, HiPC 201
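The core NUMARCK idea described above can be sketched as follows. This is a simplified illustration, not the authors' implementation: equal-width binning stands in for NUMARCK's learned clustering of change ratios, and all names and parameters are hypothetical.

```python
import numpy as np

def numarck_like_encode(prev, curr, n_bins=256):
    """Sketch of NUMARCK-style encoding: quantize element-wise
    change ratios between two time steps into an index table."""
    # Element-wise change ratio along the temporal dimension.
    ratios = (curr - prev) / np.where(prev == 0, 1, prev)
    # Stand-in for NUMARCK's distribution learning: equal-width bins.
    edges = np.linspace(ratios.min(), ratios.max(), n_bins + 1)
    idx = np.clip(np.digitize(ratios, edges) - 1, 0, n_bins - 1)
    centers = (edges[:-1] + edges[1:]) / 2
    # One byte per element replaces a full floating-point value.
    return idx.astype(np.uint8), centers

def numarck_like_decode(prev, idx, centers):
    """Lossy reconstruction from the index table and bin centers."""
    return prev * (1 + centers[idx])

rng = np.random.default_rng(0)
prev = rng.uniform(1, 2, 1000)
curr = prev * (1 + rng.normal(0, 0.01, 1000))
idx, centers = numarck_like_encode(prev, curr)
approx = numarck_like_decode(prev, idx, centers)
```

Because consecutive time steps of a simulation change slowly, the change-ratio distribution is narrow, and a small index table captures it with low relative error.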
An Information-Theoretic Test for Dependence with an Application to the Temporal Structure of Stock Returns
Information theory provides ideas for conceptualising information and
measuring relationships between objects. It has found wide application in the
sciences, but economics and finance have made surprisingly little use of it. We
show that time series data can usefully be studied as information -- by noting
the relationship between statistical redundancy and dependence, we are able to
use the results of information theory to construct a test for joint dependence
of random variables. The test is in the same spirit as those developed by
Ryabko and Astola (2005, 2006a,b), but differs in that we add extra
randomness to the original stochastic process. It uses data compression to
estimate the entropy rate of a stochastic process, which allows it to measure
dependence among sets of random variables, as opposed to the existing
econometric literature that uses entropy and finds itself restricted to
pairwise tests of dependence. We show how serial dependence may be detected in
S&P500 and PSI20 stock returns over different sample periods and frequencies.
We apply the test to synthetic data to judge its ability to recover known
temporal dependence structures.
Comment: 22 pages, 7 figures
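The principle behind the test — a compressor's output length per symbol upper-bounds the entropy rate, so statistical dependence (redundancy) shows up as extra compressibility — can be sketched as follows. This illustrates the underlying idea only, not the authors' test statistic:

```python
import zlib
import numpy as np

def compressed_bits_per_symbol(symbols):
    """Entropy-rate estimate via compression: compressed bits per
    symbol upper-bound the entropy rate, so dependence (redundancy)
    lowers the estimate."""
    data = np.asarray(symbols, dtype=np.uint8).tobytes()
    return 8 * len(zlib.compress(data, 9)) / len(data)

rng = np.random.default_rng(1)
n = 20000

# i.i.d. binary sequence: no temporal dependence.
iid = rng.integers(0, 2, n)

# Two-state Markov chain that switches state with probability 0.05:
# strong serial dependence, hence long runs and more redundancy.
flips = rng.random(n) < 0.05
dep = np.cumsum(flips) % 2
```

The serially dependent sequence compresses to fewer bits per symbol than the independent one, which is exactly the signal a compression-based dependence test looks for.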
Active Virtual Network Management Prediction: Complexity as a Framework for Prediction, Optimization, and Assurance
Research into active networking has provided the incentive to revisit what
have traditionally been classified as distinct properties and characteristics
of information transfer, such as protocol versus service; at a more fundamental
level this paper considers the blending of computation and communication by
means of complexity. The specific service examined in this paper is network
self-prediction enabled by Active Virtual Network Management Prediction.
Computation/communication is analyzed via Kolmogorov Complexity. The result is
a mechanism to understand and improve the performance of active networking and
Active Virtual Network Management Prediction in particular. The Active Virtual
Network Management Prediction mechanism allows information, in various states
of algorithmic and static form, to be transported in the service of prediction
for network management. The results are generally applicable to algorithmic
transmission of information. Kolmogorov Complexity is used and experimentally
validated as a theory describing the relationship among algorithmic
compression, complexity, and prediction accuracy within an active network.
Finally, the paper concludes with a complexity-based framework for Information
Assurance that attempts to take a holistic view of vulnerability analysis.
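Kolmogorov Complexity itself is uncomputable; in practice it is estimated from above by compressed length, which is the compression/complexity link the paper validates experimentally. A minimal sketch of that proxy (an illustration, not the paper's measurement setup):

```python
import os
import zlib

def complexity_upper_bound(data: bytes) -> int:
    """Kolmogorov complexity is uncomputable, but the length of a
    compressed description gives a practical upper bound on it."""
    return len(zlib.compress(data, 9))

# Highly algorithmic data: a short program ("repeat") generates it,
# so its estimated complexity is far below its raw length.
periodic = b"0123456789" * 100

# Incompressible data: no description much shorter than itself.
noise = os.urandom(1000)
```

Data with low estimated complexity is, in this framework, exactly the data whose future behavior a model inside the network can predict well.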
Quantifying Viscoelastic Properties of Nylon-6,6 Actuators
Orthostatic Hypotension (OH) is a prevalent condition affecting 52.1% of stroke patients, characterized by a drop in arterial blood pressure upon standing caused by the pooling of blood in the abdomen and legs; it results in debilitating symptoms of nausea, lightheadedness, and dizziness. Current management techniques are limited and not very effective, so I propose an active compression abdominal band that contracts when needed. This device should have a minimal design, so nylon-6,6 actuators, powerful artificial muscles created from fishing line and conductive thread, were chosen as the compressive element. Because previous research focused on the strength and force-excursion characteristics of these actuators, this study determines their viability for this application by conducting stress-relaxation and creep tests on single actuators (sample sizes of 10) and on a forty-actuator band. Stress-relaxation results indicate that the actuators can maintain the tension levels required for effective compression 18 times as long as necessary. Creep testing was inconclusive due to oscillations in the data caused by the low processing power of the Instron machine used to conduct the tests. Although more tests are needed to resolve various limitations of this study, I conclude that I can move forward to create a prototype of the abdominal band.
Biomedical Engineering
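The stress-relaxation behaviour being measured can be sketched with a standard-linear-solid viscoelastic model, where stress decays exponentially toward an equilibrium value. All parameter values below are hypothetical illustrations, not the study's measured data:

```python
import numpy as np

def stress_relaxation(t, sigma_inf, delta_sigma, tau):
    """Standard linear solid under constant strain: stress decays
    exponentially from sigma_inf + delta_sigma toward the
    equilibrium stress sigma_inf with time constant tau."""
    return sigma_inf + delta_sigma * np.exp(-t / tau)

t = np.linspace(0, 600, 601)                  # seconds
sigma = stress_relaxation(t, 2.0, 1.0, 120)   # MPa, hypothetical values

# How long the actuator stays above a required compression level
# (threshold is likewise a hypothetical figure):
threshold = 2.2  # MPa
hold = t[sigma >= threshold].max()
```

Fitting this model to measured relaxation curves yields the time constant, from which one can judge whether an actuator band holds effective tension long enough, as the study's 18x margin claim does.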
A mission synthesis algorithm for fatigue damage analysis
This paper presents a signal-processing-based algorithm, the Mildly Nonstationary Mission
Synthesis (MNMS), which produces a short mission signal from long records of experimental data. The
algorithm uses the Discrete Fourier Transform, the Orthogonal Wavelet Transform, and bump reinsertion
procedures. To assess the algorithm's effectiveness, a fatigue damage case study was
performed for a vehicle lower suspension arm using signals containing tensile and compressive
preloading. The mission synthesis results were compared to the original road data in terms of both the
global signal statistics and the fatigue damage variation as a function of compression ratio. Three
bump reinsertion methods were used and evaluated. The methods differed in the manner in which
bumps (shock events) from different wavelet groups (frequency bands) were synchronised during the
reinsertion process. One method, based on time synchronised section reinsertion, produced the best
results in terms of mission signal kurtosis, crest factor, root-mean-square level and power spectral
density. For improved algorithm performance, bump selection was identified as the main control
parameter requiring optimisation.
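The bump-selection idea at the heart of mission synthesis — isolate the damaging shock events and keep only windows around them — can be sketched in a drastically simplified, numpy-only form. The threshold and window size are arbitrary illustrations, not the MNMS wavelet-based procedure:

```python
import numpy as np

def synthesize_mission(signal, k=3.0, half_window=50):
    """Crude stand-in for bump selection: keep windows around
    samples exceeding k times the RMS level (the shock events),
    and concatenate them into a shortened mission signal."""
    rms = np.sqrt(np.mean(signal ** 2))
    peaks = np.flatnonzero(np.abs(signal) > k * rms)
    keep = np.zeros(len(signal), dtype=bool)
    for p in peaks:
        keep[max(0, p - half_window):p + half_window + 1] = True
    return signal[keep]

rng = np.random.default_rng(2)
road = rng.normal(0, 1, 50000)   # synthetic "road data"
road[10000] = 12.0               # injected shock events ("bumps")
road[30000] = -10.0
mission = synthesize_mission(road)
ratio = len(road) / len(mission)  # compression ratio
```

Because fatigue damage is dominated by the high-amplitude events, the shortened signal can preserve most of the damage content at a large compression ratio, which is the trade-off the paper evaluates.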