A study of data coding technology developments in the 1980-1985 time frame, volume 2
The source parameters of digitized analog data are discussed. Different data compression schemes are outlined, and analyses of their implementation are presented. Finally, bandwidth compression techniques for video signals are given.
On-the-fly memory compression for multibody algorithms.
Memory and bandwidth demands challenge developers of particle-based codes that have to scale on new architectures: the growth of concurrency outpaces improvements in memory access facilities, the memory per core tends to stagnate, and communication networks cannot increase bandwidth arbitrarily. We propose to analyse each particle of such a code to determine whether a hierarchical data representation storing data with reduced precision caps the memory demands without exceeding given error bounds. For admissible candidates, we perform this compression and thus reduce the pressure on the memory subsystem, lower the total memory footprint, and reduce the data to be exchanged via MPI. Notably, our analysis and transformation change the data compression dynamically, i.e. the choice of data format follows the solution characteristics, and they do not require us to alter the core simulation code.
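The per-particle precision analysis described above can be illustrated with a minimal sketch. This is an assumption-laden toy, not the paper's method: the function name and the flat-array interface are hypothetical, and the real scheme works on hierarchical data representations rather than a whole-array dtype choice.

```python
import numpy as np

def compress_particles(values, rel_err_bound=1e-6):
    """Hypothetical sketch: pick the lowest-precision float format whose
    round-trip error stays within rel_err_bound, else keep full precision."""
    for dtype in (np.float16, np.float32):
        lossy = values.astype(dtype).astype(np.float64)
        # Relative round-trip error per particle (guard against division by 0)
        err = np.max(np.abs(lossy - values) / np.maximum(np.abs(values), 1e-300))
        if err <= rel_err_bound:
            return values.astype(dtype)  # admissible: store with fewer bits
    return values  # no admissible reduced format: keep float64
```

The same admissibility test, applied per particle before an MPI exchange, is what lets the data format follow the solution characteristics dynamically.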
Status and Future Perspectives for Lattice Gauge Theory Calculations to the Exascale and Beyond
In this and a set of companion whitepapers, the USQCD Collaboration lays out a program of science and computing for lattice gauge theory. These whitepapers describe how calculations using lattice QCD (and other gauge theories) can aid the interpretation of ongoing and upcoming experiments in particle and nuclear physics, as well as inspire new ones.
Comment: 44 pages. 1 of USQCD whitepapers
Roadmap on optical security
Postprint (author's final draft)
Compressed Genotyping
Significant volumes of knowledge have been accumulated in recent years linking subtle genetic variations to a wide variety of medical disorders, from Cystic Fibrosis to mental retardation. Nevertheless, there are still great challenges in applying this knowledge routinely in the clinic, largely due to the relatively tedious and expensive process of DNA sequencing. Since the genetic polymorphisms that underlie these disorders are relatively rare in the human population, the presence or absence of a disease-linked polymorphism can be thought of as a sparse signal. Using methods and ideas from compressed sensing and group testing, we have developed a cost-effective genotyping protocol. In particular, we have adapted our scheme to a recently developed class of high-throughput DNA sequencing technologies, and assembled a mathematical framework that has some important distinctions from 'traditional' compressed sensing ideas in order to address different biological and technical constraints.
Comment: Submitted to IEEE Transactions on Information Theory - Special Issue on Molecular Biology and Neuroscience
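The group-testing view of rare carriers as a sparse signal can be illustrated with a basic nonadaptive pooling design and COMP-style elimination decoding. This is a generic textbook sketch, not the authors' protocol: the pool design, parameters, and decoder below are simplified stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 100, 30            # individuals, pooled tests
carriers = {3, 41}        # rare disease-linked carriers (the sparse signal)

# Random pooling design: each test includes each individual with prob. 0.1
A = rng.random((m, n)) < 0.1
x = np.zeros(n, dtype=bool)
x[list(carriers)] = True
y = A @ x > 0             # a pool tests positive iff it contains a carrier

# COMP decoding: anyone appearing in a negative pool cannot be a carrier
candidates = set(range(n))
for t in range(m):
    if not y[t]:
        candidates -= set(np.flatnonzero(A[t]))
```

By construction the true carriers always survive the elimination step, so `candidates` is a superset of `carriers`; with enough pools the false positives are whittled away, which is what makes far fewer than n tests suffice for a sparse signal.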
Measuring the Effects of Artificial Viscosity in SPH Simulations of Rotating Fluid Flows
A commonly cited drawback of SPH is the introduction of spurious shear viscosity by the artificial viscosity term in situations involving rotation. Existing approaches for quantifying its effect include approximate analytic formulae and disc-averaged behaviour in specific ring-spreading simulations, based on the kinematic effects produced by the artificial viscosity. These methods have disadvantages, in that they typically are applicable to a very small range of physical scenarios, have a large number of simplifying assumptions, and often are tied to specific SPH formulations which do not include corrective (e.g., Balsara) or time-dependent viscosity terms. In this study we have developed a simple, generally applicable and practical technique for evaluating the local effect of artificial viscosity directly from the creation of specific entropy for each SPH particle. This local approach is simple and quick to implement, and it allows a detailed characterization of viscous effects as a function of position. Several advantages of this method are discussed, including its ease of evaluation, its greater accuracy and its broad applicability. In order to compare this new method with existing ones, simple disc flow examples are used. Even in these basic cases, the very roughly approximate nature of the previous methods is shown. Our local method provides a detailed description of the effects of the artificial viscosity throughout the disc, even for extended examples which implement Balsara corrections. As a further use of this approach, explicit dependencies of the effective viscosity in terms of SPH and flow parameters are estimated from the example cases. In an appendix, a method for the initial placement of SPH particles is discussed which is very effective in reducing numerical fluctuations.
Comment: 15 pages, 9 figures, resubmitted to MNRAS
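The per-particle entropy-creation measurement can be sketched for an ideal-gas equation of state, where the entropic function A = P/rho**gamma changes only through dissipation. The function name and interface below are hypothetical; only the ideal-gas relation dA/dt = (gamma - 1) / rho**(gamma - 1) * (du/dt)_visc is standard.

```python
def entropy_creation_rate(du_dt_visc, rho, gamma=5.0 / 3.0):
    """Rate of change of the entropic function A = P / rho**gamma caused by
    artificial-viscosity heating (du/dt)_visc, for an ideal gas with
    adiabatic index gamma (hypothetical sketch)."""
    return (gamma - 1.0) / rho ** (gamma - 1.0) * du_dt_visc
```

Accumulating this rate per particle over a run maps where the artificial viscosity deposits entropy, which is the local diagnostic the abstract describes.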