Application of Information Theory in Nuclear Liquid Gas Phase Transition
Information entropy and Zipf's law from information theory have been used to
study the disassembly of nuclei in the framework of the isospin-dependent
lattice gas model and a molecular dynamics model. We found that the
information entropy in event space is maximal at the phase transition point
and that the cluster mass is exactly inversely proportional to its rank,
i.e. Zipf's law appears. Both novel criteria are useful in the experimental
and theoretical search for the nuclear liquid-gas phase transition.
Comment: 5 pages, 5 figures
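The Zipf's-law criterion can be illustrated with a short numerical check (a sketch with made-up cluster masses, not data from the paper): rank the fragment masses of an event from largest to smallest and verify that mass falls off as one over rank.

```python
import numpy as np

# Illustrative fragment masses from one disassembly event, sorted largest
# first (made-up numbers chosen to be roughly Zipf-distributed).
masses = np.array([60.0, 31.0, 19.0, 16.0, 12.0, 10.0])
ranks = np.arange(1, len(masses) + 1)

# Fit log(mass) = intercept + slope * log(rank); Zipf's law corresponds
# to an exponent (-slope) close to 1 at the phase transition point.
slope, intercept = np.polyfit(np.log(ranks), np.log(masses), 1)
exponent = -slope
print(f"Zipf exponent: {exponent:.2f}")  # close to 1 for these masses
```

In practice the exponent would be extracted event by event and tracked across the control parameter; Zipf-like behavior (exponent near 1) signals the transition point.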
Angoricity and compactivity describe the jamming transition in soft particulate matter
The application of concepts from equilibrium statistical mechanics to
out-of-equilibrium systems has a long history of describing diverse systems
ranging from glasses to granular materials. For dissipative jammed systems --
particulate grains or droplets -- a key concept is to replace the energy
ensemble describing conservative systems by the volume-stress ensemble. Here,
we test the applicability of the volume-stress ensemble to describe the jamming
transition by comparing the jammed configurations obtained by dynamics with
those averaged over the ensemble as a probe of ergodicity. Agreement between
both methods suggests the idea of "thermalization" at a given angoricity and
compactivity. We elucidate the thermodynamic order of the jamming transition by
showing the absence of critical fluctuations in static observables like
pressure and volume. The approach allows us to calculate observables such as
the entropy, volume, pressure, coordination number, and distribution of forces
to characterize the scaling laws near the jamming transition from a
statistical mechanics viewpoint.
Comment: 27 pages, 13 figures
Entropy-scaling search of massive biological data
Many datasets exhibit a well-defined structure that can be exploited to
design faster search tools, but it is not always clear when such acceleration
is possible. Here, we introduce a framework for similarity search based on
characterizing a dataset's entropy and fractal dimension. We prove that
searching scales in time with metric entropy (number of covering hyperspheres),
if the fractal dimension of the dataset is low, and scales in space with the
sum of metric entropy and information-theoretic entropy (randomness of the
data). Using these ideas, we present accelerated versions of standard tools,
with no loss in specificity and little loss in sensitivity, for use in three
domains---high-throughput drug screening (Ammolite, 150x speedup), metagenomics
(MICA, 3.5x speedup of DIAMOND [3,700x BLASTX]), and protein structure search
(esFragBag, 10x speedup of FragBag). Our framework can be used to achieve
"compressive omics," and the general theory can be readily applied to data
science problems outside of biology.
Comment: Including supplement: 41 pages, 6 figures, 4 tables, 1 bo
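The metric-entropy idea behind the time bound can be sketched in a few lines (a toy illustration with assumed names, not the actual Ammolite, MICA, or esFragBag code): cover the dataset with hyperspheres of radius rc, then answer a radius-r similarity query by scanning only the clusters whose centers lie within r + rc of the query. The triangle inequality guarantees no hits are missed, and the coarse scan touches only as many centers as the covering number.

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(size=(1000, 8))  # toy dataset: 1000 points in 8 dimensions

def build_cover(points, rc):
    """Greedy covering: each point joins the first center within rc,
    or becomes a new center. Returns centers and member index lists."""
    centers, members = [], []
    for i, p in enumerate(points):
        for center, idx in zip(centers, members):
            if np.linalg.norm(p - center) <= rc:
                idx.append(i)
                break
        else:
            centers.append(p)
            members.append([i])
    return centers, members

def coarse_fine_search(points, centers, members, rc, query, r):
    """Scan a cluster only if its center lies within r + rc of the query;
    by the triangle inequality no true hit can be in a skipped cluster."""
    hits = []
    for center, idx in zip(centers, members):
        if np.linalg.norm(query - center) <= r + rc:
            hits.extend(i for i in idx
                        if np.linalg.norm(points[i] - query) <= r)
    return set(hits)

rc = 2.0
centers, members = build_cover(data, rc)
query, r = data[0], 1.0
fast = coarse_fine_search(data, centers, members, rc, query, r)
brute = {i for i in range(len(data))
         if np.linalg.norm(data[i] - query) <= r}
assert fast == brute  # identical answers; clusters beyond r + rc are skipped
```

When the fractal dimension of the data is low, the number of centers (the covering number) grows much more slowly than the dataset, which is the source of the claimed acceleration with no loss in specificity.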
Edwards thermodynamics of the jamming transition for frictionless packings: ergodicity test and role of angoricity and compactivity
This paper illustrates how the tools of equilibrium statistical mechanics can
help to explain a far-from-equilibrium problem: the jamming transition in
frictionless granular materials. Edwards' ideas consist of proposing a
statistical ensemble of volume and stress fluctuations through the
thermodynamic notion of entropy, compactivity, X, and angoricity, A (two
temperature-like variables). We find that Edwards thermodynamics is able to
describe the jamming transition (J-point). Using the ensemble formalism we
elucidate the following: (i) We test the combined volume-stress ensemble by
comparing the statistical properties of jammed configurations obtained by
dynamics with those averaged over the ensemble of minima in the potential
energy landscape as a test of ergodicity. Agreement between both methods
supports the idea of "thermalization" at a given angoricity and compactivity.
(ii) A microcanonical ensemble analysis supports the idea of maximum entropy
principle for grains. (iii) The intensive variables describe the approach to
jamming through a series of scaling relations as A → 0+ and X → 0-. Due
to the force-volume coupling, the jamming transition can be probed
thermodynamically by a "jamming temperature" T_J composed of contributions from
A and X. (iv) The thermodynamic framework reveals the order of the jamming
phase transition by showing the absence of critical fluctuations at jamming in
observables like pressure and volume. (v) Finally, we elaborate on a comparison
with relevant studies showing a breakdown of equiprobability of microstates.
Comment: 22 pages, 24 figures
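The temperature-like role of angoricity can be illustrated with a minimal sketch (an illustrative assumption, not the paper's measurement protocol): if grain-level stresses Gamma are Boltzmann-distributed in the stress ensemble, p(Gamma) proportional to exp(-Gamma/A), then A is recovered from the mean stress per grain, just as temperature is read off the mean energy in the canonical ensemble.

```python
import numpy as np

# Assumed angoricity for this toy stress ensemble (not a value from the paper).
A_true = 0.4

# Sample grain-level stresses from the Boltzmann-like distribution
# p(Gamma) ∝ exp(-Gamma / A): an exponential with scale A.
rng = np.random.default_rng(1)
stresses = rng.exponential(scale=A_true, size=100_000)

# For an exponential distribution <Gamma> = A, so the temperature-like
# variable A is recovered directly from the mean stress.
A_measured = stresses.mean()
print(f"measured angoricity: {A_measured:.3f}")
```

The ergodicity test in the paper amounts to checking that such ensemble statistics agree with those of configurations generated dynamically.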
Multi-Qubit Systems: Highly Entangled States and Entanglement Distribution
A comparison is made of various searching procedures, based upon different
entanglement measures or entanglement indicators, for highly entangled
multi-qubit states. In particular, our present results are compared with those
recently reported by Brown et al. [J. Phys. A: Math. Gen. 38 (2005) 1119]. The
statistical distribution of entanglement values for the aforementioned
multi-qubit systems is also explored.
Comment: 24 pages, 3 figures
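One standard entanglement indicator used in such searches can be sketched as follows (the function name and example states are illustrative, not from the paper): the von Neumann entropy of a single qubit's reduced density matrix, which equals 1 bit for a maximally entangled cut and 0 for a product state.

```python
import numpy as np

def single_qubit_entropy(state, n_qubits, qubit=0):
    """Von Neumann entropy (base-2) of one qubit's reduced density matrix."""
    psi = np.asarray(state).reshape([2] * n_qubits)
    psi = np.moveaxis(psi, qubit, 0).reshape(2, -1)
    rho = psi @ psi.conj().T              # partial trace over the other qubits
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]          # drop numerical zeros
    return float(-(evals * np.log2(evals)).sum())

# (|000> + |111>)/sqrt(2): a GHZ state, maximally entangled across any cut.
ghz = np.zeros(8)
ghz[0] = ghz[7] = 1 / np.sqrt(2)

# |000>: a product state, no entanglement.
product = np.zeros(8)
product[0] = 1.0

print(single_qubit_entropy(ghz, 3))      # ≈ 1.0 bit
print(single_qubit_entropy(product, 3))  # ≈ 0
```

A search for highly entangled multi-qubit states then maximizes such a measure (or an average of it over cuts) over the state space.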
Universality of Entanglement and Quantum Computation Complexity
We study the universality of scaling of entanglement in Shor's factoring
algorithm and in adiabatic quantum algorithms across a quantum phase transition
for both the NP-complete Exact Cover problem and Grover's problem. The
analytic result for Shor's algorithm shows linear scaling of the entropy with
the number of qubits, thereby hindering the possibility of an efficient
classical simulation protocol. A similar result is obtained numerically for
the quantum adiabatic evolution of the Exact Cover algorithm, which also
exhibits the universality of the quantum phase transition near which the
system evolves. On the other hand, entanglement in Grover's adiabatic
algorithm remains
a bounded quantity even at the critical point. A classification of scaling of
entanglement appears as a natural grading of the computational complexity of
simulating quantum phase transitions.
Comment: 30 pages, 17 figures, accepted for publication in PR
Multi-Scale CLEAN deconvolution of radio synthesis images
Radio synthesis imaging is dependent upon deconvolution algorithms to
counteract the sparse sampling of the Fourier plane. These deconvolution
algorithms find an estimate of the true sky brightness from the necessarily
incomplete sampled visibility data. The most widely used radio synthesis
deconvolution method is the CLEAN algorithm of Hogbom. This algorithm works
extremely well for collections of point sources and surprisingly well for
extended objects. However, the performance for extended objects can be improved
by adopting a multi-scale approach. We describe and demonstrate a conceptually
simple and algorithmically straightforward extension to CLEAN that models the
sky brightness by the summation of components of emission having different size
scales. While previous multiscale algorithms work sequentially on decreasing
scale sizes, our algorithm works simultaneously on a range of specified scales.
Applications to both real and simulated data sets are given.
Comment: Submitted to IEEE Special Issue on Signal Processing
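The basic CLEAN iteration that the multi-scale method extends can be sketched in one dimension (a toy Hogbom-style loop with assumed parameter values, not the authors' multi-scale implementation): repeatedly locate the brightest residual pixel, add a gain-scaled component to the model, and subtract the correspondingly shifted and scaled beam from the residual.

```python
import numpy as np

def hogbom_clean(dirty, beam, gain=0.1, n_iter=200, threshold=1e-3):
    """Toy 1-D Hogbom CLEAN: greedily peel point components off the
    residual until the brightest residual falls below the threshold."""
    residual = dirty.copy()
    model = np.zeros_like(dirty)
    half = len(beam) // 2
    for _ in range(n_iter):
        peak = np.argmax(np.abs(residual))
        if abs(residual[peak]) < threshold:
            break
        flux = gain * residual[peak]
        model[peak] += flux
        # subtract the gain-scaled beam centred on the peak (edge-safe slice)
        lo, hi = max(0, peak - half), min(len(dirty), peak + half + 1)
        residual[lo:hi] -= flux * beam[half - (peak - lo): half + (hi - peak)]
    return model, residual

# A point source at pixel 32 observed through a Gaussian beam.
x = np.arange(-8, 9)
beam = np.exp(-0.5 * (x / 2.0) ** 2)
sky = np.zeros(64)
sky[32] = 1.0
dirty = np.convolve(sky, beam, mode="same")

model, residual = hogbom_clean(dirty, beam)
```

The multi-scale extension described above replaces the single point component per iteration with components of several specified sizes, fitted simultaneously rather than sequentially from large to small scales.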