Selecting fast folding proteins by their rate of convergence
We propose a general method for predicting potentially good folders from a
given set of amino acid sequences. Our approach is based on the calculation
of the rate of convergence of each amino acid chain towards the native
structure using only the very initial parts of the dynamical trajectories. It
does not require any preliminary knowledge of the native state and can be
applied to different kinds of models, including atomistic descriptions. We
tested the method within both the lattice and off-lattice model frameworks and
obtained several so far unknown good folders. The unbiased algorithm also
allows one to determine the optimal folding temperature, and it requires at
least 3--4 orders of magnitude fewer time steps than are needed to compute
folding times.
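As a toy illustration of ranking chains by their early convergence rate, the sketch below fits an exponential decay rate to only the initial segment of a distance-like observable and picks the fastest chain. The observable, the synthetic trajectories, and the fitting procedure are all stand-ins; the paper defines its own convergence measure.

```python
import numpy as np

def rate_from_initial_segment(dist, t_max=50):
    """Fit an exponential decay rate to the first t_max points of a
    distance-like observable dist(t) that decays towards zero.
    Only the very initial part of the trajectory is used."""
    t = np.arange(t_max)
    slope, _ = np.polyfit(t, np.log(dist[:t_max] + 1e-15), 1)
    return -slope

# Synthetic "trajectories" with known relaxation rates (illustrative only).
t = np.arange(500)
chains = {"fast": np.exp(-0.20 * t), "slow": np.exp(-0.02 * t)}

rates = {name: rate_from_initial_segment(d) for name, d in chains.items()}
best = max(rates, key=rates.get)   # predicted good folder
```

Ranking by `rates` here uses only the first 50 of 500 time steps, mirroring the abstract's point that ranking by convergence rate needs far fewer steps than computing full folding times.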
Eigenvalue Separation in Some Random Matrix Models
The eigenvalue density for members of the Gaussian orthogonal and unitary
ensembles follows the Wigner semi-circle law. If the Gaussian entries are all
shifted by a constant amount c/sqrt(2N), where N is the size of the matrix, in
the large N limit a single eigenvalue will separate from the support of the
Wigner semi-circle provided c > 1. In this study, using an asymptotic analysis
of the secular equation for the eigenvalue condition, we compare this effect to
analogous effects occurring in general variance Wishart matrices and matrices
from the shifted mean chiral ensemble. We undertake an analogous comparative
study of eigenvalue separation properties when the size of the matrices is
fixed and c goes to infinity, and of higher-rank analogues of this setting. This
is done using exact expressions for eigenvalue probability densities in terms
of generalized hypergeometric functions, and using the interpretation of the
latter as a Green function in the Dyson Brownian motion model. For the shifted
mean Gaussian unitary ensemble and its analogues an alternative approach is to
use exact expressions for the correlation functions in terms of classical
orthogonal polynomials and associated multiple generalizations. By using these
exact expressions to compute and plot the eigenvalue density, illustrations of
the various eigenvalue separation effects are obtained.
Comment: 25 pages, 9 figures included
Heteroclinic Chaos, Chaotic Itinerancy and Neutral Attractors in Symmetrical Replicator Equations with Mutations
A replicator equation with mutation processes is numerically studied.
Without any mutations, two characteristics of the replicator dynamics are
known: an exponential divergence of the dominance period, and hierarchical
orderings of the attractors. A mutation introduces some new aspects: the
emergence of structurally stable attractors, and chaotic itinerant behavior. In
addition, it is reported that a neutral attractor can exist in the mutation
rate -> +0 region.
Comment: 4 pages, 9 figures
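A minimal numerical sketch of the effect of a mutation term, using a generic three-species cyclic (rock-paper-scissors-type) replicator rather than the specific symmetric systems studied here: without mutation the simplex boundary carries a heteroclinic cycle, while a small uniform mutation rate mu keeps trajectories in the interior.

```python
import numpy as np

# Cyclic payoff matrix (rock-paper-scissors type): without mutation the
# boundary of the simplex carries a heteroclinic cycle, and the time spent
# near each pure state grows without bound.
A = np.array([[ 0.0, -1.0,  1.0],
              [ 1.0,  0.0, -1.0],
              [-1.0,  1.0,  0.0]])

def step(x, mu, dt):
    """One Euler step of the replicator-mutator dynamics
    dx_i/dt = x_i ((Ax)_i - x.Ax) + mu (1/n - x_i)."""
    f = A @ x
    dx = x * (f - x @ f) + mu * (1.0 / x.size - x)
    x = x + dt * dx
    return x / x.sum()           # renormalise against integration drift

x = np.array([0.50, 0.30, 0.20])
for _ in range(20_000):          # integrate to t = 200
    x = step(x, mu=1e-3, dt=0.01)
```

With mu = 0 the same iteration can pass arbitrarily close to the simplex boundary; the mutation term bounds those excursions, which is the mechanism behind the structurally stable attractors mentioned above.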
Segregation by thermal diffusion in granular shear flows
Segregation by thermal diffusion of an intruder immersed in a sheared
granular gas is analyzed from the (inelastic) Boltzmann equation. Segregation
is induced by the presence of a temperature gradient orthogonal to the shear
flow plane and parallel to gravity. We show that, as in analogous systems
without shear, the segregation criterion yields a transition between upwards
segregation and downwards segregation. The form of the phase diagrams is
illustrated in detail showing that they depend sensitively on the value of
gravity relative to the thermal gradient. Two specific situations are
considered: i) absence of gravity, and ii) homogeneous temperature. We find
that both mechanisms (upwards and downwards segregation) are stronger and more
clearly separated when compared with segregation criteria in systems without
shear.
Comment: 8 figures. To appear in J. Stat. Mech.
Rethinking the patient: using Burden of Treatment Theory to understand the changing dynamics of illness
Background: In this article we outline Burden of Treatment Theory, a new model of the relationship between sick people, their social networks, and healthcare services. Health services face the challenge of growing populations with long-term and life-limiting conditions; they have responded by delegating to sick people and their networks routine work aimed at managing symptoms and at retarding, and sometimes preventing, disease progression. This is the new proactive work of patient-hood for which patients are increasingly accountable: it is founded on ideas about self-care, self-empowerment, and self-actualization, and on new technologies and treatment modalities which can be shifted from the clinic into the community. These place new demands on sick people, which they may experience as burdens of treatment.
Discussion: As the burdens accumulate, some patients are overwhelmed, and the likely consequences are poor healthcare outcomes for individual patients, increasing strain on caregivers, and rising demand for and costs of healthcare services. In the face of these challenges we need to better understand the resources that patients draw upon as they respond to the demands of both burdens of illness and burdens of treatment, and the ways those resources interact with healthcare utilization.
Summary: Burden of Treatment Theory is oriented to understanding how capacity for action interacts with the work that stems from healthcare. It is a structural model that focuses on the work that patients and their networks do, and it thus helps us understand variations in healthcare utilization and adherence across different healthcare settings and clinical contexts.
Measurement of transparency ratios for protons from short-range correlated pairs
Nuclear transparency, Tp(A), is a measure of the average probability for a
struck proton to escape the nucleus without significant re-interaction.
Previously, nuclear transparencies were extracted for quasi-elastic A(e,e'p)
knockout of protons with momentum below the Fermi momentum, where the spectral
functions are well known. In this paper we extract a novel observable, the
transparency ratio, Tp(A)/T_p(12C), for knockout of high-missing-momentum
protons from the breakup of short-range correlated pairs (2N-SRC) in Al, Fe and
Pb nuclei relative to C. The ratios were measured at momentum transfer Q^2 >
1.5 (GeV/c)^2 and x_B > 1.2 where the reaction is expected to be dominated by
electron scattering from 2N-SRC. The transparency ratios of the knocked-out
protons coming from 2N-SRC breakup are 20 - 30% lower than those of previous
results for low missing momentum. They agree with Glauber calculations and
agree with renormalization of the previously published transparencies as
proposed by recent theoretical investigations. The new transparencies scale as
A^-1/3, which is consistent with dominance of scattering from nucleons at the
nuclear surface.
Comment: 6 pages, 4 figures
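The quoted A^-1/3 scaling can be turned into concrete numbers: if Tp(A) is proportional to A^-1/3, the transparency ratios relative to carbon follow from the mass numbers alone. The values below are the scaling prediction only, not the measured ratios.

```python
# Transparency ratio Tp(A)/Tp(12C) under a pure A^(-1/3) scaling law.
mass_numbers = {"Al": 27, "Fe": 56, "Pb": 208}
predicted = {name: (A / 12) ** (-1 / 3) for name, A in mass_numbers.items()}
print(predicted)   # roughly Al: 0.76, Fe: 0.60, Pb: 0.39
```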
Statistics of Certain Models of Evolution
In a recent paper, Newman surveys the literature on power-law spectra in
evolution and self-organised criticality, and presents a model of his own,
arriving at the conclusion that self-organised criticality is not necessary for
evolution.
Not only did he miss a key model (Ecolab) that has a clear self-organised
critical mechanism, but also Newman's model exhibits the same mechanism that
gives rise to power law behaviour as does Ecolab. Newman's model is, in fact, a
``mean field'' approximation of a self-organised critical system. In this
paper, I have also implemented Newman's model using the Ecolab software,
removing the restriction that the number of species remains constant. It turns
out that the requirement of constant species number is non-trivial, leading to
a global coupling between species that is similar in effect to the species
interactions seen in Ecolab. In fact, the model must self-organise to a state
where the long time average of speciations balances that of the extinctions,
otherwise the system either collapses or explodes. In view of this, Newman's
model does not provide the hoped-for counter example to the presence of
self-organised criticality in evolution, but does provide a simple, almost
analytic model that can be used to understand more intricate models such as
Ecolab.
Comment: accepted in Phys. Rev. E; RevTeX; see
http://parallel.hpc.unsw.edu.au/rks/ecolab.html for more information
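For context, the sketch below is a hedged reconstruction of the kind of model under discussion: a coherent-stress extinction model in the spirit of Newman's, with a constant species number maintained by immediate replacement (the restriction argued above to be non-trivial) and a small refresh fraction f playing the role of mutation. Parameter values are illustrative.

```python
import random

def coherent_stress_model(n_species=1000, steps=5000, sigma=0.05,
                          f=1e-3, seed=1):
    """Sketch of a coherent-stress extinction model in the spirit of
    Newman's: each species i has a tolerance x[i]; at every step a single
    stress level eta hits all species at once, those with x[i] < eta go
    extinct and are replaced (keeping the species number constant), and a
    small fraction f of species redraw their tolerance ("mutation")."""
    rng = random.Random(seed)
    x = [rng.random() for _ in range(n_species)]
    sizes = []                               # extinction size per step
    for _ in range(steps):
        eta = rng.expovariate(1.0 / sigma)   # coherent stress, mean sigma
        sizes.append(sum(t < eta for t in x))
        x = [rng.random() if t < eta else t for t in x]      # replacement
        x = [rng.random() if rng.random() < f else t for t in x]  # mutation
    return sizes

sizes = coherent_stress_model()
```

The constant-species-number replacement step in this sketch is precisely the restriction whose removal is studied above: it forces speciation to balance extinction by construction rather than through self-organisation.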
Assessing and countering reaction attacks against post-quantum public-key cryptosystems based on QC-LDPC codes
Code-based public-key cryptosystems based on QC-LDPC and QC-MDPC codes are
promising post-quantum candidates to replace quantum vulnerable classical
alternatives. However, a new type of attack based on Bob's reactions has
recently been introduced and appears to significantly reduce the lifetime of
any keypair used in these systems. In this paper we estimate the
complexity of all known reaction attacks against QC-LDPC and QC-MDPC code-based
variants of the McEliece cryptosystem. We also show how the structure of the
secret key and, in particular, the secret code rate affect the complexity of
these attacks. It follows from our results that QC-LDPC code-based systems can
indeed withstand reaction attacks, on condition that some specific decoding
algorithms are used and the secret code has a sufficiently high rate.
Comment: 21 pages, 2 figures, to be presented at CANS 201