Predicting the size and probability of epidemics in a population with heterogeneous infectiousness and susceptibility
We analytically address disease outbreaks in large, random networks with
heterogeneous infectivity and susceptibility. The transmissibility T_{uv}
(the probability that infection of node u causes infection of node v) depends
on the infectivity of u and the susceptibility of v. Initially a single node is
infected, following which a large-scale epidemic may or may not occur. We use a
generating function approach to study how heterogeneity affects the probability
that an epidemic occurs and, if one occurs, its attack rate (the fraction
infected). For fixed average transmissibility, we find upper and lower bounds
on these. An epidemic is most likely if infectivity is homogeneous and least
likely if the variance of infectivity is maximized. Similarly, the attack rate
is largest if susceptibility is homogeneous and smallest if the variance is
maximized. We further show that heterogeneity in infectious period is
important, contrary to assumptions of previous studies. We confirm our
theoretical predictions by simulation. Our results have implications for
control strategy design and identification of populations at higher risk from
an epidemic.

Comment: 5 pages, 3 figures. Submitted to Physical Review Letters
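The generating function machinery described above can be illustrated in the homogeneous-transmissibility special case (the baseline against which the paper's heterogeneity bounds are stated). The sketch below assumes a Poisson degree distribution with mean z, for which the PGFs G0 and G1 coincide; the function name and parameters are illustrative, not from the paper.

```python
# Sketch: probability of a large-scale epidemic via generating functions,
# for homogeneous transmissibility T on a Poisson random network with mean
# degree z, where G0(x) = G1(x) = exp(z(x - 1)).

import math

def epidemic_probability(z, T, tol=1e-12, max_iter=10000):
    """Probability that a single infected seed sparks a large-scale epidemic."""
    G1 = lambda x: math.exp(z * (x - 1.0))  # excess-degree PGF for Poisson graphs
    u = 0.0  # prob. that an edge transmission fails to reach the giant cluster
    for _ in range(max_iter):
        u_new = G1(1.0 - T + T * u)  # self-consistency condition for extinction
        if abs(u_new - u) < tol:
            break
        u = u_new
    # For Poisson graphs G0 = G1, so epidemic probability equals attack rate
    return 1.0 - G1(1.0 - T + T * u)

print(epidemic_probability(z=4.0, T=0.5))  # well above the zT = 1 threshold
```

For z = 4 and T = 0.5 the effective branching factor zT = 2 exceeds 1, so a large epidemic occurs with probability close to 0.8.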
Effectiveness of a social support intervention on infant feeding practices: randomised controlled trial
Background: To assess whether monthly home visits from trained volunteers could improve infant feeding practices at age 12 months, a randomised controlled trial was carried out in two disadvantaged inner city London boroughs.
Methods: Women attending baby clinics with their infants (n = 312) were randomised to receive monthly home visits from trained volunteers over a 9-month period (intervention group) or standard professional care only (control group). The primary outcome was vitamin C intake from fruit. Secondary outcomes included selected macro- and micronutrients, infant feeding habits, supine length and weight. Data were collected at baseline, when infants were aged approximately 10 weeks, and subsequently when the child was 12 and 18 months old.
Results: Two-hundred and twelve women (68%) completed the trial. At both follow-up points no significant differences were found between the groups for vitamin C intakes from fruit or other nutrients. At first follow-up, however, infants in the intervention group were significantly less likely to be given goats’ or soya milks, and were more likely to have three solid meals per day. At the second follow-up, intervention group children were significantly less likely to be still using a bottle. At both follow-up points, intervention group children also consumed significantly more specific fruit and vegetables.
Conclusions: Home visits from trained volunteers had no significant effect on nutrient intakes but did promote some other recommended infant feeding practices.
Heterogeneity of cell membrane structure studied by single molecule tracking
Heterogeneity in cell membrane structure, typified by microdomains with different biophysical and biochemical properties, is thought to impact on a variety of cell functions. Integral membrane proteins act as nanometre-sized probes of the lipid environment and their thermally-driven movements can be used to report local variations in membrane properties. In the current study, we have used total internal reflection fluorescence microscopy (TIRFM) combined with super-resolution tracking of multiple individual molecules, in order to create high-resolution maps of local membrane viscosity. We used a quadrat sampling method and show how statistical tests for membrane heterogeneity can be conducted by analysing the paths of many molecules that pass through the same unit area of membrane. We describe experiments performed on cultured primary cells, stable cell lines and ex vivo tissue slices using a variety of membrane proteins, under different imaging conditions. In some cell types, we find no evidence for heterogeneity in mobility across the plasma membrane, but in others we find statistically significant differences with some regions of membrane showing significantly higher viscosity than others
Analysis of Petri Net Models through Stochastic Differential Equations
It is well known, mainly because of the work of Kurtz, that density dependent
Markov chains can be approximated by sets of ordinary differential equations
(ODEs) when their indexing parameter grows very large. This approximation
cannot capture the stochastic nature of the process and, consequently, it can
provide an erroneous view of the behavior of the Markov chain if the indexing
parameter is not sufficiently high. Important phenomena that cannot be revealed
include non-negligible variance and bi-modal population distributions. A
less-known approximation proposed by Kurtz applies stochastic differential
equations (SDEs) and provides information about the stochastic nature of the
process. In this paper we apply and extend this diffusion approximation to
study stochastic Petri nets. We identify a class of nets whose underlying
stochastic process is a density dependent Markov chain whose indexing parameter
is a multiplicative constant which identifies the population level expressed by
the initial marking, and we provide means to automatically construct the
associated set of SDEs. Since the diffusion approximation of Kurtz considers
the process only up to the time when it first exits an open interval, we extend
the approximation by a machinery that mimics the behavior of the Markov chain
at the boundary, thus allowing the approach to be applied to a wider set of
problems. The resulting process is of the jump-diffusion type. We illustrate by
examples that the jump-diffusion approximation, which extends to bounded domains,
can be much more informative than that based on ODEs as it can provide accurate
quantity distributions even when they are multi-modal and even for relatively
small population levels. Moreover, we show that the method is faster than
simulating the original Markov chain.
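Kurtz's diffusion approximation described above can be sketched with a minimal Euler-Maruyama integration of the resulting SDE. The SIS-like birth-death process below is an illustrative density dependent chain, not one of the paper's Petri nets, and the crude clamping at the boundary merely stands in for the jump mechanism the paper develops.

```python
# Sketch: Euler-Maruyama integration of the Kurtz diffusion approximation for
# a density dependent birth-death chain with density x in [0, 1].
# Drift F(x) = beta*x*(1-x) - gamma*x; the noise amplitude scales as 1/sqrt(N).

import math, random

def euler_maruyama_sis(x0, beta, gamma, N, T, dt, seed=0):
    rng = random.Random(seed)
    x, t = x0, 0.0
    while t < T:
        birth = beta * x * (1.0 - x)   # infection intensity (density scale)
        death = gamma * x              # recovery intensity
        drift = birth - death
        diff = math.sqrt((birth + death) / N)  # Kurtz: diffusion = sum of rates / N
        x += drift * dt + diff * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        x = min(max(x, 0.0), 1.0)  # crude boundary handling; the paper uses jumps
        t += dt
    return x

# With beta = 2, gamma = 1 the ODE limit has a stable point at x = 1 - gamma/beta = 0.5;
# for large N the SDE fluctuates closely around it.
print(euler_maruyama_sis(0.5, 2.0, 1.0, N=10_000, T=10.0, dt=0.01))
```

For small N the same code exhibits the non-negligible variance that the ODE limit, by construction, cannot capture.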
Random hypergraphs and their applications
In the last few years we have witnessed the emergence, primarily in on-line
communities, of new types of social networks that require for their
representation more complex graph structures than have been employed in the
past. One example is the folksonomy, a tripartite structure of users,
resources, and tags -- labels collaboratively applied by the users to the
resources in order to impart meaningful structure on an otherwise
undifferentiated database. Here we propose a mathematical model of such
tripartite structures which represents them as random hypergraphs. We show that
it is possible to calculate many properties of this model exactly in the limit
of large network size and we compare the results against observations of a real
folksonomy, that of the on-line photography web site Flickr. We show that in
some cases the model matches the properties of the observed network well, while
in others there are significant differences, which we find to be attributable
to the practice of multiple tagging, i.e., the application by a single user of
many tags to one resource, or one tag to many resources.

Comment: 11 pages, 7 figures
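As a toy illustration of the tripartite structure described above (an assumed uniform construction, not necessarily the paper's exact random-hypergraph ensemble), one can sample hyperedges that each join one user, one tag and one resource:

```python
# Sketch: a random tripartite hypergraph in which each hyperedge connects a
# uniformly chosen (user, tag, resource) triple, as a crude folksonomy model.

import random
from collections import Counter

def random_tripartite_hypergraph(n_users, n_tags, n_resources, n_hyperedges, seed=0):
    rng = random.Random(seed)
    return [(rng.randrange(n_users), rng.randrange(n_tags), rng.randrange(n_resources))
            for _ in range(n_hyperedges)]

edges = random_tripartite_hypergraph(100, 50, 200, 1000)
user_degree = Counter(u for u, _, _ in edges)  # hyperedges per user (Poisson-like)
print(len(edges), max(user_degree.values()))
```

Multiple tagging, which the abstract identifies as the main source of discrepancy with Flickr, corresponds to correlated repetitions among these triples and is deliberately absent from this uniform sketch.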
Subgraphs in random networks
Understanding the subgraph distribution in random networks is important for
modelling complex systems. In classic Erdos networks, which exhibit a
Poissonian degree distribution, the number of appearances of a subgraph G with
n nodes and g edges scales with network size as \mean{G} ~ N^{n-g}. However,
many natural networks have a non-Poissonian degree distribution. Here we
present approximate equations for the average number of subgraphs in an
ensemble of random sparse directed networks, characterized by an arbitrary
degree sequence. We find new scaling rules for the commonly occurring case of
directed scale-free networks, in which the outgoing degree distribution scales
as P(k) ~ k^{-\gamma}. Considering the power exponent of the degree
distribution, \gamma, as a control parameter, we show that random networks
exhibit transitions between three regimes. In each regime the subgraph number
of appearances follows a different scaling law, \mean{G} ~ N^{\alpha}, where
\alpha=n-g+s-1 for \gamma<2, \alpha=n-g+s+1-\gamma for 2<\gamma<\gamma_c, and
\alpha=n-g for \gamma>\gamma_c, s is the maximal outdegree in the subgraph, and
\gamma_c=s+1. We find that certain subgraphs appear much more frequently than
in Erdos networks. These results are in very good agreement with numerical
simulations. This has implications for detecting network motifs, subgraphs that
occur in natural networks significantly more than in their randomized
counterparts.

Comment: 8 pages, 5 figures
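The piecewise scaling law quoted above is easy to encode directly; the sketch below simply evaluates \alpha for given n, g, s and \gamma.

```python
# Sketch: the scaling exponent alpha such that <G> ~ N^alpha for a subgraph
# with n nodes, g edges and maximal outdegree s, in a directed scale-free
# network with outdegree exponent gamma (gamma_c = s + 1).

def subgraph_scaling_exponent(n, g, s, gamma):
    gamma_c = s + 1
    if gamma < 2:
        return n - g + s - 1
    elif gamma < gamma_c:
        return n - g + s + 1 - gamma
    else:  # gamma > gamma_c: the Erdos-like result n - g is recovered
        return n - g

# Feed-forward loop: n = 3 nodes, g = 3 edges, maximal outdegree s = 2,
# so gamma_c = 3 and gamma = 2.5 falls in the intermediate regime.
print(subgraph_scaling_exponent(3, 3, 2, 2.5))
```

Note that for gamma < gamma_c the exponent exceeds n - g, so such subgraphs appear far more frequently than in Erdos networks, consistent with the motif-detection remark above.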
Tort, Truth Recovery and the Northern Ireland Conflict
Northern Ireland has no effective process to address the legacy of the human tragedy of decades of conflict. And yet during that conflict, and especially in the years since the Belfast/Good Friday Agreement 1998, people have employed multiple legal mechanisms to gain information about events which affected them and their loved ones. Human rights challenges, public inquiries, freedom of information requests, police investigations and fresh inquests have all contributed to a patchwork of approaches to truth recovery. The UK Government has long viewed these efforts with suspicion; as the primary state actor involved in the conflict, its records provide a much richer source of information about historic wrongs than the recollections of members of clandestine paramilitary organisations. Successive Conservative administrations have characterised many of these efforts as “lawfare”, intended to persecute veterans long after the events in question and to undermine public faith in the UK’s Armed Forces. One under-explored element of this complex picture is the use of tort in legacy cases. Civil actions, supported by legal aid funding in Northern Ireland, provide a potential avenue for the discovery of information held by public bodies. Even unsuccessful actions can thus contribute new information about the events in question. Many of the harms inflicted during the conflict were torts as well as crimes, and this article assesses the extent to which these civil actions provide an ersatz mechanism for truth recovery, and challenges efforts to curtail such actions as a “witch-hunt”.
Randomised controlled trial of ranitidine versus omeprazole in combination with antibiotics for eradication of Helicobacter pylori.
This study compared high dose ranitidine versus low dose omeprazole, each in combination with antibiotics, for the eradication of H pylori. 80 patients (mean age 48 years, range 18-75) who had H pylori infection were randomised in an investigator-blind manner to a two-week regimen of either omeprazole 20 mg daily, amoxycillin 500 mg tid and metronidazole 400 mg tid (OAM); ranitidine 600 mg bd, amoxycillin 500 mg tid and metronidazole 400 mg tid (RAM); omeprazole 20 mg daily and clarithromycin 500 mg tid (OC); or omeprazole 20 mg daily and placebo (OP). H pylori was eradicated in 6 of 19 patients in the OAM group (32%); 8 of 18 in the RAM group (44%); 4 of 15 in the OC group (27%); and none of 18 in the OP group (0%) [P < 0.005 for OAM, RAM and OC vs OP; P = NS between OAM, RAM and OC]. Overall metronidazole resistance was unexpectedly high at 58%. Eradication rates in metronidazole-sensitive patients were 71% (5/7) and 100% (3/3) for OAM and RAM respectively. In conclusion, H pylori eradication rates using high dose ranitidine plus amoxycillin and metronidazole may be similar to those of low dose omeprazole in combination with the same antibiotics, or of omeprazole with clarithromycin. Overall eradication rates were low due to a high incidence of metronidazole resistance but were higher in metronidazole-sensitive patients. Even high dose ranitidine with two antibiotics achieves a relatively low eradication rate. These metronidazole-based regimens cannot be recommended in areas with a high incidence of metronidazole resistance.
Synchronization in Weighted Uncorrelated Complex Networks in a Noisy Environment: Optimization and Connections with Transport Efficiency
Motivated by synchronization problems in noisy environments, we study the
Edwards-Wilkinson process on weighted uncorrelated scale-free networks. We
consider a specific form of the weights, where the strength (and the associated
cost) of a link is proportional to (k_i k_j)^\beta, with k_i and k_j
being the degrees of the nodes connected by the link. Subject to the
constraint that the total network cost is fixed, we find that in the mean-field
approximation on uncorrelated scale-free graphs, synchronization is optimal at
\beta = -1. Numerical results, based on exact numerical diagonalization
of the corresponding network Laplacian, confirm the mean-field results, with
small corrections to the optimal value of \beta. Employing our recent
connections between the Edwards-Wilkinson process and resistor networks, and
some well-known connections between random walks and resistor networks, we also
pursue a naturally related problem of optimizing performance in queue-limited
communication networks utilizing local weighted routing schemes.

Comment: Papers on related research can be found at http://www.rpi.edu/~korniss/Research
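The steady-state width of the Edwards-Wilkinson process, the quantity that measures (de)synchronization, can be computed from the spectrum of the weighted network Laplacian. The sketch below assumes unit noise strength and the weight form (k_i k_j)^\beta quoted above on a small illustrative graph; it omits the paper's fixed-total-cost normalization.

```python
# Sketch: steady-state EW width <w^2> = (1/N) * sum over nonzero Laplacian
# eigenvalues lambda_i of 1/(2*lambda_i), with link weights C_ij ~ (k_i k_j)^beta.

import numpy as np

def ew_width(adj, beta):
    k = adj.sum(axis=1)                   # unweighted degrees
    W = adj * np.outer(k, k) ** beta      # weighted couplings (k_i k_j)^beta
    L = np.diag(W.sum(axis=1)) - W        # weighted graph Laplacian
    lam = np.linalg.eigvalsh(L)
    nonzero = lam[lam > 1e-10]            # drop the trivial zero mode
    return (1.0 / len(adj)) * np.sum(1.0 / (2.0 * nonzero))

# Star graph on 5 nodes (hub degree 4): the most heterogeneous small example
A = np.zeros((5, 5))
A[0, 1:] = A[1:, 0] = 1.0
print(ew_width(A, beta=0.0), ew_width(A, beta=-1.0))
```

On this tiny unnormalized example beta = 0 gives the smaller width; the optimality of beta = -1 quoted in the abstract holds for large uncorrelated scale-free graphs at fixed total cost, which this sketch does not enforce.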
Diffusion-annihilation processes in complex networks
We present a detailed analytical study of the A + A -> \emptyset
diffusion-annihilation process in complex networks. By means of microscopic
arguments, we derive a set of rate equations for the density of particles \rho_k(t)
in vertices of a given degree k, valid for any generic degree distribution, and
which we solve for uncorrelated networks. For homogeneous networks (with
bounded fluctuations), we recover the standard mean-field solution, i.e. a
particle density decreasing as the inverse of time. For heterogeneous
(scale-free networks) in the infinite network size limit, we obtain instead a
density decreasing as a power-law, with an exponent depending on the degree
distribution. We also analyze the role of finite size effects, showing that any
finite scale-free network leads to the mean-field behavior, with a prefactor
depending on the network size. We check our analytical predictions with
extensive numerical simulations on homogeneous networks with Poisson degree
distribution and scale-free networks with different degree exponents.

Comment: 9 pages, 5 EPS figures
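A minimal Monte Carlo sketch of the A + A -> 0 process on a graph (an illustrative update rule, not necessarily the paper's simulation protocol): at each step a random particle hops to a random neighbour, and two particles meeting on the same vertex annihilate.

```python
# Sketch: Monte Carlo simulation of diffusion-annihilation A + A -> 0 on a
# graph given as an adjacency dict {node: [neighbours]}.

import random

def simulate_annihilation(adjacency, n_particles, n_steps, seed=0):
    rng = random.Random(seed)
    occupied = set(rng.sample(list(adjacency), n_particles))
    for _ in range(n_steps):
        if len(occupied) < 2:
            break
        v = rng.choice(tuple(occupied))
        w = rng.choice(adjacency[v])   # hop to a uniformly chosen neighbour
        if w in occupied:              # A + A -> 0: both particles vanish
            occupied.discard(v)
            occupied.discard(w)
        elif w != v:                   # otherwise the particle just moves
            occupied.discard(v)
            occupied.add(w)
    return len(occupied)

# Ring of 100 nodes (homogeneous network), starting half-occupied; the
# surviving count decays roughly as the inverse of time, as in mean field.
ring = {i: [(i - 1) % 100, (i + 1) % 100] for i in range(100)}
print(simulate_annihilation(ring, 50, 5000))
```

Replacing the ring with a scale-free adjacency dict lets one probe the degree-dependent power-law decay the abstract describes.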