Distributed Approximation of Minimum Routing Cost Trees
We study the NP-hard problem of approximating a Minimum Routing Cost Spanning
Tree in the message passing model with limited bandwidth (CONGEST model). In
this problem one tries to find a spanning tree of a graph over nodes
that minimizes the sum of distances between all pairs of nodes. In the
considered model every node can transmit a different (but short) message to
each of its neighbors in each synchronous round. We provide a randomized
-approximation with runtime for
unweighted graphs. Here, is the diameter of . This improves on both
the (expected) approximation factor and the runtime
of the best previously known algorithm.
Because we state our results in a very general way, we also derive an (optimal)
runtime of when considering -approximations, as done by the
best previously known algorithm. In addition, we derive a deterministic
-approximation.
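The objective being approximated can be stated concretely: for a candidate spanning tree, sum the tree distance over all ordered pairs of nodes. A minimal sketch of that routing-cost computation in Python, where the adjacency-list format and node labels are illustrative, not from the paper:

```python
from collections import deque

def routing_cost(tree_adj):
    """Routing cost of an unweighted spanning tree: the sum of tree
    distances over all ordered pairs of nodes.

    `tree_adj` is an adjacency list {node: [neighbours]} -- an
    illustrative input format, not the paper's.
    """
    total = 0
    for source in tree_adj:
        # BFS from `source`; in a tree, BFS distances are the unique
        # path lengths.
        dist = {source: 0}
        queue = deque([source])
        while queue:
            u = queue.popleft()
            for v in tree_adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        total += sum(dist.values())
    return total

# Path a-b-c: d(a,b)=d(b,c)=1, d(a,c)=2; ordered pairs double it: 8.
path = {"a": ["b"], "b": ["a", "c"], "c": ["b"]}
print(routing_cost(path))  # 8
```

A Minimum Routing Cost Spanning Tree is the spanning tree of the input graph that minimizes this quantity.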
Dynamical Scaling: the Two-Dimensional XY Model Following a Quench
To sensitively test scaling in the 2D XY model quenched from
high temperature into the ordered phase, we study the difference between
measured correlations and the (scaling) results of a Gaussian-closure
approximation. We also directly compare various length-scales. All of our
results are consistent with dynamical scaling and an asymptotic growth law , though with a time-scale that depends on the
length-scale in question. We then reconstruct correlations from the
minimal-energy configuration consistent with the vortex positions, and find
them significantly different from the ``natural'' correlations --- though both
scale with . This indicates that both topological (vortex) and
non-topological (``spin-wave'') contributions to correlations are relevant
arbitrarily late after the quench. We also present a consistent definition of
dynamical scaling applicable more generally, and emphasize how to generalize
our approach to other quenched systems where dynamical scaling is in question.
Our approach directly applies to planar liquid-crystal systems.
Comment: 10 pages, 10 figures
Scale invariance in coarsening of binary and ternary fluids
Phase separation in binary and ternary fluids is studied using a
two-dimensional lattice gas automaton. The lengths, given by the first zero
crossing point of the correlation function and the total interface length, are
shown to exhibit power-law dependence on time. In binary mixtures, our data
clearly indicate the existence of a regime having more than one length scale
where the coarsening process proceeds through the rupture and reassociation of
domains. In ternary fluids, for symmetric mixtures there exists a
regime with a single length scale having dynamic exponent 1/2, while for
asymmetric mixtures our data establish the breakdown of scale invariance.
Comment: 20 pages, 13 figures
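The first length scale mentioned, the first zero crossing of the correlation function, can be extracted from sampled data by linear interpolation between the two samples that bracket the sign change. A small illustrative sketch (the sample values below are made up, not the paper's data):

```python
def first_zero_crossing(r, C):
    """Length scale from the first sign change of the correlation
    function C(r), via linear interpolation between the two samples
    that bracket the crossing.

    `r` and `C` are equal-length sequences of sampled radii and
    correlation values.
    """
    for i in range(1, len(C)):
        if C[i - 1] > 0 >= C[i]:
            # Interpolate the radius at which C reaches zero.
            frac = C[i - 1] / (C[i - 1] - C[i])
            return r[i - 1] + frac * (r[i] - r[i - 1])
    return None  # C never crosses zero on the sampled range

r = [0.0, 1.0, 2.0, 3.0]
C = [1.0, 0.5, -0.5, -0.8]
print(first_zero_crossing(r, C))  # 1.5
```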
Associations between self-reported and objective face recognition abilities are only evident in above- and below-average recognisers
The 20-Item Prosopagnosia Index (PI-20) was recently introduced as a self-report
measure of face recognition abilities and as an instrument to help the diagnosis of
prosopagnosia. In general, studies using this questionnaire have shown that observers
have moderate to strong insights into their face recognition abilities. However, it
remains unknown whether these insights are equivalent for the whole range of
face recognition abilities. The present study investigates this issue using the
Mandarin version of the PI-20 and the Cambridge Face Memory Test Chinese
(CFMT-Chinese). Our results showed a moderate negative association between the
PI-20 and the CFMT-Chinese. However, this association was driven by people
with low and high face recognition ability, and was absent in people within the typical
range of face recognition performance. The implications of these results for the study
of individual differences and the diagnosis of prosopagnosia are discussed.
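The subgroup analysis described, correlating PI-20 with CFMT-Chinese separately for extreme and typical performers, can be sketched as follows. The cutoffs and the plain-Python Pearson helper are illustrative assumptions, not the study's actual analysis pipeline:

```python
import statistics

def pearson(x, y):
    """Plain Pearson correlation coefficient (no external packages)."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x)
           * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

def split_correlations(pi20, cfmt, low_cut, high_cut):
    """PI-20/CFMT correlation within the extreme and typical bands.

    `low_cut` and `high_cut` are hypothetical CFMT cutoffs separating
    below-average, typical, and above-average recognisers.
    """
    extreme = [(p, c) for p, c in zip(pi20, cfmt)
               if c < low_cut or c > high_cut]
    typical = [(p, c) for p, c in zip(pi20, cfmt)
               if low_cut <= c <= high_cut]
    return pearson(*zip(*extreme)), pearson(*zip(*typical))
```

Under the pattern the study reports, `split_correlations` would return a clearly negative correlation in the extreme band and a value near zero in the typical band.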
Scaling and Crossover in the Large-N Model for Growth Kinetics
The dependence of the scaling properties of the structure factor on space
dimensionality, range of interaction, initial and final conditions, and the presence or
absence of a conservation law is analysed in the framework of the large-N model
for growth kinetics. The variety of asymptotic behaviours is quite rich,
including standard scaling, multiscaling and a mixture of the two. The
different scaling properties obtained as the parameters are varied are
controlled by a structure of fixed points with their domains of attraction.
Crossovers arising from the competition between distinct fixed points are
explicitly obtained. Temperature fluctuations below the critical temperature
are not found to be irrelevant when the order parameter is conserved. The model
is solved by integration of the equation of motion for the structure factor and
by a renormalization group approach.
Comment: 48 pages with 6 figures available upon request, plain LaTeX
Phase resolution limit in macroscopic interference between Bose-Einstein condensates
We study the competition between phase definition and quantum phase
fluctuations in interference experiments between independently formed Bose
condensates. While phase-sensitive detection of atoms makes the phase
progressively better defined, interactions tend to randomize it faster as the
uncertainty in the relative particle number grows. A steady state is reached
when the two effects cancel each other. Then the phase resolution saturates to
a value that grows with the ratio between the interaction strength and the atom
detection rate, and the average phase and number begin to fluctuate
classically. We discuss how our study applies to both recently performed and
possible future experiments.
Comment: 4 pages, 5 figures
Development of a Clinical Type 1 Diabetes Metabolic System Model and in Silico Simulation Tool
Invited journal symposium paper
Objectives:
To develop a safe and effective protocol for the clinical control of Type 1 diabetes using conventional self-monitoring blood glucose (SMBG) measurements, and multiple daily injection (MDI) with insulin analogues. To develop an in silico simulation tool of Type 1 diabetes to predict long-term glycaemic control outcomes of clinical interventions.
Methods:
The virtual patient method is used to develop a simulation tool for Type 1 diabetes using data from a Type 1 diabetes patient cohort (n=40). The tool is used to test the adaptive protocol (AC) and a conventional intensive insulin therapy (CC) against results from a representative control cohort. Optimal and suboptimal basal insulin replacement are evaluated as a function of self-monitoring blood glucose (SMBG) frequency in conjunction with the (AC and CC) prandial control protocols.
Results:
In long-term glycaemic control, the AC protocol significantly decreases HbA1c in conditions of suboptimal basal insulin replacement for SMBG frequencies ≥6/day, and reduces the occurrence of mild and severe hypoglycaemia by 86-100% over controls across all SMBG frequencies in conditions of optimal basal insulin.
Conclusions:
A simulation tool to predict long-term glycaemic control outcomes from clinical interventions is developed to test a novel, adaptive control protocol for Type 1 diabetes. The protocol is effective and safe compared to conventional intensive insulin therapy and controls. As fear of hypoglycaemia is a large psychological barrier to glycaemic control, the AC protocol may represent the next evolution of intensive insulin therapy to deliver increased glycaemic control with increased safety. Further clinical or experimental validation is needed to fully prove the concept.
Overview of Glycemic Control in Critical Care - Relating Performance and Clinical Results
Inaugural review article invited for inaugural journal
Background: Hyperglycemia is prevalent in critical care and tight control can save
lives. Current ad-hoc clinical protocols require significant clinical effort and produce
highly variable results. Model-based methods can provide tight, patient specific
control, while addressing practical clinical difficulties and dynamic patient evolution.
However, tight control remains elusive as there is not enough understanding of the
relationship between control performance and clinical outcome.
Methods: The general problem and performance criteria are defined. The clinical
studies performed to date using both ad-hoc titration and model-based methods are
reviewed. Studies reporting mortality outcome are analysed in terms of standardized
mortality ratio (SMR) and a 95th percentile (±2σ) standard error (SE95%) to enable
better comparison across cohorts.
Results: Model-based control trials lower blood glucose into a 72-110mg/dL band
within 10 hours, have target accuracy over 90%, produce fewer hypoglycemic
episodes, and require no additional clinical intervention. Plotting SMR versus SE95%
shows potentially high correlation (r=0.84) between ICU mortality and tightness of
control.
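The two statistics used for the cross-cohort comparison can be computed directly: SMR is observed over expected deaths, and treating the observed count as Poisson gives SE(SMR) ≈ sqrt(O)/E, so a ±2σ band follows. A minimal sketch, where the cohort numbers are made up and the Poisson approximation is an assumption, not necessarily the review's exact estimator:

```python
import math

def smr_with_band(observed_deaths, expected_deaths):
    """Standardised mortality ratio with an approximate ±2σ band.

    `expected_deaths` would come from a severity-of-illness model
    summed over the cohort. Under a Poisson approximation for the
    observed count, SE(SMR) ~= sqrt(O)/E; the band is 2*SE.
    """
    smr = observed_deaths / expected_deaths
    se = math.sqrt(observed_deaths) / expected_deaths
    return smr, 2.0 * se  # (SMR, SE95%)

# Illustrative cohort: 25 observed deaths against 40 model-predicted.
smr, se95 = smr_with_band(observed_deaths=25, expected_deaths=40.0)
print(smr, se95)  # 0.625 0.25
```

Plotting (SMR, SE95%) pairs across cohorts is then what exposes the reported correlation between mortality and tightness of control.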
Summary: Model-based methods provide tighter, more adaptable “one method fits
all” solutions, using methods that enable patient-specific modeling and control.
Correlation between tightness of control and clinical outcome suggests that
performance metrics, such as time in a relevant glycemic band, may provide better
guidelines. Overall, compared to current “one size fits all” sliding scale and ad-hoc
regimens, patient-specific pharmacodynamic and pharmacokinetic model-based, or
“one method fits all”, control, utilizing computational and emerging sensor
technologies, offers improved treatment and better potential outcomes when treating
hyperglycemia in the highly dynamic critically ill patient.
Spectrophotometric analysis of sapstain caused by Botryodiplodia theobromae, Aureobasidium pullulans and Aspergillus niger on three rubberwood clones
The development of a more risk-sensitive and flexible airport safety area strategy: Part I. The development of an improved accident frequency model
This two-part paper presents the development of an improved airport risk assessment
methodology aimed at assessing risks related to aircraft accidents at and in the vicinity
of airports and managing Airport Safety Areas (ASAs) as a risk mitigation measure.
The improved methodology is more quantitative, risk-sensitive, flexible and
transparent than standard risk assessment approaches. As such, it contributes to the
implementation of Safety Management Systems at airports, as stipulated by the
International Civil Aviation Organisation.
The first part of the paper presents the methodological advances made in the
development of accident frequency models; namely the building of a single
comprehensive database of all relevant accident types, the collection and use of
normal operations data in quantifying the criticality of a series of risk factors, and
modelling accident frequency using multivariate logistic regression. The resulting
models have better goodness-of-fit, sensitivity and specificity than standard risk
assessment methodologies.
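Modelling accident frequency with multivariate logistic regression, as described above, amounts to fitting the log-odds of an accident as a linear function of risk factors over a database of accident and normal-operation movements. A self-contained gradient-descent sketch; the single toy risk factor and data are illustrative, not the paper's variables or fitted coefficients:

```python
import math

def fit_logistic(X, y, lr=0.5, epochs=2000):
    """Minimal multivariate logistic regression by batch gradient
    descent. X holds one risk-factor vector per movement; y is 1 for
    an accident and 0 for a normal operation.
    """
    n = len(X[0])
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        gw, gb = [0.0] * n, 0.0
        for xi, yi in zip(X, y):
            z = b + sum(wj * xj for wj, xj in zip(w, xi))
            err = 1.0 / (1.0 + math.exp(-z)) - yi  # p - y
            for j in range(n):
                gw[j] += err * xi[j]
            gb += err
        w = [wj - lr * gj / len(X) for wj, gj in zip(w, gw)]
        b -= lr * gb / len(X)
    return w, b

def predict(w, b, x):
    """Accident probability for one movement."""
    z = b + sum(wj * xj for wj, xj in zip(w, x))
    return 1.0 / (1.0 + math.exp(-z))

# Toy data: one binary risk factor; accidents concentrate where it is 1.
X = [[0.0], [0.0], [0.0], [1.0], [1.0], [1.0]]
y = [0, 0, 1, 0, 1, 1]
w, b = fit_logistic(X, y)
print(predict(w, b, [1.0]) > predict(w, b, [0.0]))  # True
```

The normal operations data described in the paper plays the role of the y=0 rows: without it, the relative criticality of each risk factor cannot be estimated.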