Constraining properties of GRB magnetar central engines using the observed plateau luminosity and duration correlation
An intrinsic correlation has been identified between the luminosity and
duration of plateaus in the X-ray afterglows of Gamma-Ray Bursts (GRBs;
Dainotti et al. 2008), suggesting a central engine origin. The magnetar central
engine model predicts an observable plateau phase, with plateau durations and
luminosities being determined by the magnetic fields and spin periods of the
newly formed magnetar. This paper analytically shows that the magnetar central
engine model can explain, within the 1σ uncertainties, the correlation
between plateau luminosity and duration. The observed scatter in the
correlation most likely originates in the spread of initial spin periods of the
newly formed magnetar and provides an estimate of the maximum spin period of
~35 ms (assuming a constant mass, efficiency and beaming across the GRB
sample). Additionally, by combining the observed data and simulations, we show
that the magnetar emission is most likely narrowly beamed and has ≲20%
efficiency in the conversion of rotational energy from the magnetar into the
observed plateau luminosity. The beaming angles and efficiencies obtained by
this method are fully consistent with both predicted and observed values. We
find that Short GRBs and Short GRBs with Extended Emission lie on the same
correlation but are statistically inconsistent with being drawn from the same
distribution as Long GRBs; this is consistent with them having a wider beaming
angle than Long GRBs. Comment: MNRAS Accepted
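
For context, the dipole spin-down relations that underpin this kind of analysis (e.g. Zhang & Mészáros 2001) can be sketched as follows; the 10 km radius, 10^45 g cm^2 moment of inertia and numerical prefactors are textbook fiducial values, not the paper's fitted numbers:

```python
# Hedged sketch of the magnetar dipole spin-down relations; fiducial values only.

def plateau_luminosity(B15, P_ms, R6=1.0):
    """Initial spin-down luminosity (erg/s) for a dipole field B15 in units
    of 10^15 G and an initial spin period P_ms in milliseconds."""
    return 1.0e49 * B15**2 * P_ms**-4 * R6**6

def plateau_duration(B15, P_ms, R6=1.0, I45=1.0):
    """Characteristic spin-down timescale (s), identified with the plateau."""
    return 2.05e3 * I45 * B15**-2 * P_ms**2 * R6**-6

if __name__ == "__main__":
    for P in (1.0, 10.0, 35.0):   # a spread in birth spin periods drives scatter
        L = plateau_luminosity(1.0, P)
        T = plateau_duration(1.0, P)
        print(f"P0 = {P:5.1f} ms -> L ~ {L:.2e} erg/s, T ~ {T:.2e} s")
```

Note that the product L × T depends only on the initial spin period (it is essentially the rotational energy budget, ∝ P⁻²), which is why a spread in birth spin periods maps directly onto scatter about the luminosity-duration correlation.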
A multi-messenger model for neutron star - black hole mergers
We present a semi-analytic model for predicting kilonova light curves from
the mergers of neutron stars with black holes (NSBH). The model is integrated
into the MOSFiT platform, and can generate light curves from input binary
properties and nuclear equation-of-state considerations, or incorporate
measurements from gravitational wave (GW) detectors to perform multi-messenger
parameter estimation. The rapid framework enables the generation of NSBH
kilonova distributions from binary populations, light curve predictions from GW
data, and statistically meaningful comparisons with an equivalent BNS model in
MOSFiT. We investigate a sample of kilonova candidates associated with
cosmological short gamma-ray bursts, and demonstrate that they are broadly
consistent with being driven by NSBH systems, though most have limited data. We
also perform fits to the very well sampled GW170817, and show that the
inability of an NSBH merger to produce lanthanide-poor ejecta results in a
significant underestimate of the early (< 2 days) optical emission. Our model
indicates that NSBH-driven kilonovae may peak up to a week after merger at
optical wavelengths for some observer angles. This demonstrates the need for
early coverage of emergent kilonovae in cases where the GW signal is either
ambiguous or absent; they likely cannot be distinguished from BNS mergers by
the light curves alone from ~2 days after the merger. We also discuss the
detectability of our model kilonovae with the Vera C. Rubin Observatory's
Legacy Survey of Space and Time (LSST).Comment: 14 pages, 6 figures, 2 tables. Accepted for publication in MNRAS.
This is the author's final submitted version. The model code is available
through MOSFiT at https://github.com/guillochon/MOSFiT
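
As a point of reference for the light-curve behaviour described above, a deliberately simple one-zone, r-process-powered bolometric model can be sketched as follows. It is not the MOSFiT NSBH model; the heating normalization, opacity and ejecta parameters are generic fiducial assumptions:

```python
import numpy as np

# One-zone r-process-powered kilonova sketch; fiducial values, not MOSFiT.
DAY = 86400.0          # s
C = 2.998e10           # cm/s
MSUN = 1.989e33        # g

def kilonova_lbol(t_days, m_ej=0.03, v_ej=0.2, kappa=10.0):
    """Bolometric luminosity (erg/s) at times t_days after merger."""
    t = np.asarray(t_days, dtype=float) * DAY
    heating = 1.0e10 * (t / DAY) ** -1.3          # erg/g/s, fiducial heating rate
    t_diff = np.sqrt(kappa * m_ej * MSUN / (13.8 * C * v_ej * C))  # diffusion time
    escape = 1.0 - np.exp(-((t / t_diff) ** 2))   # crude photon-escape factor
    return heating * m_ej * MSUN * escape

print(kilonova_lbol([1.0, 3.0, 7.0]))  # high-opacity ejecta peaks days after merger
```

With the high-opacity (lanthanide-rich) defaults above, the diffusion time works out to several days, so the bolometric peak lands roughly a week after merger, in line with the late optical peaks discussed in the abstract.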
Interplay between distribution of live cells and growth dynamics of solid tumours
Experiments show that simple diffusion of nutrients and waste molecules is not sufficient to explain the typical multilayered structure of solid tumours, where an outer rim of proliferating cells surrounds a layer of quiescent but viable cells and a central necrotic region. These experiments challenge models of tumour growth based exclusively on diffusion. Here we propose a model of tumour growth that incorporates the volume dynamics and the distribution of cells within the viable cell rim. The model is suggested by in silico experiments and is validated using in vitro data. The results correlate with in vivo data as well, and the model can be used to support experimental and clinical oncology.
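
One minimal way to encode the rim-driven growth the authors describe, purely for illustration (the rim thickness and rate below are invented, not the paper's parameters):

```python
import numpy as np

# Spheroid sketch: only a proliferating rim of assumed fixed thickness adds
# volume; the interior is quiescent or necrotic. Illustrative numbers only.

def grow_spheroid(r0=50.0, rim=100.0, k=0.05, days=100.0, dt=0.1):
    """Integrate dV/dt = k * V_rim for a spheroid of radius r (microns)."""
    r = r0
    for _ in range(int(days / dt)):
        v = (4.0 / 3.0) * np.pi * r**3
        core = max(r - rim, 0.0)                     # quiescent + necrotic interior
        v_rim = v - (4.0 / 3.0) * np.pi * core**3    # viable proliferating shell
        v += k * v_rim * dt
        r = (3.0 * v / (4.0 * np.pi)) ** (1.0 / 3.0)
    return r

print(grow_spheroid())  # radial growth slows from exponential to ~linear
```

The sketch reproduces the well-known transition from exponential volume growth to roughly linear radial growth once the spheroid outgrows its proliferating rim.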
Evidence for the Gompertz Curve in the Income Distribution of Brazil 1978-2005
This work presents an empirical study of the evolution of the personal income
distribution in Brazil. Yearly samples available from 1978 to 2005 were studied
and evidence was found that the complementary cumulative distribution of
personal income for the 99% of the population that is economically less
favoured is well represented by a Gompertz curve of the normalized individual
income, while the complementary cumulative distribution of the remaining 1%
richest part of the population is well represented by a Pareto power-law
distribution. This result means that, similarly to other countries, Brazil's
income distribution is characterized by a well-defined two-class system. The
parameters of the Gompertzian and Paretian components were determined by a
mixture of boundary conditions,
normalization and fitting methods for every year in the time span of this
study. Since the Gompertz curve is characteristic of growth models, its
presence here suggests that these patterns in income distribution could be a
consequence of the growth dynamics of the underlying economic system. In
addition, we found that the percentage share of both the Gompertzian and
Paretian components relative to the total income shows an approximate cycling
pattern with a period of about 4 years, in which the maximum and minimum peaks
of each component alternate about every 2 years. This finding suggests that the
growth dynamics of Brazil's economic system may follow Goodwin-type class-model
dynamics based on the application of the
Lotka-Volterra equation to economic growth and cycle. Comment: 22 pages, 15 figures, 4 tables. LaTeX. Accepted for publication in
"The European Physical Journal B".
When the optimal is not the best: parameter estimation in complex biological models
Background: The vast computational resources that became available during the
past decade enabled the development and simulation of increasingly complex
mathematical models of cancer growth. These models typically involve many free
parameters whose determination is a substantial obstacle to model development.
Direct measurement of biochemical parameters in vivo is often difficult and
sometimes impracticable, while fitting them under data-poor conditions may
result in biologically implausible values.
Results: We discuss different methodological approaches to estimate
parameters in complex biological models. We make use of the high computational
power of the Blue Gene technology to perform an extensive study of the
parameter space in a model of avascular tumor growth. We explicitly show that
the landscape of the cost function used to optimize the model to the data has a
very rugged surface in parameter space. This cost function has many local
minima with unrealistic solutions, including the global minimum corresponding
to the best fit.
Conclusions: The case studied in this paper shows one example in which model
parameters that optimally fit the data are not necessarily the best ones from a
biological point of view. To avoid force-fitting a model to a dataset, we
propose that the best model parameters should be found by choosing, among
suboptimal parameters, those that match criteria other than the ones used to
fit the model. We also conclude that the model, data and optimization approach
form a new complex system, and point to the need for a theory that addresses
this problem more generally.
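
The selection strategy proposed in the conclusions can be made concrete with a toy sketch: gather every parameter set whose cost lies within a tolerance of the global minimum, then choose among these suboptimal sets with an independent plausibility criterion. The cost surface and the constraint below are invented stand-ins:

```python
import numpy as np

# Toy rugged-landscape sketch; cost and constraint are illustrative stand-ins.
rng = np.random.default_rng(0)

def cost(theta):
    """Rugged toy cost: many local minima of comparable depth."""
    return np.sum(theta**2) + 2.0 * np.sum(np.sin(5.0 * theta) ** 2)

def plausible(theta):
    """Stand-in for biological constraints, e.g. known parameter ranges."""
    return bool(np.all((theta > -0.5) & (theta < 0.5)))

samples = rng.uniform(-3.0, 3.0, size=(20000, 2))    # brute-force exploration
costs = np.apply_along_axis(cost, 1, samples)
near_optimal = samples[costs < costs.min() + 0.5]    # acceptable fits to the data
accepted = np.array([plausible(t) for t in near_optimal])
print(f"{len(near_optimal)} near-optimal sets, {accepted.sum()} also plausible")
```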
Incorporating prior knowledge improves detection of differences in bacterial growth rate
BACKGROUND: Robust statistical detection of differences in bacterial growth rate can be challenging, particularly when dealing with small differences or noisy data. The Bayesian approach provides a consistent framework for inferring model parameters and comparing hypotheses. The method captures the full uncertainty of parameter values, whilst making effective use of prior knowledge about a given system to improve estimation. RESULTS: We demonstrated the application of Bayesian analysis to bacterial growth curve comparison. Following extensive testing of the method, the analysis was applied to the large dataset of bacterial responses freely available at the web resource ComBase. Detection was found to be improved by using prior knowledge from clusters of previously analysed experimental results obtained under similar environmental conditions. A comparison was also made with a more traditional statistical testing method, the F-test, and the Bayesian analysis was found to perform more conclusively and to be capable of attributing significance to more subtle differences in growth rate. CONCLUSIONS: We have demonstrated that, by making use of existing experimental knowledge, it is possible to significantly improve detection of differences in bacterial growth rate.
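
As an illustration of the core idea, a conjugate Normal update for an exponential-phase growth rate, with the prior centred on previously analysed experiments at similar conditions, can be sketched as follows; this is a simplified stand-in for the paper's method, with invented data, not ComBase values:

```python
import math
import numpy as np

# Growth rate = slope of log counts vs time; Normal prior from past experiments.
def posterior_slope(t, y, prior_mu, prior_sd, noise_sd=0.1):
    """Posterior mean and sd of the slope, centred data, known noise."""
    t = t - t.mean()
    y = y - y.mean()
    prec = 1.0 / prior_sd**2 + np.sum(t**2) / noise_sd**2
    mu = (prior_mu / prior_sd**2 + np.sum(t * y) / noise_sd**2) / prec
    return mu, math.sqrt(1.0 / prec)

rng = np.random.default_rng(1)
t = np.linspace(0.0, 5.0, 8)                       # sampling times (h)
log_a = 0.50 * t + rng.normal(0, 0.1, t.size)      # condition A, rate 0.50/h
log_b = 0.55 * t + rng.normal(0, 0.1, t.size)      # condition B, slightly faster
mu_a, sd_a = posterior_slope(t, log_a, prior_mu=0.5, prior_sd=0.05)
mu_b, sd_b = posterior_slope(t, log_b, prior_mu=0.5, prior_sd=0.05)
z = (mu_b - mu_a) / math.hypot(sd_a, sd_b)
print(f"P(rate_B > rate_A) ~ {0.5 * (1.0 + math.erf(z / math.sqrt(2))):.3f}")
```

An informative prior raises the posterior precision of each slope, which is what lets subtle rate differences reach significance with noisy, sparsely sampled curves.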
Exact Solution of an Evolutionary Model without Ageing
We introduce an age-structured asexual population model containing all the
relevant features of evolutionary ageing theories. Beneficial as well as
deleterious mutations, heredity and arbitrary fecundity are present and managed
by natural selection. An exact solution without ageing is found. We show that
fertility is associated with generalized forms of the Fibonacci sequence, while
mutations and natural selection are merged into an integral equation which is
solved by Fourier series. Average survival probabilities and Malthusian growth
exponents are calculated indicating that the system may exhibit mutational
meltdown. The relevance of the model to groups that reproduce by fission, such
as many protozoa and coelenterates, is discussed. Comment: LaTeX file, 15 pages, 2 ps figures, to appear in Phys. Rev.
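
The tie between fecundity and generalized Fibonacci sequences can be seen in a standard age-structured (Leslie-matrix) setting: with unit fecundity at ages 1 and 2, the population counts obey the Fibonacci recurrence, and the Malthusian growth exponent is the log of the leading eigenvalue. A minimal sketch, omitting the paper's mutations and natural selection:

```python
import numpy as np

def malthusian_exponent(fecundity, survival):
    """Log of the leading eigenvalue of the Leslie matrix."""
    n = len(fecundity)
    L = np.zeros((n, n))
    L[0, :] = fecundity                # births contributed by each age class
    for i in range(n - 1):
        L[i + 1, i] = survival[i]      # ageing with given survival probability
    return np.log(np.max(np.abs(np.linalg.eigvals(L))))

print(malthusian_exponent([1.0, 1.0], [1.0]))            # ln(golden ratio) ~ 0.481
print(malthusian_exponent([0.5, 2.0, 1.0], [0.9, 0.8]))  # arbitrary fecundity
```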
“It’s hard to tell”. The challenges of scoring patients on standardised outcome measures by multidisciplinary teams: a case study of Neurorehabilitation
Background
Interest is increasing in the application of standardised outcome measures in clinical practice. Measures designed for use in research may not be sufficiently precise to be used in monitoring individual patients. However, little is known about how clinicians and in particular, multidisciplinary teams, score patients using these measures. This paper explores the challenges faced by multidisciplinary teams in allocating scores on standardised outcome measures in clinical practice.
Methods
Qualitative case study of an inpatient neurorehabilitation team who routinely collected standardised outcome measures on their patients. Data were collected using non-participant observation, fieldnotes and tape recordings of 16 multidisciplinary team meetings during which the measures were recited and scored. Eleven clinicians from a range of different professions were also interviewed. Data were analysed using grounded theory techniques.
Results
We identified a number of instances where scoring the patient was 'problematic'. In 'problematic' scoring, the scores were uncertain and subject to revision and adjustment. They sometimes required negotiation to agree on a shared understanding of concepts to be measured and the guidelines for scoring. Several factors gave rise to this problematic scoring. Team members' knowledge about patients' problems changed over time so that initial scores had to be revised or dismissed, creating an impression of deterioration when none had occurred. Patients had complex problems which could not easily be distinguished from each other and patients themselves varied in their ability to perform tasks over time and across different settings. Team members from different professions worked with patients in different ways and had different perspectives on patients' problems. This was particularly an issue in the scoring of concepts such as anxiety, depression, orientation, social integration and cognitive problems.
Conclusion
From a psychometric perspective these problems would raise questions about the validity, reliability and responsiveness of the scores. However, from a clinical perspective, such characteristics are an inherent part of clinical judgement and reasoning. It is important to highlight the challenges faced by multidisciplinary teams in scoring patients on standardised outcome measures, but it would be unwarranted to conclude that such challenges imply that these measures should not be used in clinical practice for decision making about individual patients. However, our findings do raise some concerns about the use of such measures for performance management.
Inflammation in sputum relates to progression of disease in subjects with COPD: a prospective descriptive study
BACKGROUND: Inflammation is considered to be of primary pathogenic importance in COPD, but the evidence on which current understanding is based does not distinguish between cause and effect, and no single mechanism can account for the complex pathology. We performed a prospective longitudinal study of subjects with COPD that related markers of sputum inflammation at baseline to subsequent disease progression. METHODS: A cohort of 56 patients with chronic bronchitis was characterized in the stable state at baseline and after an interval of four years, using physiological measures and CT densitometry. Sputum markers of airway inflammation were quantified at baseline from spontaneously produced sputum in a sub-group (n = 38), and inflammation severity was related to subsequent disease progression. RESULTS: Physiological and CT measures indicated disease progression in the whole group. In the sub-group, sputum myeloperoxidase correlated with decline in FEV1 (rs = -0.344, p = 0.019, n = 37). LTB4 and albumin leakage correlated with TLCO decline (rs = -0.310, p = 0.033 and rs = -0.401, p = 0.008, respectively; n = 35), and IL-8 correlated with progression of lung densitometric indices (rs = -0.464, p = 0.005, n = 38). CONCLUSION: The data support a principal causative role for neutrophilic inflammation in the pathogenesis of COPD and suggest that the measurement of sputum inflammatory markers may have a predictive role in clinical practice.
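
For readers reproducing this style of analysis, the quoted rank correlations are Spearman coefficients; a minimal sketch with synthetic stand-ins for a sputum marker and FEV1 decline (the numbers below are simulated, not the study's data):

```python
import numpy as np
from scipy.stats import spearmanr

# Synthetic stand-ins: a lognormal inflammatory marker and a lung-function
# decline that worsens with the marker, plus noise.
rng = np.random.default_rng(2)
marker = rng.lognormal(size=37)                              # e.g. myeloperoxidase
decline = -0.05 * np.log(marker) + rng.normal(0, 0.05, 37)   # faster decline at high marker
rs, p = spearmanr(marker, decline)
print(f"rs = {rs:.3f}, p = {p:.3g}")                         # cf. rs = -0.344, p = 0.019
```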