
    Antimicrobial resistance in Mycoplasma genitalium sampled from the British general population

    Background: Mycoplasma genitalium is a common sexually transmitted infection. Treatment guidelines focus on those with symptoms and sexual contacts, generally with regimens including doxycycline and/or azithromycin as first-line and moxifloxacin as second-line treatment. We investigated the prevalence of antimicrobial resistance (AMR)-conferring mutations in M. genitalium among the sexually active British general population.

    Methods: The third National Survey of Sexual Attitudes and Lifestyles (Natsal-3) is a probability sample survey of 15 162 men and women aged 16–74 years in Britain, conducted during 2010–12. Urine test results for M. genitalium were available for 4507 participants aged 16–44 years reporting >1 lifetime sexual partner. In this study, we sequenced regions of the 23S rRNA and parC genes to detect known genotypic determinants of resistance to macrolides and fluoroquinolones, respectively.

    Results: 94% (66/70) of specimens were reconfirmed as M. genitalium positive, with successful sequencing in 85% (56/66) for the 23S rRNA gene and 92% (61/66) for the parC gene. Mutations in the 23S rRNA gene (position A2058/A2059) were detected in 16.1% (95% CI 8.6% to 27.8%) and in parC (encoding ParC D87N/D87Y) in 3.3% (0.9% to 11.2%). Macrolide resistance was more likely in participants reporting STI diagnoses (past 5 years) (44.4% (18.9% to 73.3%) vs 10.6% (4.6% to 22.6%); p=0.029) or sexual health clinic attendance (past year) (43.8% (23.1% to 66.8%) vs 5.0% (1.4% to 16.5%); p=0.001). All 11 participants with AMR-conferring mutations had attended sexual health clinics (past 5 years), but none reported recent symptoms.

    Conclusions: This study highlights challenges in M. genitalium management and control. Macrolide resistance was present in one in six specimens from the general population in 2010–12, but no participants with AMR M. genitalium reported symptoms. Given anticipated increases in diagnostic testing, new strategies including novel antimicrobials, AMR-guided therapy, and surveillance of AMR and treatment failure are recommended.
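For context, the macrolide-resistance estimate above corresponds to 9 of the 56 successfully sequenced specimens (9/56 ≈ 16.1%). A minimal sketch of an interval for such a proportion, using the Wilson score method (the abstract does not state the interval method used; the reported 8.6% to 27.8% CI may come from a different, e.g. survey-weighted, calculation):

```python
from math import sqrt

def wilson_ci(k, n, z=1.96):
    """95% Wilson score interval for a binomial proportion k/n."""
    p = k / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

# 9 of the 56 successfully sequenced specimens carried 23S rRNA mutations
lo, hi = wilson_ci(9, 56)
print(f"{9/56:.1%} (95% CI {lo:.1%} to {hi:.1%})")
```

The Wilson interval reproduces the reported range closely, which is expected for small counts where the naive Wald interval would be misleadingly narrow.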

    Sequential Quasi-Monte Carlo

    We derive and study SQMC (Sequential Quasi-Monte Carlo), a class of algorithms obtained by introducing QMC point sets in particle filtering. SQMC is related to, and may be seen as an extension of, the array-RQMC algorithm of L'Ecuyer et al. (2006). The complexity of SQMC is O(N log N), where N is the number of simulations at each iteration, and its error rate is smaller than the Monte Carlo rate O_P(N^{-1/2}). The only requirement to implement SQMC is the ability to write the simulation of particle x_t^n given x_{t-1}^n as a deterministic function of x_{t-1}^n and a fixed number of uniform variates. We show that SQMC is amenable to the same extensions as standard SMC, such as forward smoothing, backward smoothing, unbiased likelihood evaluation, and so on. In particular, SQMC may replace SMC within a PMCMC (particle Markov chain Monte Carlo) algorithm. We establish several convergence results and provide numerical evidence that SQMC may significantly outperform SMC in practical scenarios.
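The implementability requirement — writing the transition of particle x_t^n as a deterministic function of x_{t-1}^n and uniform variates — can be sketched with a hypothetical 1-D Gaussian random walk and a base-2 van der Corput sequence standing in for a proper randomised QMC point set (illustrative only, not the paper's construction):

```python
from statistics import NormalDist

# SQMC's implementability requirement: express the Markov transition as a
# deterministic map of the previous state and uniform variates.
# Hypothetical example: a 1-D Gaussian random walk via the inverse CDF.
def transition(x_prev, u, sigma=1.0):
    return x_prev + sigma * NormalDist().inv_cdf(u)

# Plain SMC would feed in pseudo-random uniforms; SQMC substitutes a
# (randomised) QMC point set. The base-2 van der Corput sequence is the
# simplest 1-D low-discrepancy stand-in.
def van_der_corput(n, base=2):
    q, bk = 0.0, 1.0 / base
    while n > 0:
        n, rem = divmod(n, base)
        q += rem * bk
        bk /= base
    return q

points = [van_der_corput(i + 1) for i in range(8)]   # evenly stratified in (0, 1)
particles = [transition(0.0, u) for u in points]     # one propagation step
```

Because the uniforms enter only through a deterministic map, swapping the pseudo-random source for a QMC point set changes nothing else in the filter's code.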

    Model for resource allocation in decentralized networks using Interaction nets

    This article presents a model for allocating resources using Interaction Nets, together with a public goods game strategy. The description first shows how resources are allocated to nodes depending on the utility of the network and the satisfaction of the agents. The model is then generalised with Interaction Nets, and its behaviour is simulated. We find that an emergent behaviour condition arises in the dynamics of the interaction when resources are assigned. To test the model, we apply it to the interaction of sharing Internet access in an ad hoc network, and show this interaction within the general model obtained.
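As a rough illustration of the public goods setting the model draws on, here is one round of a standard linear public goods game (the endowment and multiplier values are illustrative, not taken from the article):

```python
# One round of a standard linear public goods game: each agent contributes
# part of an endowment to a pool; the pool is multiplied and shared equally.
# Endowment and multiplier values are illustrative, not from the article.
ENDOWMENT = 10.0

def public_goods_round(contributions, multiplier=1.6):
    share = sum(contributions) * multiplier / len(contributions)
    return [ENDOWMENT - c + share for c in contributions]

payoffs = public_goods_round([10, 5, 0, 0])
# free riders (contributing 0) end the round with the highest payoff,
# which is the dilemma a resource-allocation strategy must manage
```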

    Chaste: an open source C++ library for computational physiology and biology

    Chaste (Cancer, Heart And Soft Tissue Environment) is an open source C++ library for the computational simulation of mathematical models developed for physiology and biology. Code development has been driven by two initial applications: cardiac electrophysiology and cancer development. A large number of cardiac electrophysiology studies have been enabled and performed, including high performance computational investigations of defibrillation on realistic human cardiac geometries. New models for the initiation and growth of tumours have been developed. In particular, cell-based simulations have provided novel insight into the role of stem cells in the colorectal crypt. Chaste is constantly evolving and is now being applied to a far wider range of problems. The code provides modules for handling common scientific computing components, such as meshes and solvers for ordinary and partial differential equations (ODEs/PDEs). Re-use of these components avoids the need for researchers to "re-invent the wheel" with each new project, accelerating the rate of progress in new applications. Chaste is developed using industrially-derived techniques, in particular test-driven development, to ensure code quality, re-use and reliability. In this article we provide examples that illustrate the types of problems Chaste can be used to solve, which can be run on a desktop computer. We highlight some scientific studies that have used or are using Chaste, and the insights they have provided. The source code, both for specific releases and the development version, is available to download under an open source Berkeley Software Distribution (BSD) licence at http://www.cs.ox.ac.uk/chaste, together with details of a mailing list and links to documentation and tutorials.
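Chaste itself is C++, but the flavour of the cell models its cardiac module integrates can be sketched language-neutrally. Below is a minimal forward-Euler integration of the FitzHugh–Nagumo model, a standard two-variable caricature of cardiac excitability (illustrative only; this is not Chaste's API, and Chaste couples far more detailed ionic models to tissue-level PDEs):

```python
# Forward-Euler integration of the FitzHugh-Nagumo model, a two-variable
# caricature of cardiac excitability (illustrative only; not Chaste's API).
def fitzhugh_nagumo(v, w, i_stim, a=0.7, b=0.8, tau=12.5):
    dv = v - v**3 / 3 - w + i_stim     # fast voltage-like variable
    dw = (v + a - b * w) / tau         # slow recovery variable
    return dv, dw

v, w = -1.2, -0.625                    # approximate resting state
dt, i_stim = 0.01, 0.5                 # sustained supra-threshold current
trace = []
for _ in range(5000):
    dv, dw = fitzhugh_nagumo(v, w, i_stim)
    v, w = v + dt * dv, w + dt * dw
    trace.append(v)
# under this sustained stimulus the model fires repeated
# action-potential-like excursions of v
```

In a library like Chaste, an ODE cell model of this kind is one reusable module among many; the same solver infrastructure serves both the cardiac and cell-based applications.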

    Assessing the value of intangible benefits of property level flood risk adaptation (PLFRA) measures

    Studies in the UK and elsewhere have identified that flooding can result in diverse impacts, ranging from significant financial (tangible) costs to social (intangible) impacts on households. At the same time, it is now clear that large-scale flood defence schemes are not a panacea for flood risk, and there is increasing responsibility on property owners to protect their own properties. Hence, there is an emerging expectation for homeowners to invest in property level flood risk adaptation (PLFRA) measures to protect their properties. To date, however, the uptake of such measures has remained very low. The tangible financial benefits of investing in PLFRA measures are generally well understood and have been shown to be cost beneficial for many properties at risk of frequent flooding. Importantly, these estimates tend to take little account of the value of the intangible benefits of PLFRA measures and may therefore underestimate their full benefits. There remains a need to develop an improved understanding of these intangible benefits, and this research sets out to bridge that knowledge gap. Based on a synthesis of the literature, the contingent valuation method was selected as a means of valuing the intangible impacts of flooding on households. A questionnaire survey of homeowners affected by the 2007 flooding was employed to elicit willingness to pay (WTP) values to avoid the intangible impacts of flooding on their households. Analysis of the survey data revealed that the average WTP per household per year to avoid intangible flood impacts was £653. This represents the value of the intangible benefits of investing in PLFRA measures and is significantly higher than previously estimated.
    This research builds on previous work in suggesting a higher value for the intangible impacts of flooding on households, by assessing a wider range of intangible impacts and focusing on individuals with more flood experience. Furthermore, the research indicates that the factors that principally influenced the WTP values were the stress of flooding, worry about loss of house value, worry about future flooding, and the age of respondents, with income showing only a weak correlation. The establishment of a new value for the intangible impacts of flooding on households in the UK is helpful in the domain of flood risk management when evaluating the total benefits (tangible and intangible) of investing in flood protection measures, thus providing a more robust basis for decision-making on flood adaptation measures at the individual property level.

    Estimates of live-tree carbon stores in the Pacific Northwest are sensitive to model selection

    Background: Estimates of live-tree carbon stores are influenced by numerous uncertainties. One of them is model-selection uncertainty: one must choose among multiple empirical equations and conversion factors that can be plausibly justified as locally applicable when calculating the carbon store from inventory measurements such as tree height and diameter at breast height (DBH). Here we quantify the model-selection uncertainty for the five most numerous tree species in six counties of northwest Oregon, USA.

    Results: Our results demonstrate that model-selection error may introduce 20 to 40% uncertainty into a live-tree carbon estimate, possibly making this form of error the largest source of uncertainty in the estimation of live-tree carbon stores. The effect of model selection could be even greater if models are applied beyond the height and DBH ranges for which they were developed.

    Conclusions: Model-selection uncertainty is potentially large enough to limit the ability to track forest carbon with the precision and accuracy required by carbon accounting protocols. Without local validation based on detailed measurements of (usually destructively sampled) trees, it is very difficult to choose the best model when several are available. Our analysis suggests that considering tree form during equation selection may better match trees to existing equations, and that substantial gaps exist, in terms of both species and diameter ranges, that are ripe for new model-building effort.
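The kind of divergence described above can be made concrete with two hypothetical allometric equations applied to the same tree (the functional forms are common in the literature, but the coefficients below are invented for illustration, not drawn from the study's equation sets):

```python
from math import exp, log

# Two hypothetical allometric biomass equations (coefficients invented for
# illustration) applied to the same tree. DBH in cm, height in m, biomass in kg.
def biomass_dbh_only(dbh):
    # ln(B) = a + b * ln(DBH), a common regional form
    return exp(-2.0 + 2.4 * log(dbh))

def biomass_dbh_height(dbh, height):
    # volume-style form: biomass proportional to DBH^2 * height
    return 0.015 * dbh**2 * height

dbh, height = 50.0, 30.0
carbon_fraction = 0.5                     # standard biomass-to-carbon factor
c1 = carbon_fraction * biomass_dbh_only(dbh)
c2 = carbon_fraction * biomass_dbh_height(dbh, height)
spread = abs(c1 - c2) / ((c1 + c2) / 2)   # relative disagreement, ~36% here
```

Both equations are defensible choices for an inventory analyst, yet their carbon estimates for this single tree differ by roughly a third, which is exactly the 20 to 40% model-selection effect the study reports.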

    Measurements of fiducial and differential cross sections for Higgs boson production in the diphoton decay channel at √s = 8 TeV with ATLAS

    Measurements of fiducial and differential cross sections are presented for Higgs boson production in proton-proton collisions at a centre-of-mass energy of √s = 8 TeV. The analysis is performed in the H → γγ decay channel using 20.3 fb⁻¹ of data recorded by the ATLAS experiment at the CERN Large Hadron Collider. The signal is extracted using a fit to the diphoton invariant mass spectrum, assuming that the width of the resonance is much smaller than the experimental resolution. The signal yields are corrected for the effects of detector inefficiency and resolution. The pp → H → γγ fiducial cross section is measured to be 43.2 ± 9.4 (stat.) +3.2 −2.9 (syst.) ± 1.2 (lumi.) fb for a Higgs boson of mass 125.4 GeV decaying to two isolated photons that have transverse momentum greater than 35% and 25% of the diphoton invariant mass and each with absolute pseudorapidity less than 2.37. Four additional fiducial cross sections and two cross-section limits are presented in phase-space regions that test the theoretical modelling of different Higgs boson production mechanisms, or are sensitive to physics beyond the Standard Model. Differential cross sections are also presented as a function of variables related to the diphoton kinematics and the jet activity produced in the Higgs boson events. The observed spectra are statistically limited but broadly in line with theoretical expectations.

    Search for squarks and gluinos in events with isolated leptons, jets and missing transverse momentum at √s = 8 TeV with the ATLAS detector

    The results of a search for supersymmetry in final states containing at least one isolated lepton (electron or muon), jets and large missing transverse momentum with the ATLAS detector at the Large Hadron Collider are reported. The search is based on proton-proton collision data at a centre-of-mass energy of √s = 8 TeV collected in 2012, corresponding to an integrated luminosity of 20 fb⁻¹. No significant excess above the Standard Model expectation is observed. Limits are set on supersymmetric particle masses for various supersymmetric models. Depending on the model, the search excludes gluino masses up to 1.32 TeV and squark masses up to 840 GeV. Limits are also set on the parameters of a minimal universal extra dimension model, excluding a compactification scale of 1/R_c = 950 GeV for a cut-off scale times radius (ΛR_c) of approximately 30.