
    Thickening of galactic disks through clustered star formation

    (Abridged) The building blocks of galaxies are star clusters. These form with low star-formation efficiencies and, consequently, lose a large part of their stars, which expand outwards once the residual gas is expelled by the action of the massive stars. Massive star clusters may thus add kinematically hot components to galactic field populations. This kinematical imprint on the stellar distribution function is estimated here by calculating the velocity distribution function for ensembles of star clusters distributed according to power-law or log-normal initial cluster mass functions (ICMFs). The resulting stellar velocity distribution function is non-Gaussian and may be interpreted as being composed of multiple kinematical sub-populations. The notion that the formation of star clusters may add hot kinematical components to a galaxy is applied to the age--velocity-dispersion relation of the Milky Way disk to study the implied history of clustered star formation, with an emphasis on the possible origin of the thick disk. Comment: MNRAS, accepted, 27 pages, 9 figures
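
    A minimal illustrative sketch (in Python) of how a composite, non-Gaussian velocity distribution can arise when stars are released by clusters drawn from a power-law ICMF. The mass range, ICMF slope and the assumed scaling of velocity dispersion with cluster mass are placeholders for illustration, not values from the paper:

        import numpy as np

        rng = np.random.default_rng(0)

        # Illustrative assumptions (not taken from the paper):
        # - cluster masses follow a power-law ICMF dN/dM ~ M**(-beta) on [M_min, M_max]
        # - stars escaping a cluster have a Gaussian velocity distribution whose
        #   dispersion grows with cluster mass as sigma ~ (M / M_min)**(1/3) km/s
        M_min, M_max, beta = 1e2, 1e6, 2.0

        # Inverse-transform sampling of the power-law mass function
        u = rng.uniform(size=10_000)
        masses = (M_min**(1 - beta) + u * (M_max**(1 - beta) - M_min**(1 - beta)))**(1 / (1 - beta))

        # Each cluster contributes escaping stars roughly in proportion to its mass
        velocities = []
        for M in masses:
            sigma = (M / M_min)**(1.0 / 3.0)      # assumed mass--dispersion scaling
            n_stars = min(max(1, int(M / M_min)), 1000)
            velocities.append(rng.normal(0.0, sigma, size=n_stars))
        velocities = np.concatenate(velocities)

        # The mixture of Gaussians of different widths is itself non-Gaussian
        # (positive excess kurtosis indicates heavier-than-Gaussian tails)
        excess_kurtosis = ((velocities - velocities.mean())**4).mean() / velocities.std()**4 - 3.0
        print(f"excess kurtosis of the composite velocity distribution: {excess_kurtosis:.2f}")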

    Bayesian history matching of complex infectious disease models using emulation: A tutorial and a case study on HIV in Uganda

    Advances in scientific computing have allowed the development of complex models that are being routinely applied to problems in disease epidemiology, public health and decision making. The utility of these models depends in part on how well they can reproduce empirical data. However, fitting such models to real-world data is greatly hindered both by large numbers of input and output parameters, and by long run times, such that many modelling studies lack a formal calibration methodology. We present a novel method that has the potential to improve the calibration of complex infectious disease models (hereafter called simulators). We present this in the form of a tutorial and a case study where we history match a dynamic, event-driven, individual-based stochastic HIV simulator, using extensive demographic, behavioural and epidemiological data available from Uganda. The tutorial describes history matching and emulation. History matching is an iterative procedure that reduces the simulator's input space by identifying and discarding areas that are unlikely to provide a good match to the empirical data. History matching relies on the computational efficiency of a Bayesian representation of the simulator, known as an emulator. Emulators mimic the simulator's behaviour, but are often several orders of magnitude faster to evaluate. In the case study, we use a 22-input simulator, fitting its 18 outputs simultaneously. After 9 iterations of history matching, a non-implausible region of the simulator input space was identified that was times smaller than the original input space. Simulator evaluations made within this region were found to have a 65% probability of fitting all 18 outputs. History matching and emulation are useful additions to the toolbox of infectious disease modellers. Further research is required to explicitly address the stochastic nature of the simulator, as well as to account for correlations between outputs.
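
    As a pointer to how the input space gets cut down, here is a minimal sketch (in Python) of the implausibility measure commonly used in history matching, with an emulator's mean and variance standing in for the expensive simulator. The variance terms, the cut-off of 3 and all numbers are illustrative conventions from the history-matching literature, not the settings of this study:

        import numpy as np

        def implausibility(em_mean, em_var, obs, obs_var, discrepancy_var=0.0):
            """Standardised distance between the emulator's expectation at an input
            and the observed value, accounting for emulator uncertainty,
            observation error and (optionally) model discrepancy."""
            return np.abs(em_mean - obs) / np.sqrt(em_var + obs_var + discrepancy_var)

        # Illustrative wave of history matching over three candidate inputs
        em_means = np.array([0.12, 0.35, 0.08])   # emulator predictions
        em_vars  = np.array([0.01, 0.02, 0.005])  # emulator variances
        obs, obs_var = 0.10, 0.004                # empirical target and its variance

        I = implausibility(em_means, em_vars, obs, obs_var)
        non_implausible = I <= 3.0                # inputs with I > 3 are discarded
        print(I.round(2), non_implausible)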

    Kalman tracking of linear predictor and harmonic noise models for noisy speech enhancement

    This paper presents a speech enhancement method based on the tracking and denoising of the formants of a linear prediction (LP) model of the spectral envelope of speech and the parameters of a harmonic noise model (HNM) of its excitation. The main advantages of tracking and denoising the prominent energy contours of speech are the efficient use of the spectral and temporal structures of successive speech frames and a mitigation of the processing artefact known as ‘musical noise’ or ‘musical tones’. The formant-tracking linear prediction (FTLP) model estimation consists of three stages: (a) speech pre-cleaning based on a spectral amplitude estimation, (b) formant tracking across successive speech frames using the Viterbi method, and (c) Kalman filtering of the formant trajectories across successive speech frames. The HNM parameters for the excitation signal comprise the voiced/unvoiced decision, the fundamental frequency, the harmonics’ amplitudes and the variance of the noise component of the excitation. A frequency-domain pitch extraction method is proposed that searches for the peak signal-to-noise ratios (SNRs) at the harmonics. For each speech frame, several pitch candidates are calculated. An estimate of the pitch trajectory across successive frames is obtained using a Viterbi decoder. The trajectories of the noisy excitation harmonics across successive speech frames are modelled and denoised using Kalman filters. The proposed method is used to deconstruct noisy speech, denoise its model parameters and then reconstitute speech from its cleaned parts. Experimental evaluations show the performance gains of the formant tracking, pitch extraction and noise reduction stages.
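
    A minimal sketch (in Python) of the kind of scalar Kalman filter that can smooth a noisy frame-by-frame frequency trajectory, such as a formant or harmonic track, under a simple random-walk state model. The noise variances and the synthetic trajectory are placeholders, not values from the paper:

        import numpy as np

        def kalman_track(noisy_track, process_var=25.0, meas_var=400.0):
            """1-D Kalman filter over a per-frame frequency track (Hz).
            State model: random walk; measurement: noisy frequency estimate."""
            x = noisy_track[0]              # state estimate (frequency)
            P = meas_var                    # state uncertainty
            smoothed = []
            for z in noisy_track:
                P = P + process_var         # predict: uncertainty grows
                K = P / (P + meas_var)      # Kalman gain
                x = x + K * (z - x)         # update with this frame's measurement
                P = (1.0 - K) * P
                smoothed.append(x)
            return np.array(smoothed)

        # Illustrative use: a slowly varying first-formant trajectory observed in noise
        rng = np.random.default_rng(1)
        true_f1 = 500 + 30 * np.sin(np.linspace(0, 2 * np.pi, 100))
        observed = true_f1 + rng.normal(0, 20, size=100)
        print(kalman_track(observed)[:5].round(1))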

    Improving ART programme retention and viral suppression are key to maximising impact of treatment as prevention - a modelling study.

    BACKGROUND: UNAIDS calls for fewer than 500,000 new HIV infections/year by 2020, with treatment-as-prevention being a key part of its strategy for achieving the target. A better understanding of the contribution to transmission of people at different stages of the care pathway can help focus intervention services on populations where they may have the greatest effect. We investigate this using Uganda as a case study. METHODS: An individual-based HIV/ART model was fitted using history matching. 100 model fits were generated to account for uncertainties in sexual behaviour, HIV epidemiology, and ART coverage up to 2015 in Uganda. A number of different ART scale-up intervention scenarios were simulated between 2016 and 2030. The incidence and proportion of transmission over time from people with primary infection, post-primary ART-naïve infection, and people currently or previously on ART were calculated. RESULTS: In all scenarios, the proportion of transmission by ART-naïve people decreases, from 70% (61%-79%) in 2015 to between 23% (15%-40%) and 47% (35%-61%) in 2030. The proportion of transmission by people on ART increases from 7.8% (3.5%-13%) to between 14% (7.0%-24%) and 38% (21%-55%). The proportion of transmission by ART dropouts increases from 22% (15%-33%) to between 31% (23%-43%) and 56% (43%-70%). CONCLUSIONS: People who are currently or previously on ART are likely to play an increasingly large role in transmission as ART coverage increases in Uganda. Improving retention on ART, and ensuring that people on ART remain virally suppressed, will be key to reducing HIV incidence in Uganda.

    Learning of model discrepancy for structural dynamics applications using Bayesian history matching

    Calibration of computer models for structural dynamics is often an important task in creating valid predictions that match observational data. However, calibration alone will lead to biased estimates of system parameters when a mechanism for model discrepancy is not included. Model discrepancy is defined as the mismatch between observational data and the model when the 'true' parameters are known; it arises from the absence and/or simplification of certain physics in the computer model. Bayesian History Matching (BHM) is a 'likelihood-free' method for obtaining calibrated outputs whilst accounting for model discrepancy, typically via an additional variance term. The approach assesses the input space, using an emulator of the complex computer model, and identifies parameter sets that could plausibly have generated the target outputs. In this paper, a more informative methodology is outlined in which the functional form of the model discrepancy is inferred, improving predictive performance. The algorithm is applied to a case study of a representative five-storey building structure with the objective of calibrating outputs of a finite element (FE) model. The results are discussed with appropriate validation metrics that consider the complete distribution.
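
    A minimal sketch (in Python, using scikit-learn as a stand-in) of one way a functional model discrepancy can be learned rather than treated as a constant variance term: fit a Gaussian process to the residuals between the calibrated model's outputs and the observations, so the discrepancy can be predicted and added back to the model. The toy data and kernel settings are assumptions for illustration, not the paper's case study:

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, WhiteKernel

        # Toy stand-ins for model outputs and observations over some index
        # (e.g. frequency lines or sensor locations)
        x = np.linspace(0.0, 1.0, 30).reshape(-1, 1)
        model_output = np.sin(4 * x).ravel()
        observations = model_output + 0.3 * x.ravel()**2 \
                       + 0.02 * np.random.default_rng(2).normal(size=30)

        # Learn the discrepancy delta(x) = observations - model_output as a function
        residuals = observations - model_output
        gp = GaussianProcessRegressor(kernel=RBF(0.2) + WhiteKernel(1e-3),
                                      normalize_y=True).fit(x, residuals)

        delta_mean, delta_std = gp.predict(x, return_std=True)
        corrected = model_output + delta_mean     # model plus learned discrepancy
        print(np.abs(observations - corrected).max(), delta_std.max())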

    Nitrosative and Oxidative Stresses Contribute to Post-Ischemic Liver Injury Following Severe Hemorrhagic Shock: The Role of Hypoxemic Resuscitation

    Purpose: Hemorrhagic shock and resuscitation is frequently associated with liver ischemia-reperfusion injury. The aim of the study was to investigate whether hypoxemic resuscitation attenuates liver injury. Methods: Anesthetized, mechanically ventilated New Zealand white rabbits were exsanguinated to a mean arterial pressure of 30 mmHg for 60 minutes. Resuscitation under normoxemia (Normox-Res group, n = 16, PaO2 = 95–105 mmHg) or hypoxemia (Hypox-Res group, n = 15, PaO2 = 35–40 mmHg) followed, modifying the FiO2. Animals not subjected to shock constituted the sham group (n = 11, PaO2 = 95–105 mmHg). Indices of the inflammatory, oxidative and nitrosative response were measured, and histopathological and immunohistochemical studies of the liver were performed. Results: Normox-Res group animals exhibited increased serum alanine aminotransferase, tumor necrosis factor-alpha, interleukin (IL)-1β and IL-6 levels compared with the Hypox-Res and sham groups. Reactive oxygen species generation, malondialdehyde formation and myeloperoxidase activity were all elevated in Normox-Res rabbits compared with the Hypox-Res and sham groups. Similarly, endothelial NO synthase and inducible NO synthase mRNA expression was up-regulated and nitrotyrosine immunostaining increased in animals resuscitated normoxemically, indicating a more intense nitrosative stress. Hypox-Res animals demonstrated a less prominent histopathologic injury, which was similar to that of sham animals. Conclusions: Hypoxemic resuscitation prevents liver reperfusion injury through attenuation of the inflammatory response.

    AutoEPG: software for the analysis of electrical activity in the microcircuit underpinning feeding behaviour of Caenorhabditis elegans

    Background: The pharyngeal microcircuit of the nematode Caenorhabditis elegans serves as a model for analysing neural network activity and is amenable to electrophysiological recording techniques. One such technique is the electropharyngeogram (EPG), which has provided insight into the genetic basis of feeding behaviour, neurotransmission and muscle excitability. However, the detailed manual analysis of the digital recordings necessary to identify subtle differences in activity that reflect modulatory changes within the underlying network is time-consuming and low-throughput. To address this we have developed an automated system for the high-throughput and discrete analysis of EPG recordings (AutoEPG). Methodology/Principal Findings: AutoEPG employs a tailor-made signal processing algorithm that automatically detects different features of the EPG signal, including those that report on the relaxation and contraction of the muscle and on neuronal activity. Manual verification of the detection algorithm has demonstrated that AutoEPG is capable of very high levels of accuracy. We have further validated the software by analysing existing mutant strains with known pharyngeal phenotypes detectable by the EPG. In doing so, we have more precisely defined an evolutionarily conserved role for the calcium-dependent potassium channel, SLO-1, in modulating the rhythmic activity of neural networks. Conclusions/Significance: AutoEPG enables the consistent analysis of EPG recordings, significantly increases analysis throughput and allows the robust identification of subtle changes in the electrical activity of the pharyngeal nervous system. It is anticipated that AutoEPG will further add to the experimental tractability of the C. elegans pharynx as a model neural circuit.
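
    A minimal sketch (in Python) of the general idea behind automated EPG event detection, using simple threshold-based peak detection on a synthetic trace; this is not the AutoEPG algorithm, and the sampling rate, spike shapes and thresholds are placeholders:

        import numpy as np
        from scipy.signal import find_peaks

        rng = np.random.default_rng(3)

        # Synthetic stand-in for a digitised EPG trace: positive "E" spikes at pump
        # onset and negative "R" spikes at relaxation, on top of baseline noise
        fs = 2000                                   # samples per second (assumed)
        t = np.arange(0, 10, 1 / fs)
        trace = 0.02 * rng.normal(size=t.size)
        for onset in np.arange(0.5, 9.5, 1.0):      # roughly one pump per second
            i = int(onset * fs)
            trace[i] += 1.0                         # excitation (E) spike
            trace[i + int(0.15 * fs)] -= 0.8        # relaxation (R) spike ~150 ms later

        # Detect positive and negative spikes separately with amplitude thresholds
        e_peaks, _ = find_peaks(trace, height=0.5, distance=int(0.2 * fs))
        r_peaks, _ = find_peaks(-trace, height=0.4, distance=int(0.2 * fs))

        n = min(len(e_peaks), len(r_peaks))
        pump_durations = t[r_peaks[:n]] - t[e_peaks[:n]]
        print(len(e_peaks), "E spikes,", len(r_peaks), "R spikes,",
              "mean E-to-R interval:", round(float(pump_durations.mean()), 3), "s")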

    Reducing the environmental impact of surgery on a global scale: systematic review and co-prioritization with healthcare workers in 132 countries

    Background: Healthcare cannot achieve net-zero carbon without addressing operating theatres. The aim of this study was to prioritize feasible interventions to reduce the environmental impact of operating theatres. Methods: This study adopted a four-phase Delphi consensus co-prioritization methodology. In phase 1, a systematic review of published interventions and a global consultation of perioperative healthcare professionals were used to longlist interventions. In phase 2, iterative thematic analysis consolidated comparable interventions into a shortlist. In phase 3, the shortlist was co-prioritized based on patient and clinician views on acceptability, feasibility, and safety. In phase 4, ranked lists of interventions were presented by their relevance to high-income countries and low–middle-income countries. Results: In phase 1, 43 interventions were identified, which had low uptake in practice according to 3042 professionals globally. In phase 2, a shortlist of 15 intervention domains was generated. In phase 3, interventions were deemed acceptable for more than 90 per cent of patients, except for reducing general anaesthesia (84 per cent) and re-sterilization of ‘single-use’ consumables (86 per cent). In phase 4, the top three shortlisted interventions for high-income countries were: introducing recycling; reducing use of anaesthetic gases; and appropriate clinical waste processing. The top three shortlisted interventions for low–middle-income countries were: introducing reusable surgical devices; reducing use of consumables; and reducing the use of general anaesthesia. Conclusion: This is a step toward environmentally sustainable operating environments, with actionable interventions applicable to both high-income and low–middle-income countries.

    The effect of the nugget on Gaussian process emulators of computer models

    The effect of a Gaussian process parameter known as the nugget on the development of computer model emulators is investigated. The presence of the nugget results in an emulator that does not interpolate the data and attaches a non-zero uncertainty bound around them. The limits of this approximation are investigated theoretically, and it is shown that they can be as large as those of a least squares model with the same regression functions as the emulator, regardless of the nugget’s value. The likelihood of the correlation function parameters is also studied and two mode types are identified. Type I modes are characterised by an approximation error that is a function of the nugget and can therefore become arbitrarily small, effectively yielding an interpolating emulator. Type II modes result in emulators with a constant approximation error. Apart from a theoretical investigation of the limits of the approximation error, a practical method for automatically imposing restrictions on its extent is introduced. This is achieved by means of a penalty term that is added to the likelihood function and controls the amount of unexplainable variability in the computer model. The main findings are illustrated on data from an Energy Balance climate model.
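
    A minimal sketch (in Python, with scikit-learn standing in for the emulation framework) of the nugget's basic effect: with a near-zero nugget the Gaussian process interpolates the training runs, whereas adding a white-noise (nugget) term makes it smooth through them and keep a non-zero uncertainty bound at the design points. Kernel settings and data are placeholders:

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, WhiteKernel

        # Training runs of a toy "computer model"
        X = np.linspace(0, 1, 8).reshape(-1, 1)
        y = np.sin(6 * X).ravel()

        # Hyperparameters are held fixed (optimizer=None) to keep the comparison simple
        gp_interp = GaussianProcessRegressor(kernel=RBF(0.2),
                                             optimizer=None).fit(X, y)
        gp_nugget = GaussianProcessRegressor(kernel=RBF(0.2) + WhiteKernel(0.05),
                                             optimizer=None).fit(X, y)

        for name, gp in [("no nugget", gp_interp), ("with nugget", gp_nugget)]:
            mean, std = gp.predict(X, return_std=True)
            print(f"{name:12s} max |residual| at runs = {np.abs(mean - y).max():.2e}, "
                  f"max predictive std at runs = {std.max():.2e}")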