
    A continuous stirred tank heater simulation model with applications

    This article presents a first-principles simulation of a continuous stirred tank heater pilot plant at the University of Alberta. The model has heat and volumetric balances, and a very realistic feature is that instrument, actuator and process non-linearities have been carefully measured, for instance to take account of the volume occupied by heating coils in the tank. Experimental data from step testing and recordings of real disturbances are presented. The model in Simulink and the experimental data are available electronically, and some suggestions are given for their application in education, system identification, fault detection and diagnosis. © Elsevier
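    To give a concrete sense of what a first-principles stirred tank heater model looks like, the following minimal sketch integrates one volumetric balance and one energy balance with scipy. All parameter values, and the gravity-driven outlet law, are illustrative assumptions, not the University of Alberta pilot plant or the Simulink model described in the article.

    # Minimal continuous stirred tank heater sketch: a level (volumetric) balance
    # and a temperature (energy) balance. Parameters are illustrative only.
    import numpy as np
    from scipy.integrate import solve_ivp

    A = 0.5                      # tank cross-sectional area, m^2 (assumed)
    rho, cp = 1000.0, 4184.0     # water density (kg/m^3) and heat capacity (J/kg/K)
    F_in, T_in = 1.0e-3, 20.0    # inlet flow (m^3/s) and inlet temperature (deg C)
    k_out = 2.0e-3               # outlet flow coefficient (assumed)
    Q = 50.0e3                   # heater power, W (assumed)

    def csth(t, x):
        h, T = x                                  # level (m), temperature (deg C)
        F_out = k_out * np.sqrt(max(h, 0.0))      # gravity-driven outlet flow
        dh = (F_in - F_out) / A                   # volumetric balance
        V = A * h                                 # liquid volume
        dT = F_in * (T_in - T) / V + Q / (rho * cp * V)   # energy balance
        return [dh, dT]

    sol = solve_ivp(csth, (0.0, 3600.0), [0.3, 20.0], max_step=5.0)
    print(f"final level {sol.y[0, -1]:.3f} m, final temperature {sol.y[1, -1]:.1f} C")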

    White cell count in the normal range and short-term and long-term mortality: international comparisons of electronic health record cohorts in England and New Zealand

    OBJECTIVES: Electronic health records offer the opportunity to discover new clinical implications for established blood tests, but international comparisons have been lacking. We tested the association of total white cell count (WBC) with all-cause mortality in England and New Zealand. SETTING: Primary care practices in England (ClinicAl research using LInked Bespoke studies and Electronic health Records (CALIBER)) and New Zealand (PREDICT). DESIGN: Analysis of linked electronic health record data sets: CALIBER (primary care, hospitalisation, mortality and acute coronary syndrome registry) and PREDICT (cardiovascular risk assessments in primary care, hospitalisations, mortality, dispensed medication and laboratory results). PARTICIPANTS: People aged 30-75 years with no prior cardiovascular disease (CALIBER: N=686 475, 92.0% white; PREDICT: N=194 513, 53.5% European, 14.7% Pacific, 13.4% Maori), followed until death, transfer out of practice (in CALIBER) or study end. PRIMARY OUTCOME MEASURE: HRs for mortality were estimated using Cox models adjusted for age, sex, smoking, diabetes, systolic blood pressure, ethnicity and total:high-density lipoprotein (HDL) cholesterol ratio. RESULTS: We found 'J'-shaped associations between WBC and mortality; the second quintile was associated with lowest risk in both cohorts. High WBC within the reference range (8.65-10.05×10⁹/L) was associated with significantly increased mortality compared to the middle quintile (6.25-7.25×10⁹/L); adjusted HR 1.51 (95% CI 1.43 to 1.59) in CALIBER and 1.33 (95% CI 1.06 to 1.65) in PREDICT. WBC outside the reference range was associated with even greater mortality. The association was stronger over the first 6 months of follow-up, but similar across ethnic groups. CONCLUSIONS: Clinically recorded WBC within the range considered 'normal' is associated with mortality in ethnically different populations from two countries, particularly within the first 6 months. Large-scale international comparisons of electronic health record cohorts might yield new insights from widely performed clinical tests. TRIAL REGISTRATION NUMBER: NCT02014610
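    As a rough illustration of the modelling approach described above, the sketch below fits a Cox model with WBC quintiles (middle quintile as the reference) adjusted for a few covariates using the lifelines package. The file name, column names and covariate coding are assumptions, not the CALIBER or PREDICT schemas, and covariates are assumed to be numerically coded.

    # Sketch of a quintile-based, covariate-adjusted Cox model for WBC and mortality.
    import pandas as pd
    from lifelines import CoxPHFitter

    df = pd.read_csv("cohort.csv")    # hypothetical one-row-per-person extract

    # Split WBC into quintiles and drop the middle quintile so hazard ratios
    # for the other quintiles are expressed relative to it.
    df["wbc_q"] = pd.qcut(df["wbc"], 5, labels=["q1", "q2", "q3", "q4", "q5"])
    dummies = pd.get_dummies(df["wbc_q"], prefix="wbc", dtype=float).drop(columns="wbc_q3")

    X = pd.concat(
        [df[["follow_up_years", "died", "age", "sex", "smoking", "diabetes",
             "sbp", "tc_hdl_ratio"]], dummies],
        axis=1,
    )

    cph = CoxPHFitter()
    cph.fit(X, duration_col="follow_up_years", event_col="died")
    print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])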

    Epistasis not needed to explain low dN/dS

    An important question in molecular evolution is whether an amino acid that occurs at a given position makes an independent contribution to fitness, or whether its effect depends on the state of other loci in the organism's genome, a phenomenon known as epistasis. In a recent letter to Nature, Breen et al. (2012) argued that epistasis must be "pervasive throughout protein evolution" because the observed ratio between the per-site rates of non-synonymous and synonymous substitutions (dN/dS) is much lower than would be expected in the absence of epistasis. However, when calculating the expected dN/dS ratio in the absence of epistasis, Breen et al. assumed that all amino acids observed in a protein alignment at any particular position have equal fitness. Here, we relax this unrealistic assumption and show that any dN/dS value can in principle be achieved at a site, without epistasis. Furthermore, for all nuclear and chloroplast genes in the Breen et al. dataset, we show that the observed dN/dS values and the observed patterns of amino acid diversity at each site are jointly consistent with a non-epistatic model of protein evolution. Comment: This manuscript is in response to "Epistasis as the primary factor in molecular evolution" by Breen et al. Nature 490, 535-538 (2012).
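    The core of the argument can be reproduced numerically. The sketch below is an illustrative approximation, not the authors' code: it uses Kimura's fixation probability and an approximate mutation-selection stationary distribution to show that a site with unequal amino acid fitnesses yields dN/dS well below 1 without any epistasis, whereas equal fitnesses (the Breen et al. assumption) give dN/dS of 1. The population size and fitness values are assumed.

    # Illustrative: expected dN/dS at a single site under a non-epistatic
    # mutation-selection model.
    import numpy as np

    def p_fix(s, N=1000):
        # Kimura's fixation probability for a new mutant with selection coefficient s.
        if abs(s) < 1e-12:
            return 1.0 / (2 * N)          # neutral limit
        return (1 - np.exp(-2 * s)) / (1 - np.exp(-4 * N * s))

    def site_dnds(fitness, N=1000):
        # Average substitution rate relative to neutral, over resident amino acids.
        f = np.asarray(fitness, dtype=float)
        pi = np.exp(4 * N * f)            # approximate stationary weights
        pi /= pi.sum()
        neutral = 1.0 / (2 * N)
        k = len(f)
        rate = 0.0
        for a in range(k):
            for b in range(k):
                if a != b:
                    rate += pi[a] * p_fix(f[b] - f[a], N) / neutral / (k - 1)
        return rate

    print(site_dnds([0.0, 0.0, 0.0, 0.0]))           # equal fitnesses: dN/dS = 1
    print(site_dnds([0.0, -0.001, -0.002, -0.003]))  # unequal fitnesses: dN/dS << 1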

    Data Resource Profile: Cardiovascular disease research using linked bespoke studies and electronic health records (CALIBER)

    The goal of cardiovascular disease (CVD) research using linked bespoke studies and electronic health records (CALIBER) is to provide evidence to inform health care and public health policy for CVDs across different stages of translation, from discovery, through evaluation in trials to implementation, where linkages to electronic health records provide new scientific opportunities. The initial approach of the CALIBER programme is characterized as follows: (i) Linkages of multiple electronic health record sources: examples include linkages between the longitudinal primary care data from the Clinical Practice Research Datalink, the national registry of acute coronary syndromes (Myocardial Ischaemia National Audit Project), hospitalization and procedure data from Hospital Episode Statistics and cause-specific mortality and social deprivation data from the Office for National Statistics. Current cohort analyses involve a million people in initially healthy populations and disease registries with ∼10⁵ patients. (ii) Linkages of bespoke investigator-led cohort studies (e.g. UK Biobank) to registry data (e.g. Myocardial Ischaemia National Audit Project), providing new means of ascertaining, validating and phenotyping disease. (iii) A common data model in which routine electronic health record data are made research ready, and sharable, by defining and curating with meta-data >300 variables (categorical, continuous, event) on risk factors, CVDs and non-cardiovascular comorbidities. (iv) Transparency: all CALIBER studies have an analytic protocol registered in the public domain, and data are available (safe haven model) for use subject to approvals. For more information, e-mail [email protected]
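    The linkage step itself is conceptually simple: records from each source are joined on a shared patient identifier. The following sketch shows that idea with pandas; the file names, column names and join keys are assumptions for illustration and do not reflect CALIBER's actual schema or its governance-controlled linkage process.

    # Illustrative linkage of several EHR sources on a shared patient identifier.
    import pandas as pd

    primary_care = pd.read_csv("cprd_primary_care.csv")   # hypothetical extracts
    hospitalisation = pd.read_csv("hes_admissions.csv")
    acs_registry = pd.read_csv("minap_acs.csv")
    mortality = pd.read_csv("ons_mortality.csv")

    cohort = (
        primary_care
        .merge(hospitalisation, on="patient_id", how="left")
        .merge(acs_registry, on="patient_id", how="left")
        .merge(mortality, on="patient_id", how="left")
    )
    print(cohort.shape)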

    Machine learning models in electronic health records can outperform conventional survival models for predicting patient mortality in coronary artery disease

    Prognostic modelling is important in clinical practice and epidemiology for patient management and research. Electronic health records (EHR) provide large quantities of data for such models, but conventional epidemiological approaches require significant researcher time to implement. Expert selection of variables, fine-tuning of variable transformations and interactions, and imputing missing values are time-consuming and could bias subsequent analysis, particularly given that missingness in EHR is high and may itself carry meaning. Using a cohort of 80,000 patients from the CALIBER programme, we compared traditional modelling and machine-learning approaches in EHR. First, we used Cox models and random survival forests with and without imputation on 27 expert-selected, preprocessed variables to predict all-cause mortality. We then used Cox models, random forests and elastic net regression on an extended dataset with 586 variables to build prognostic models and identify novel prognostic factors without prior expert input. We observed that data-driven models used on an extended dataset can outperform conventional models for prognosis, without data preprocessing or imputing missing values. An elastic net Cox regression with 586 unimputed variables, with continuous values discretised, achieved a C-index of 0.801 (bootstrapped 95% CI 0.799 to 0.802), compared to 0.793 (0.791 to 0.794) for a traditional Cox model comprising 27 expert-selected variables with imputation for missing values. We also found that data-driven models allow identification of novel prognostic variables; that the absence of values for particular variables carries meaning, and can have significant implications for prognosis; and that variables often have a nonlinear association with mortality, which discretised Cox models and random forests can elucidate. This demonstrates that machine-learning approaches applied to raw EHR data can be used to build models for use in research and clinical practice, and identify novel predictive variables and their effects to inform future research.
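    Two of the ideas above, treating missingness as informative and discretising continuous values before fitting a penalised Cox model, can be sketched as follows with lifelines. The column names, binning choices and penalty settings are assumptions for illustration, not the paper's pipeline.

    # Illustrative: quartile-binned lab value with an explicit "missing" level,
    # fed into an elastic-net-penalised Cox model.
    import pandas as pd
    from lifelines import CoxPHFitter

    df = pd.read_csv("ehr_cohort.csv")    # hypothetical extract

    # Discretise a continuous predictor and keep absence of a value as its own level,
    # so that missingness can itself carry prognostic information.
    binned = pd.qcut(df["creatinine"], 4, labels=["q1", "q2", "q3", "q4"])
    binned = binned.cat.add_categories("missing").fillna("missing")

    X = pd.concat(
        [df[["follow_up_years", "died"]],
         pd.get_dummies(binned, prefix="creatinine", drop_first=True, dtype=float)],
        axis=1,
    )

    # Elastic net penalty: penalizer sets the strength, l1_ratio mixes L1 and L2.
    cph = CoxPHFitter(penalizer=0.1, l1_ratio=0.5)
    cph.fit(X, duration_col="follow_up_years", event_col="died")
    print(f"apparent C-index: {cph.concordance_index_:.3f}")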

    Prolonged dual anti-platelet therapy in stable coronary disease: a comparative observational study of benefits and harms in unselected versus trial populations

    Objective: To estimate the potential magnitude in unselected patients of the benefits and harms of prolonged dual antiplatelet therapy after acute myocardial infarction seen in selected patients with high risk characteristics in trials. Design: Observational population based cohort study. Setting: PEGASUS-TIMI-54 trial population and CALIBER (ClinicAl research using LInked Bespoke studies and Electronic health Records). Participants: 7238 patients who survived a year or more after acute myocardial infarction. Interventions: Prolonged dual antiplatelet therapy after acute myocardial infarction. Main outcome measures: Recurrent acute myocardial infarction, stroke, or fatal cardiovascular disease. Fatal, severe, or intracranial bleeding. Results: 1676/7238 (23.1%) patients met trial inclusion and exclusion criteria (“target” population). Compared with the placebo arm in the trial population, in the target population the median age was 12 years higher, there were more women (48.6% v 24.3%), and there was a substantially higher cumulative three year risk of both the primary (benefit) trial endpoint of recurrent acute myocardial infarction, stroke, or fatal cardiovascular disease (18.8% (95% confidence interval 16.3% to 21.8%) v 9.04%) and the primary (harm) endpoint of fatal, severe, or intracranial bleeding (3.0% (2.0% to 4.4%) v 1.26% (TIMI major bleeding)). Application of intention to treat relative risks from the trial (ticagrelor 60 mg daily arm) to CALIBER’s target population showed an estimated 101 (95% confidence interval 87 to 117) ischaemic events prevented per 10 000 treated per year and an estimated 75 (50 to 110) excess fatal, severe, or intracranial bleeds caused per 10 000 patients treated per year. Generalisation from CALIBER’s target subgroup to all 7238 real world patients who were stable at least one year after acute myocardial infarction showed similar three year risks of ischaemic events (17.2%, 16.0% to 18.5%), with an estimated 92 (86 to 99) events prevented per 10 000 patients treated per year, and similar three year risks of bleeding events (2.3%, 1.8% to 2.9%), with an estimated 58 (45 to 73) events caused per 10 000 patients treated per year. Conclusions: This novel use of primary-secondary care linked electronic health records allows characterisation of “healthy trial participant” effects and confirms the potential absolute benefits and harms of dual antiplatelet therapy in representative patients a year or more after acute myocardial infarction
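    The headline "events per 10 000 treated per year" figures follow from simple arithmetic: convert a cumulative risk to an approximate annual risk, then apply a trial relative risk. The sketch below illustrates the shape of that calculation; the relative risk used is a hypothetical value, not a figure taken from the trial report, and the paper's actual method may differ.

    # Back-of-the-envelope: events prevented per 10 000 treated per year.
    def events_per_10000_per_year(cum_risk, years, relative_risk):
        annual_risk = 1.0 - (1.0 - cum_risk) ** (1.0 / years)   # assumes a constant hazard
        return 10000.0 * annual_risk * (1.0 - relative_risk)

    # 18.8% three year ischaemic risk from the abstract, hypothetical relative risk 0.84.
    print(round(events_per_10000_per_year(0.188, 3, 0.84)))     # on the order of 100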

    Advanced AODV approach for efficient detection and mitigation of wormhole attack in MANET

    Wireless communication is an inevitable part of the Smart Home domain. A mobile ad hoc network (MANET) is an arrangement of wireless mobile nodes that creates a temporary network for communication. MANETs suffer from both active and passive attacks at all layers of the network model, and the lack of security measures in routing protocols allows attackers to intrude into the network. The wormhole attack is mounted by creating tunnels, and it results in complete disruption of routing paths in a MANET. The proposed security approach detects and mitigates the wormhole attack. It is a secured Ad hoc On-demand Distance Vector (AODV) approach that efficiently finds wormhole attacks present in a MANET, and a digital signature is used to prevent them. The approach is based on calculating the tunneling time taken by a tunnel in order to analyze the behavior of the wormhole, and a static threshold value is then chosen. Based on this tunneling time and threshold value, the approach decides whether a given node is a wormhole node or a trustworthy node. A digital signature and hash chain algorithm is applied to mitigate the wormhole node.
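    The detection rule reduces to a timing check per link. The sketch below illustrates that threshold logic in Python; the record format, threshold value and flagging rule are assumptions for illustration, not the paper's exact algorithm, and the digital signature and hash chain mitigation step is not implemented here.

    # Illustrative wormhole suspicion check: a link whose measured delay per claimed
    # hop greatly exceeds a calibrated threshold is flagged as a candidate tunnel.
    from dataclasses import dataclass

    @dataclass
    class HopRecord:
        src: str
        dst: str
        rtt_ms: float      # measured round-trip time across this link
        hop_count: int     # hops the link claims to span (1 for a normal link)

    THRESHOLD_MS_PER_HOP = 2.0   # static threshold, assumed calibration value

    def suspected_wormholes(records):
        # Return (src, dst) pairs whose per-hop delay exceeds the threshold.
        flagged = []
        for r in records:
            per_hop = r.rtt_ms / max(r.hop_count, 1)
            if per_hop > THRESHOLD_MS_PER_HOP:
                flagged.append((r.src, r.dst))
        return flagged

    links = [HopRecord("A", "B", 1.4, 1), HopRecord("B", "C", 9.8, 1)]
    print(suspected_wormholes(links))    # [('B', 'C')] -> candidate tunnel endpoints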

    Evaluation of iris and iridociliary body lesions with anterior segment optical coherence tomography versus ultrasound B-scan.

    To compare anterior segment optical coherence tomography (AS-OCT) with ultrasound B-scan (USB) in evaluating iris and iridociliary body lesions.