
    The Sydney Triage to Admission Risk Tool (START) to predict Emergency Department Disposition: A derivation and internal validation study using retrospective state-wide data from New South Wales, Australia.

    BACKGROUND: Disposition decisions are critical to the functioning of Emergency Departments. The objectives of the present study were to derive and internally validate a prediction model for inpatient admission from the Emergency Department to assist with triage, patient flow and clinical decision making. METHODS: This was a retrospective analysis of state-wide Emergency Department data in New South Wales, Australia. Adult patients (age ≥ 16 years) were included if they presented to a Level 5 or 6 (tertiary level) Emergency Department in New South Wales, Australia between 2013 and 2014. The outcome of interest was inpatient admission from the Emergency Department. This included all admissions to short-stay and medical assessment units, and transfers to another hospital. Analyses were performed using logistic regression. Discrimination was assessed using the area under the curve, and derived risk scores were plotted to assess calibration. RESULTS: 1,721,294 presentations from twenty-three Level 5 or 6 hospitals were analysed. Of these, 49.38% were male and the mean (SD) age was 49.85 (22.13) years. Level 6 hospitals accounted for 47.70% of cases, and 40.74% of cases were classified as an inpatient admission based on their mode of separation. The final multivariable model, including age, arrival by ambulance, triage category, previous admission and presenting problem, had an AUC of 0.82 (95% CI 0.81, 0.82). CONCLUSION: By deriving and internally validating a risk score model to predict the need for inpatient admission based on basic demographic and triage characteristics, patient flow in the ED, clinical decision making and overall quality of care may be improved. Further studies are now required to establish the clinical effectiveness of this risk score model.
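    The modelling approach described above (logistic regression on triage-time predictors, with discrimination assessed by the area under the ROC curve) can be sketched in miniature. The data, predictors and coefficients below are entirely synthetic stand-ins, not the NSW dataset or the published model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic triage data (hypothetical stand-in for the study's predictors):
# scaled age, arrival by ambulance (0/1), triage category (1-5),
# previous admission (0/1). The "true" weights are illustrative only.
n = 5000
X = np.column_stack([
    rng.normal(50, 22, n) / 100,   # age (scaled)
    rng.integers(0, 2, n),         # arrival by ambulance
    rng.integers(1, 6, n),         # triage category
    rng.integers(0, 2, n),         # previous admission
]).astype(float)
true_w = np.array([1.5, 1.2, -0.8, 0.9])
p = 1 / (1 + np.exp(-(X @ true_w - 0.5)))
y = rng.random(n) < p              # admitted from the ED?

# Fit a logistic regression by gradient descent on the log-loss.
w = np.zeros(X.shape[1]); b = 0.0
for _ in range(2000):
    pred = 1 / (1 + np.exp(-(X @ w + b)))
    w -= 0.5 * (X.T @ (pred - y) / n)
    b -= 0.5 * (pred - y).mean()

# Discrimination: the AUC equals the probability that a randomly chosen
# admitted case scores higher than a randomly chosen non-admitted case.
scores = X @ w + b
pos, neg = scores[y], scores[~y]
auc = (pos[:, None] > neg[None, :]).mean()
print(f"AUC on synthetic data: {auc:.2f}")
```

    Calibration would then be checked, as in the study, by binning the derived risk scores and comparing predicted with observed admission rates per bin.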

    Potential involvement of Brugia malayi cysteine proteases in the maintenance of the endosymbiotic relationship with Wolbachia

    Brugia malayi, a parasitic nematode that causes lymphatic filariasis, harbors endosymbiotic intracellular bacteria, Wolbachia, that are required for the development and reproduction of the worm. The essential nature of this endosymbiosis led to the development of anti-Wolbachia chemotherapeutic approaches for the treatment of human filarial infections. Our study is aimed at identifying specific proteins that play a critical role in this endosymbiotic relationship, leading to the identification of potential targets in the adult worms. Filarial cysteine proteases are known to be involved in molting and embryogenesis, processes shown to also be Wolbachia dependent. Based on the observation that cysteine protease transcripts are differentially regulated in response to tetracycline treatment, we focused on defining their role in symbiosis. We observed a bimodal regulation pattern of transcripts encoding cysteine proteases when in vitro tetracycline-treated worms were examined. Using tetracycline-treated infertile female worms and purified embryos, we established that the first peak of the bimodal pattern corresponds to embryonic transcripts, while the second originates within the hypodermis of the adult worms. Localization studies of the native proteins corresponding to Bm-cpl-3 and Bm-cpl-6 indicate that they are present in the area surrounding Wolbachia, and, in some cases, the proteins appear localized within the bacteria. Both proteins were also found in the inner bodies of microfilariae. The possible role of these cysteine proteases during development and endosymbiosis was further characterized using RNAi. Reduction in Bm-cpl-3 and Bm-cpl-6 transcript levels was accompanied by hindered microfilarial development and release, and reduced Wolbachia DNA levels, making these enzymes strong drug target candidates.

    Clades and clans: a comparison study of two evolutionary models

    The Yule-Harding-Kingman (YHK) model and the proportional to distinguishable arrangements (PDA) model are two binary tree generating models that are widely used in evolutionary biology. Understanding the distributions of clade sizes under these two models provides valuable insights into macro-evolutionary processes, and is important in hypothesis testing and Bayesian analyses in phylogenetics. Here we show that these distributions are log-convex, which implies that very large clades or very small clades are more likely to occur under these two models. Moreover, we prove that there exists a critical value κ(n) for each n ≥ 4 such that, for a given clade with size k, the probability that this clade is contained in a random tree with n leaves generated under the YHK model is higher than that under the PDA model if 1 < k < κ(n), and lower if κ(n) < k < n. Finally, we extend our results to binary unrooted trees, and obtain similar results for the distributions of clan sizes.
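    The two claims in this abstract can be stated symbolically. Writing p_{n,k} for the probability, under either model, that a fixed clade of size k is contained in a random tree with n leaves (a notation assumed here for illustration, not taken from the paper), log-convexity in k and the critical-value comparison read:

```latex
% Log-convexity of the clade-size distribution in k:
p_{n,k}^{2} \;\le\; p_{n,k-1}\, p_{n,k+1}, \qquad 2 \le k \le n-1.

% The comparison between the two models at the critical value \kappa(n):
p^{\mathrm{YHK}}_{n,k} > p^{\mathrm{PDA}}_{n,k} \quad \text{if } 1 < k < \kappa(n),
\qquad
p^{\mathrm{YHK}}_{n,k} < p^{\mathrm{PDA}}_{n,k} \quad \text{if } \kappa(n) < k < n.
```

    Log-convexity implies the sequence is "U-shaped" relative to its geometric means, which is why probability mass concentrates on very small and very large clades.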

    A Self-Reference False Memory Effect in the DRM Paradigm: Evidence from Eastern and Western Samples

    It is well established that processing information in relation to oneself (i.e., self-referencing) leads to better memory for that information than processing the same information in relation to others (i.e., other-referencing). However, it is unknown whether self-referencing also leads to more false memories than other-referencing. In the current two experiments with European and East Asian samples, we presented participants with Deese-Roediger-McDermott (DRM) lists together with their own name or another person's name (i.e., "Trump" in Experiment 1 and "Li Ming" in Experiment 2). We found consistent results across the two experiments; that is, in the self-reference condition, participants had higher true and false memory rates compared to those in the other-reference condition. Moreover, we found that self-referencing did not exhibit a superior mnemonic advantage in terms of net accuracy compared to the other-referencing and neutral conditions. These findings are discussed in terms of theoretical frameworks such as spreading activation theories and fuzzy-trace theory. We propose that our results reflect the adaptive nature of memory, in the sense that cognitive processes that increase mnemonic efficiency may also increase susceptibility to associative false memories.

    Anti-prion drug mPPIg5 inhibits PrP(C) conversion to PrP(Sc).

    Prion diseases, also known as transmissible spongiform encephalopathies, are a group of fatal neurodegenerative diseases that include scrapie in sheep, bovine spongiform encephalopathy (BSE) in cattle and Creutzfeldt-Jakob disease (CJD) in humans. The 'protein only hypothesis' advocates that PrP(Sc), an abnormal isoform of the cellular protein PrP(C), is the main and possibly sole component of prion infectious agents. Currently, no effective therapy exists for these diseases at the symptomatic phase for either humans or animals, though a number of compounds have demonstrated the ability to eliminate PrP(Sc) in cell culture models. Of particular interest are synthetic polymers known as dendrimers, which possess the unique ability to eliminate PrP(Sc) in both intracellular and in vitro settings. The efficacy and mode of action of the novel anti-prion dendrimer mPPIg5 were investigated through the creation of a number of innovative bio-assays based upon the scrapie cell assay. These assays were used to demonstrate that mPPIg5 is a highly effective anti-prion drug which acts, at least in part, through the inhibition of PrP(C) to PrP(Sc) conversion. Understanding how a drug works is a vital component in maximising its performance. By establishing the efficacy and mode of action of mPPIg5, this study will help determine which drugs are most likely to enhance this effect and also aid the design of dendrimers with anti-prion capabilities in the future.

    An evaluation of enteral nutrition practices and nutritional provision in children during the entire length of stay in critical care

    Background: Provision of optimal nutrition in children in critical care is often challenging. This study evaluated exclusive enteral nutrition (EN) provision practices and explored predictors of energy intake and delay of EN advancement in critically ill children.

    Methods: Data on intake and EN practices were collected on a daily basis and compared against predefined targets and dietary reference values in a paediatric intensive care unit. Factors associated with intake and advancement of EN were explored.

    Results: Data were collected from 130 patients and 887 nutritional support days (NSDs). Delay to initiate EN was longer in patients from both the General Surgical and congenital heart defect (CHD) Surgical groups [median (IQR); CHD Surgical group: 20.3 (16.4) vs General Surgical group: 11.4 (53.5) vs Medical group: 6.5 (10.9) hours; p ≤ 0.001]. Daily fasting time per patient was significantly longer in patients from the General Surgical and CHD Surgical groups than in those from the Medical group [% of 24 h, median (IQR); CHD Surgical group: 24.0 (29.2) vs General Surgical group: 41.7 (66.7) vs Medical group: 9.4 (21.9); p ≤ 0.001]. A lower proportion of fluids was delivered as EN per patient (45% vs 73%) or per NSD (56% vs 73%) in those from the CHD Surgical group compared with those with medical conditions. Protein and energy requirements were achieved in 38% and 33% of the NSDs. In a substantial proportion of NSDs, minimum micronutrient recommendations were not met, particularly in patients from the CHD Surgical group. A higher delivery of fluid requirements (p < 0.05) and a greater proportion of these delivered as EN (p < 0.001) were associated with median energy intake during stay and delay of EN advancement. Fasting (31%), fluid restriction for clinical reasons (39%), and procedures requiring feed cessation and establishing EN (22%) were the most common reasons why target energy requirements were not met.

    Conclusions: Provision of optimal EN support remains challenging and varies during hospitalisation and among patients. Delivery of EN should be prioritised over other "non-nutritional" fluids whenever this is possible.

    Practical mammography

    'Digital health' is an overarching concept that currently lacks a theoretical definition and common terminology. For instance, this broad and emerging field includes all of the following terms within its lexicon: mHealth, Wireless Health, Health 2.0, eHealth, e-Patient(s), Healthcare IT/Health IT, Big Data, Health Data, Cloud Computing, Quantified Self, Wearable Computing, Gamification, and Telehealth/Telemedicine [1]. However, whilst a definition is difficult to provide, in this overview it is considered that digital health is the use of digital media to transform the way healthcare provision is conceived and delivered. We consider that it does this through three basic features.

    The thoughtful self

    The relationship between a concept in the external world (e.g., the self) and its representation in cognition

    Effects of gestational age at birth on cognitive performance : a function of cognitive workload demands

    Objective: Cognitive deficits have been inconsistently described for late or moderately preterm children but are consistently found in very preterm children. This study investigates the association between the cognitive workload demands of tasks and cognitive performance in relation to gestational age at birth. Methods: Data were collected as part of a prospective, geographically defined whole-population study of neonatal at-risk children in Southern Bavaria. At 8;5 years, n = 1326 children (gestation range: 23–41 weeks) were assessed with the K-ABC and a Mathematics Test. Results: Cognitive scores of preterm children decreased as the cognitive workload demands of tasks increased. The relationship between gestation and task workload was curvilinear and more pronounced the higher the cognitive workload: GA² (quadratic term) on low cognitive workload: R² = .02, p < 0.001; moderate cognitive workload: R² = .09, p < 0.001; and high cognitive workload tasks: R² = .14, p < 0.001. Specifically, disproportionately lower scores were found for very (<32 weeks gestation) and moderately (32–33 weeks gestation) preterm children the higher the cognitive workload of the tasks. Early biological factors such as gestation and neonatal complications explained more of the variance in high (12.5%) compared with moderate (8.1%) and low cognitive workload tasks (1.7%). Conclusions: The cognitive workload model may help to explain variations in findings on the relationship of gestational age with cognitive performance in the literature. The findings have implications for routine cognitive follow-up, educational intervention, and basic research into neuro-plasticity and brain reorganization after preterm birth.
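    The curvilinear analysis reported above (a quadratic gestational-age term, with fit compared by R²) can be illustrated on synthetic data. The data-generating curve and sample below are invented for the sketch and do not reproduce the Bavarian cohort.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical illustration of the quadratic gestational-age relation:
# scores flatten near term and drop increasingly steeply at lower GA.
ga = rng.uniform(23, 41, 800)                         # gestational age, weeks
score = 100 - 0.15 * (41 - ga) ** 2 + rng.normal(0, 8, 800)

def r_squared(y, yhat):
    """Proportion of variance explained: 1 - SS_res / SS_tot."""
    ss_res = np.sum((y - yhat) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1 - ss_res / ss_tot

# Linear model: score ~ GA
lin = np.polyval(np.polyfit(ga, score, 1), ga)
# Quadratic model: score ~ GA + GA^2 (the GA^2 term of the abstract)
quad = np.polyval(np.polyfit(ga, score, 2), ga)

r2_lin = r_squared(score, lin)
r2_quad = r_squared(score, quad)
# Adding the quadratic term improves the fit, mirroring the
# curvilinear gestation-performance relationship described above.
print(f"R^2 linear: {r2_lin:.3f}, R^2 quadratic: {r2_quad:.3f}")
```

    Because the linear model is nested inside the quadratic one, R² for the quadratic fit can never be lower; the study's question is how much the GA² term adds at each workload level.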