
    Assessment of a self-reported Drinks Diary for the estimation of drinks intake by care home residents: Fluid Intake Study in the Elderly (FISE)

    Objectives: We evaluated the accuracy of a newly developed self-completed Drinks Diary in care home residents and compared it with direct observation and fluid intake charts. Design: Observational study. Setting: Residential care homes in Norfolk, UK. Participants: 22 elderly people (18 women, mean age 86.6 years, SD 8.6; 12 with MMSE scores <27). Measurements: Participants recorded their own drinks intake over 24 hours using the Drinks Diary while care staff used the homes’ usual fluid intake chart to record drinks intake. These records were compared with drinks intake assessed by researcher direct observation (the reference method) during waking hours (6am to 10pm), while drinks taken from 10pm to 6am were self-reported and checked with staff. Results: Drinks intake assessed by the Drinks Diary was highly correlated with researcher direct observation (Pearson correlation coefficient r=0.93, p<0.001, mean difference -163ml/day), while few staff-completed fluid charts were returned and correlation was low (r=0.122, p=0.818, mean difference 702ml/day). The Drinks Diary correctly classified 19 of 22 participants as drinking enough or not, using both the European Food Safety Authority and US recommendations. Conclusion: The Drinks Diary estimate of drinks intake was comparable with direct observation and more accurate (and more reliably completed) than staff records. The Drinks Diary can provide a reliable estimate of drinks intake in elderly care home residents physically and cognitively able to complete it. It may be useful for researchers, care staff and practitioners needing to monitor drinks intake of elderly people, to help them avoid dehydration.
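    The agreement statistics reported above (Pearson's r between the diary and direct observation, and the mean difference in ml/day) can be sketched as follows. The intake values below are hypothetical illustrations, not the study's data.

    ```python
    # Sketch: agreement between a self-report method and a reference method,
    # as in the Drinks Diary validation. All values are hypothetical.
    import math

    def pearson_r(x, y):
        """Pearson correlation coefficient of two equal-length sequences."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = math.sqrt(sum((a - mx) ** 2 for a in x))
        sy = math.sqrt(sum((b - my) ** 2 for b in y))
        return cov / (sx * sy)

    def mean_difference(method, reference):
        """Mean of (method - reference); negative means under-reporting."""
        return sum(m - r for m, r in zip(method, reference)) / len(method)

    diary = [1450, 1600, 1200, 1750, 1380]      # ml/day, hypothetical diary records
    observed = [1500, 1720, 1350, 1900, 1540]   # ml/day, hypothetical observations

    r = pearson_r(diary, observed)              # high r: methods rank residents alike
    diff = mean_difference(diary, observed)     # negative: diary slightly under-reports
    ```

    A high r with a modest negative mean difference is the pattern the study reports for the diary, whereas the staff charts showed low correlation and a large positive difference.
    
    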

    Temporal and Geographic variation in the validity and internal consistency of the Nursing Home Resident Assessment Minimum Data Set 2.0

    Background: The Minimum Data Set (MDS) for nursing home resident assessment has been required in all U.S. nursing homes since 1990 and has been universally computerized since 1998. Initially intended to structure clinical care planning, uses of the MDS expanded to include policy applications such as case-mix reimbursement, quality monitoring and research. The purpose of this paper is to summarize a series of analyses examining the internal consistency and predictive validity of the MDS data as used in the "real world" in all U.S. nursing homes between 1999 and 2007. Methods: We used person-level linked MDS and Medicare denominator and all institutional claim files, including inpatient (hospital and skilled nursing facilities), for all Medicare fee-for-service beneficiaries entering U.S. nursing homes during the period 1999 to 2007. We calculated the sensitivity and positive predictive value (PPV) of diagnoses taken from Medicare hospital claims and from the MDS among all new admissions from hospitals to nursing homes, and the internal consistency (alpha reliability) of pairs of items within the MDS that logically should be related. We also tested the internal consistency of commonly used MDS-based multi-item scales and examined the predictive validity of an MDS-based severity measure for one-year survival. Finally, we examined the correspondence of the MDS discharge record to hospitalizations and deaths seen in Medicare claims, and the completeness of MDS assessments upon skilled nursing facility (SNF) admission. Results: Each year there were some 800,000 new admissions directly from hospital to US nursing homes and some 900,000 uninterrupted SNF stays.
    Comparing Medicare enrollment records and claims with MDS records revealed reasonably good correspondence that improved over time (by 2006 only 3% of deaths had no MDS discharge record and only 5% of SNF stays had no MDS, but over 20% of MDS discharges indicating hospitalization had no associated Medicare claim). The PPV and sensitivity levels of Medicare hospital diagnoses and MDS-based diagnoses were between .6 and .7 for major diagnoses like CHF, hypertension and diabetes. Internal consistency of the MDS ADL items with other MDS items measuring impairments and symptoms, as measured by PPV, exceeded .9. The Activities of Daily Living (ADL) long-form summary scale achieved an alpha internal-consistency level exceeding .85, and multi-item scale alpha levels of .65 were achieved for well-being and mood, and .55 for behavior, levels that were sustained even after stratification by ADL and cognition. The Changes in Health, End-stage disease and Symptoms and Signs (CHESS) index, a summary measure of frailty, was highly predictive of one-year survival. Conclusion: The MDS demonstrates a reasonable level of consistency, both in terms of how well MDS diagnoses correspond to hospital discharge diagnoses and in terms of the internal consistency of functioning and behavioral items. The levels of alpha reliability and validity demonstrated by the scales suggest that the data can be useful for research and policy analysis. However, while improving, the MDS discharge tracking record should still not be used to indicate Medicare hospitalizations or mortality. It will be important to monitor the performance of the MDS 3.0 with respect to consistency, reliability and validity now that it has replaced version 2.0, using these results as a baseline that should be exceeded.
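    The sensitivity and positive predictive value (PPV) figures reported above compare a diagnosis flagged on the MDS against the same diagnosis on the hospital claim. A minimal sketch of those two measures, with hypothetical diagnosis flags rather than study data:

    ```python
    # Sketch: sensitivity and PPV of MDS diagnoses against hospital-claim
    # diagnoses, as in the analyses above. Flags below are hypothetical.

    def sensitivity(test_flags, reference_flags):
        """Fraction of reference positives that the test also flags (TP / (TP + FN))."""
        true_pos = sum(t and r for t, r in zip(test_flags, reference_flags))
        return true_pos / sum(reference_flags)

    def ppv(test_flags, reference_flags):
        """Fraction of test positives confirmed by the reference (TP / (TP + FP))."""
        true_pos = sum(t and r for t, r in zip(test_flags, reference_flags))
        return true_pos / sum(test_flags)

    # 1 = diagnosis present; e.g. CHF flags for eight hypothetical admissions
    mds    = [1, 1, 0, 1, 0, 0, 1, 0]
    claims = [1, 1, 1, 1, 0, 0, 0, 0]

    sens = sensitivity(mds, claims)   # share of claim diagnoses also on the MDS
    precision = ppv(mds, claims)      # share of MDS diagnoses confirmed by claims
    ```

    Values in the .6 to .7 range, as the study reports for major diagnoses, mean roughly two-thirds of diagnoses recorded by one source are corroborated by the other.
    
    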

    The Effect of Diet Quality and Wing Morph on Male and Female Reproductive Investment in a Nuptial Feeding Ground Cricket

    A common approach in the study of life-history trade-off evolution is to manipulate the nutrient content of diets during the life of an individual in order to observe how the acquisition of resources influences the relationship between reproduction, lifespan and other life-history parameters such as dispersal. Here, we manipulate the quality of diet that replicate laboratory populations received as a thorough test of how diet quality influences the life-history trade-offs associated with reproductive investment in a nuptial feeding Australian ground cricket (Pteronemobius sp.). In this species, both males and females make significant contributions to the production of offspring, as males provide a nuptial gift by allowing females to chew on a modified tibial spur during copulation and feed directly on their haemolymph. Individuals also have two distinct wing morphs, a short-winged flightless morph and a long-winged morph that has the ability to disperse. By manipulating the quality of diet over seven generations, we found that the reproductive investment of males and females was affected differently by the diet quality treatment and wing morph of the individual. We discuss the broader implications of these findings, including the differences in how males and females balance current and future reproductive effort in nuptial feeding insects, the changing nature of sexual selection when diets vary, and how the life-history trade-offs associated with the ability to disperse are expected to differ among populations.

    Ezrin interacts with the SARS coronavirus spike protein and restrains infection at the entry stage

    © 2012 Millet et al. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited. Background: Entry of Severe Acute Respiratory Syndrome coronavirus (SARS-CoV) and its envelope fusion with the host cell membrane are controlled by a series of complex molecular mechanisms, largely dependent on the viral envelope glycoprotein Spike (S). Much remains unknown about the cellular factors that regulate the entry process. Methodology/Principal Findings: We performed a yeast two-hybrid screen using as bait the carboxy-terminal endodomain of S, which faces the cytosol during and after opening of the fusion pore at early stages of the virus life cycle. Here we show that the ezrin membrane-actin linker interacts with the S endodomain through the F1 lobe of its FERM domain, and that both the eight carboxy-terminal amino acids and a membrane-proximal cysteine cluster of the S endodomain are important for this interaction in vitro. Interestingly, we found that ezrin is present at the site of entry of S-pseudotyped lentiviral particles in Vero E6 cells. Targeting ezrin function by small interfering RNA increased S-mediated entry of pseudotyped particles in epithelial cells. Furthermore, deletion of the eight carboxy-terminal amino acids of S enhanced infection by S-pseudotyped particles. Expression of the ezrin dominant negative FERM domain enhanced cell susceptibility to infection by SARS-CoV and S-pseudotyped particles and potentiated S-dependent membrane fusion. Conclusions/Significance: Ezrin interacts with the SARS-CoV S endodomain and limits virus entry and fusion.
    Our data present a novel mechanism involving a cellular factor in the regulation of S-dependent early events of infection. This work was supported by the Research Grant Council of Hong Kong (RGC#760208) and the RESPARI project of the International Network of Pasteur Institutes.

    Genomic Expansion of Magnetotactic Bacteria Reveals an Early Common Origin of Magnetotaxis with Lineage-specific Evolution

    The origin and evolution of magnetoreception, which in diverse prokaryotes and protozoa is known as magnetotaxis and enables these microorganisms to detect Earth’s magnetic field for orientation and navigation, is not well understood in evolutionary biology. The only known prokaryotes capable of sensing the geomagnetic field are magnetotactic bacteria (MTB), motile microorganisms that biomineralize intracellular, membrane-bounded magnetic single-domain crystals of either magnetite (Fe3O4) or greigite (Fe3S4) called magnetosomes. Magnetosomes are responsible for magnetotaxis in MTB. Here we report the first large-scale metagenomic survey of MTB from both northern and southern hemispheres, combined with 28 genomes from uncultivated MTB. These genomes greatly expand the coverage of MTB in the Proteobacteria, Nitrospirae, and Omnitrophica phyla, and provide the first genomic evidence of MTB belonging to the Zetaproteobacteria and “Candidatus Lambdaproteobacteria” classes. The gene content and organization of magnetosome gene clusters, which are physically grouped genes that encode proteins for magnetosome biosynthesis and organization, are more conserved within phylogenetically similar groups than between different taxonomic lineages. Moreover, the phylogenies of core magnetosome proteins form monophyletic clades. Together, these results suggest a common ancient origin of iron-based (Fe3O4 and Fe3S4) magnetotaxis in the domain Bacteria that underwent lineage-specific evolution, shedding new light on the origin and evolution of biomineralization and magnetotaxis, and expanding significantly the phylogenomic representation of MTB.

    The Resident Assessment Instrument-Minimum Data Set 2.0 quality indicators: a systematic review

    Background: The Resident Assessment Instrument-Minimum Data Set (RAI-MDS) 2.0 is designed to collect the minimum amount of data to guide care planning and monitoring for residents in long-term care settings. These data have been used to compute indicators of care quality. Use of the quality indicators to inform quality improvement initiatives is contingent upon the validity and reliability of the indicators. The purpose of this review was to systematically examine published and grey research reports in order to assess the state of the science regarding the validity and reliability of the RAI-MDS 2.0 Quality Indicators (QIs). Methods: We systematically reviewed the evidence for the validity and reliability of the RAI-MDS 2.0 QIs. A comprehensive literature search identified relevant original research published, in English, prior to December 2008. Fourteen articles and one report examining the validity and/or reliability of the RAI-MDS 2.0 QIs were included. Results: The studies fell into two broad categories: those that examined individual quality indicators and those that examined multiple indicators. All studies were conducted in the United States and included from 1 to 209 facilities. The number of residents included in the studies ranged from 109 to 5758. One study, conducted under research conditions, examined 38 chronic care QIs and found strong evidence for the validity of 12 of them. In response to these findings, the 12 QIs were recommended for public reporting purposes. However, a number of observational studies (n=13), conducted in "real world" conditions, have tested the validity and/or reliability of individual QIs, with mixed results. Ten QIs have been studied in this manner, including falls, depression, depression without treatment, urinary incontinence, urinary tract infections, weight loss, bedfast, restraint, pressure ulcer, and pain.
    These studies have revealed the potential for systematic bias in reporting, with under-reporting of some indicators and over-reporting of others. Conclusion: Evidence for the reliability and validity of the RAI-MDS QIs remains inconclusive. The QIs provide a useful tool for quality monitoring and to inform quality improvement programs and initiatives. However, caution should be exercised when interpreting the QI results, and other sources of evidence of the quality of care processes should be considered in conjunction with QI results.

    Quantifying the effects of temperature and noise on attention-level using EDA and EEG sensors

    Most people with Autism Spectrum Disorder (ASD) experience atypical sensory modality and need help to self-regulate their sensory responses. Results of a pilot study are presented here in which temperature, noise type and noise level are used as independent variables. Attention-based tests (ABTs) and readings from Electrodermal Activity (EDA) and Electroencephalography (EEG) sensors are used as dependent measures to quantify the effects of temperature and noise. Based on the outcome of the analyses, it is feasible to use off-the-shelf sensors to recognize physiological changes, indicating a possibility of developing sensory management recommendation interventions to support people with ASD.

    Effects of resistance and functional-skills training on habitual activity and constipation among older adults living in long-term care facilities: a randomized controlled trial

    BACKGROUND: Large-scale RCTs comparing different types of exercise training in institutionalised older people are scarce, especially regarding effects on habitual physical activity and constipation. This study investigated the effects of different training protocols on habitual physical activity and constipation of older adults living in long-term care facilities. METHODS: A randomized controlled trial with 157 participants, aged 64 to 94 years, who were randomly assigned to 1) resistance training; 2) all-round functional-skills training; 3) both; or 4) an 'educational' control condition. Habitual physical activity was assessed with a physical activity questionnaire and accelerometers. Constipation was assessed by a questionnaire. Measurements were performed at baseline and after six months of training. RESULTS: At baseline the median time spent sitting was 8.2 hr/d, and the median time spent on activity of at least moderate intensity was 32 min/d. At baseline, about 22% of the subjects were diagnosed with constipation and 23% were taking laxatives. There were no between-group differences for changes in habitual physical activity or constipation over the six months. CONCLUSION: Six months of moderate-intensity exercise training neither enhances habitual physical activity nor affects complaints of constipation among older people living in long-term care facilities.

    Study circles improve the precision in nutritional care in special accommodations

    Background: Disease-related malnutrition is a major health problem in the elderly population, but until recently it has received very little attention, and management issues in particular are under-explored. By identifying residents at risk of undernutrition, appropriate nutritional care can be provided. Objectives: Do study circles and policy documents improve the precision in nutritional care and decrease the prevalence of low or high BMI? Design: Pre and post intervention study. Setting: Special accommodations (nursing homes) within six municipalities were involved. Participants: In 2005, 1726 (90.4%) out of 1910 residents agreed to participate, and in 2007, 1526 (81.8%) out of 1866 residents participated. Intervention: Study circles in one municipality, a policy document in one municipality, and no intervention in four municipalities. Measurements: Risk of undernutrition was defined as involving any of: involuntary weight loss, low BMI, and/or eating difficulties. Overweight was defined as high BMI. Results: In 2005 and 2007, 64% of 1726 and 66% of 1526 residents respectively were at risk of undernutrition. In 2007, significantly more patients in the study circle municipality were appropriately provided protein- and energy-enriched food compared with the no-intervention municipalities. There was a decrease in the prevalence of low BMI in the study circle municipality, and the prevalence of overweight increased in the policy document municipality between 2005 and 2007.