187 research outputs found

    Mobility and other predictors of hospitalization for urinary tract infection: a retrospective cohort study

    Abstract Background Many hospitalizations for residents of skilled nursing facilities are potentially avoidable. Factors that could prevent hospitalization for urinary tract infection (UTI) were investigated, with a focus on patient mobility. Methods A retrospective cohort study was conducted using 2003–2004 data from the Centers for Medicare and Medicaid Services. The study included 408,192 residents of 4267 skilled nursing facilities in California, Florida, Michigan, New York, and Texas. The patients were followed over time, from admission to the skilled nursing facility to discharge or, for those who were not discharged, for 1 year. Cox proportional hazards regression was conducted with hospitalization for UTI as the outcome. Results The ability to walk was associated with a 69% lower rate of hospitalization for UTI. Maintaining or improving walking ability over time reduced the risk of hospitalization for UTI by 39% to 76% for patients with various conditions. For residents with severe mobility problems, such as being in a wheelchair or having a missing limb, maintaining or improving mobility (in bed or when transferring) reduced the risk of hospitalization for UTI by 38% to 80%. Other potentially modifiable predictors included a physician visit at the time of admission to the skilled nursing facility (hazard ratio [HR], 0.68), use of an indwelling urinary catheter (HR, 2.78), infection with Clostridium difficile or an antibiotic-resistant microorganism (HR, 1.20), and use of 10 or more medications (HR, 1.31). Patient characteristics associated with hospitalization for UTI were advancing age, being Hispanic or African-American, and having diabetes mellitus, renal failure, Parkinson's disease, dementia, or stroke. Conclusion Maintaining or improving mobility (walking, transferring between positions, or moving in bed) was associated with a lower risk of hospitalization for UTI.
A physician visit at the time of admission to the skilled nursing facility also reduced the risk of hospitalization for UTI.
http://deepblue.lib.umich.edu/bitstream/2027.42/112369/1/12877_2008_Article_125.pd
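The hazard ratios quoted in this abstract are the exponentiated coefficients of the fitted Cox model, and the percentage changes in rate follow directly from them. A minimal sketch of that arithmetic, using figures from the abstract purely for illustration (this is not the study's actual code):

```python
import math

# In a Cox proportional hazards model, each covariate's effect is
# reported as a hazard ratio: HR = exp(beta) for coefficient beta.
def hazard_ratio(beta: float) -> float:
    return math.exp(beta)

# HR < 1 means a lower event rate; HR > 1 means a higher rate.
def percent_change_in_rate(hr: float) -> float:
    return (hr - 1.0) * 100.0

# The abstract's "69% lower rate" for walking ability corresponds to
# an HR of about 0.31; the indwelling-catheter HR of 2.78 corresponds
# to a 178% higher rate of hospitalization for UTI.
print(round(percent_change_in_rate(0.31)))  # -69
print(round(percent_change_in_rate(2.78)))  # 178
```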

    Coral Colonisation of an Artificial Reef in a Turbid Nearshore Environment, Dampier Harbour, Western Australia

    A 0.6 hectare artificial reef of local rock and recycled concrete sleepers was constructed in December 2006 at Parker Point in the industrial port of Dampier, Western Australia, with the aim of providing an environmental offset for a nearshore coral community lost to land reclamation. Corals successfully colonised the artificial reef, despite the relatively harsh environmental conditions at the site (annual water temperature range 18-32°C, intermittent high turbidity, frequent cyclones, frequent nearby ship movements). Coral settlement to the artificial reef was examined by terracotta tile deployments, and later stages of coral community development were examined by in-situ visual surveys within fixed 25 × 25 cm quadrats on the rock and concrete substrates. Mean coral density on the tiles varied from 113 ± 17 SE to 909 ± 85 SE per m² over five deployments, whereas mean coral density in the quadrats was only 6.0 ± 1.0 SE per m² at eight months post construction, increasing to 24.0 ± 2.1 SE per m² at 62 months post construction. Coral taxa colonising the artificial reef were a subset of those on the surrounding natural reef, but occurred in different proportions, with Pseudosiderastrea tayami, Mycedium elephantotus and Leptastrea purpurea being disproportionately abundant on the artificial reef. Coral cover increased rapidly in the later stages of the study, reaching 2.3 ± 0.7 SE% at 62 months post construction. This study indicates that simple materials of opportunity can provide a suitable substrate for coral recruitment in Dampier Harbour, and that natural colonisation at the study site remains sufficient to initiate a coral community on artificial substrate despite ongoing natural and anthropogenic perturbations. © 2013 Blakeway et al.
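The densities above are reported as mean ± SE, where SE is the standard error of the mean (sample standard deviation divided by √n). A minimal sketch of that statistic with hypothetical tile counts (the study's raw quadrat data are not reproduced here):

```python
import math

def mean_and_se(values):
    """Return (mean, standard error of the mean) for a sample."""
    n = len(values)
    mean = sum(values) / n
    # Sample variance uses the n - 1 denominator; SE = SD / sqrt(n).
    var = sum((v - mean) ** 2 for v in values) / (n - 1)
    return mean, math.sqrt(var / n)

# Hypothetical per-m² settlement counts from five tiles:
m, se = mean_and_se([100, 120, 110, 130, 105])
print(f"{m:.0f} ± {se:.1f} SE")  # 113 ± 5.4 SE
```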

    Dietary Fiber and Saturated Fat Intake Associations with Cardiovascular Disease Differ by Sex in the Malmö Diet and Cancer Cohort: A Prospective Study

    BACKGROUND: The aim of the study was to examine associations between intake of macronutrients and dietary fiber and incident ischemic cardiovascular disease (iCVD) in men and women. METHODS: We used data from 8,139 male and 12,535 female participants (aged 44-73 y) of the Swedish population-based Malmö Diet and Cancer cohort. The participants were without a history of CVD and diabetes mellitus, and had reported stable dietary habits in the study questionnaire. Diet was assessed by a validated modified diet history method, combining a 7-d registration of cooked meals and cold beverages, a 168-item food questionnaire (covering other foods and meal patterns), and a 1-hour diet interview. Sociodemographic and lifestyle data were collected by questionnaire. iCVD cases, which included coronary events (myocardial infarctions or deaths from chronic ischemic heart disease) and ischemic strokes, were ascertained via national and local registries. Nutrient-disease associations were examined by multivariate Cox regressions. RESULTS: During a mean follow-up of 13.5 years, we identified 1,089 male and 687 female iCVD cases. High fiber intakes were associated with lower incidence rates of iCVD in women and of ischemic stroke in men. In post-hoc analysis, we discovered statistically significant interactions between intake of fiber and saturated fat; these interactions also differed between men and women (p<0.001). CONCLUSIONS: In this well-defined population, a high fiber intake was associated with a lower risk of iCVD, but there were no robust associations between other macronutrients and iCVD risk. Judging from this study, gender-specific nutrient analysis may be preferable in epidemiology.

    Monoterpene Variation Mediated Attack Preference Evolution of the Bark Beetle Dendroctonus valens

    Several studies suggest that some bark beetles prefer to attack large trees. The invasive red turpentine beetle (RTB), Dendroctonus valens LeConte, one of the most destructive forest pests in China, is known to exhibit this behavior. Our previous study demonstrated that RTBs preferred to attack large-diameter trees (diameter at breast height, DBH ≥30 cm) over small-diameter trees (DBH ≤10 cm) in the field. In the current study, we examined this attacking behavior and its underlying mechanisms in the laboratory. Behavioral assays showed that RTBs preferred the bark of large-DBH trees and had a higher attack rate on bolts from these trees. Y-tube assays showed that RTBs preferred the volatiles released by large-DBH trees to those released by small-DBH trees. Subsequent analysis revealed that both large- and small-DBH trees had the same composition of monoterpenes, but the concentration of each component differed; the concentrations thus appeared to act as cues for RTBs to locate the right-sized host, which was confirmed by further behavioral assays. Moreover, large-DBH pine trees provided a more spacious habitat and contained more nutrients, such as nitrogen, than did small-DBH pine trees, which benefited RTBs' fecundity and larval development. RTBs seem to have evolved mechanisms to locate large hosts that allow them to maximize their fitness. Monoterpene variation-mediated attack preference suggests a potential avenue for the management of RTB.

    Development and evaluation of an instrument for the critical appraisal of randomized controlled trials of natural products

    Abstract Background The efficacy of natural products (NPs) is being evaluated using randomized controlled trials (RCTs) with increasing frequency, yet a search of the literature did not identify a widely accepted critical appraisal instrument developed specifically for use with NPs. The purpose of this project was to develop and evaluate a critical appraisal instrument that is sufficiently rigorous to be used in evaluating RCTs of conventional medicines, and also has a section specific for use with single-entity NPs, including herbs and naturally sourced chemicals. Methods The project comprised three phases: 1) using experts and a Delphi process to reach consensus on a list of items essential in describing the identity of an NP; 2) compiling a list of non-NP items important for evaluating the quality of an RCT, using systematic review methodology to identify published instruments and then compiling item categories that were part of a validated instrument and/or had empirical evidence to support their inclusion; and 3) conducting a field test to compare the new instrument with a published instrument for usefulness in evaluating the quality of three RCTs of an NP and in applying the results to practice. Results Two Delphi rounds resulted in a list of 15 items essential in describing NPs. Seventeen item categories fitting the inclusion criteria were identified from published instruments for conventional medicines. The new assessment instrument was assembled based on the content of the two lists and the addition of a Reviewer's Conclusion section. The field test of the new instrument showed good criterion validity. Participants found it useful in translating evidence from RCTs to practice. Conclusion A new instrument for the critical appraisal of RCTs of NPs was developed and tested. The instrument is distinct from other available assessment instruments for RCTs of NPs in its systematic development and validation. The instrument is ready to be used by pharmacy students, health care practitioners and academics, and will continue to be refined as required.

    Quality of life data as prognostic indicators of survival in cancer patients: an overview of the literature from 1982 to 2008

    Abstract Background Health-related quality of life and survival are two important outcome measures in cancer research and practice. The aim of this paper is to examine the relationship between quality of life data and survival time in cancer patients. Methods A review was undertaken of all full publications in English-language biomedical journals between 1982 and 2008. The search was limited to cancer, and included the combination of keywords 'quality of life', 'patient-reported outcomes', 'prognostic', 'predictor', 'predictive' and 'survival' appearing in the titles of the publications. In addition, each study was examined to ensure that it used multivariate analysis. Purely psychological studies were excluded. A manual search was also performed to include additional papers of potential interest. Results A total of 451 citations were identified in this rapid and systematic review of the literature. Of these, 104 citations on the relationship between quality of life and survival were found to be relevant and were further examined. The findings are summarized under different headings: heterogeneous samples of cancer patients, lung cancer, breast cancer, gastro-oesophageal cancers, colorectal cancer, head and neck cancer, melanoma and other cancers. With few exceptions, the findings showed that quality of life data or some aspects of quality of life measures were significant independent predictors of survival duration. Global quality of life, functioning domains and symptom scores, such as appetite loss, fatigue and pain, were the most important indicators, individually or in combination, for predicting survival times in cancer patients after adjusting for one or more demographic and known clinical prognostic factors. Conclusion This review provides evidence for a positive relationship between quality of life data or some quality of life measures and the survival duration of cancer patients. Pre-treatment (baseline) quality of life data appeared to provide the most reliable information for helping clinicians establish prognostic criteria for treating their cancer patients. It is recommended that future studies use valid instruments, apply sound methodological approaches and adequate multivariate statistical analyses adjusted for socio-demographic characteristics and known clinical prognostic factors, with a satisfactory validation strategy. This strategy is likely to yield more accurate and specific quality of life-related prognostic variables for specific cancers.

    A comprehensive overview of radioguided surgery using gamma detection probe technology

    The concept of radioguided surgery, first developed some 60 years ago, involves the use of a radiation detection probe system for the intraoperative detection of radionuclides. The use of gamma detection probe technology in radioguided surgery has expanded tremendously and has evolved into what is now considered an established discipline within the practice of surgery, revolutionizing the surgical management of many malignancies, including breast cancer, melanoma, and colorectal cancer, as well as the surgical management of parathyroid disease. The impact of radioguided surgery on the surgical management of cancer patients includes providing vital, real-time information to the surgeon regarding the location and extent of disease, as well as the assessment of surgical resection margins. Additionally, it has allowed the surgeon to minimize the surgical invasiveness of many diagnostic and therapeutic procedures while still maintaining maximum benefit to the cancer patient. In the current review, we attempt to comprehensively evaluate the history, technical aspects, and clinical applications of radioguided surgery using gamma detection probe technology.

    Does optimal foraging theory explain why suburban Florida scrub-jays (Aphelocoma coerulescens) feed their young human-provided food?

    Optimal foraging theory assumes that a forager can adequately assess the quality of its prey and predicts that parents feed their young low-quality foods only when suffering unpredicted reductions in their ability to provision. Wildland Florida scrub-jays feed their young exclusively arthropods, but suburban parents include human-provided foods in the nestling diet, with possible costs in terms of reduced growth and survival. We tested experimentally whether parents feed human-provided foods, given the apparent costs, because: 1) they do not discriminate between food types, 2) they switch to low-quality, abundant foods when natural food availability in the environment is low, or 3) they switch when the time needed to obtain natural food is high. Parents discriminated between natural and human-provided foods, showing a preference for natural foods when rearing young. When the handling time of natural foods was increased experimentally, parents in both the suburban and wildland habitats switched to human-provided foods. Supplementation with natural foods increased preference for this food in both habitats. Suburban parents chose more natural foods than wildland parents, suggesting that they have a greater preference for natural foods. Regardless of preferences demonstrated at feeders, parents in both the suburbs and wildlands delivered mostly natural foods to nestlings, independent of natural food availability. Nonetheless, natural foods are likely to be scarcer in the environment than in our experimental tests. Because natural food availability is lower in the suburbs than in the wildland habitat, parents in the suburbs may be forced to switch to human-provided foods when feeding nestlings.

    Integrating precision cancer medicine into healthcare—policy, practice, and research challenges

    Abstract Precision medicine (PM) can be defined as a predictive, preventive, personalized, and participatory healthcare service delivery model. Recent developments in molecular biology and information technology make PM a reality today through the use of massive amounts of genetic, ‘omics’, clinical, environmental, and lifestyle data. With cancer being one of the most prominent public health threats in developed countries, both the research community and governments have been investing significant time, money, and effort in precision cancer medicine (PCM). Although PCM research is extremely promising, a number of hurdles still remain on the road to an optimal integration of standardized and evidence-based use of PCM in healthcare systems. Indeed, PCM raises a number of technical, organizational, ethical, legal, social, and economic challenges that have to be taken into account in the development of an appropriate health policy framework. Here, we highlight some of the more salient issues regarding the standards needed for integration of PCM into healthcare systems, and we identify fields where more research is needed before policy can be implemented. Key challenges include, but are not limited to, the creation of new standards for the collection, analysis, and sharing of samples and data from cancer patients, and the creation of new clinical trial designs with renewed endpoints. We believe that these issues need to be addressed as a matter of priority by public health policymakers in the coming years for a better integration of PCM into healthcare.