
    Serum procalcitonin for the early recognition of nosocomial infection in the critically ill patients: a preliminary report

    Background: The usefulness of procalcitonin (PCT) measurement in critically ill medical patients with suspected nosocomial infection is unclear. The aim of this study was to assess the value of PCT for the early diagnosis of bacterial nosocomial infection in selected critically ill patients. Methods: An observational cohort study was performed in a 15-bed intensive care unit. Seventy patients with either proven (n = 47) or clinically suspected but unconfirmed (n = 23) nosocomial infection were included. PCT measurements were obtained on the day the infection was suspected (D0) and at least once within the three preceding days (D-3 to D0). Patients with proven infection were compared with those without. The diagnostic value of PCT on D0 was determined from the corresponding receiver operating characteristic (ROC) curve, and the predictive value of PCT variations preceding the clinical suspicion of infection was assessed. Results: PCT on D0 was the best predictor of proven infection in this population of ICU patients with a clinical suspicion of infection (AUROCC = 0.80; 95% CI, 0.68–0.91); a cut-off value of 0.44 ng/mL provided a sensitivity of 65.2% and a specificity of 83.0%. PCT variation between D-1 and D0, calculated in 45 patients, was also predictive of nosocomial infection (AUROCC = 0.89; 95% CI, 0.79–0.98), with a 100% positive predictive value when a threshold of +0.26 ng/mL was applied. Comparable results were obtained for PCT variation between D-2 and D0 and between D-3 and D0. In contrast, CRP elevation, leukocyte count and fever had poor predictive value in our population. Conclusion: PCT monitoring could be helpful in the early diagnosis of nosocomial infection in the ICU. Both absolute values and variations should be considered and evaluated in further studies.
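
    As an illustration of the analysis described above, the sketch below computes an AUROC and the sensitivity/specificity implied by a fixed cut-off. Only the cohort sizes (47 proven, 23 unconfirmed) and the 0.44 ng/mL threshold come from the abstract; the PCT values themselves are synthetic stand-ins.

```python
# A minimal sketch of the reported ROC analysis: AUROC for PCT on D0 and
# sensitivity/specificity at a 0.44 ng/mL cut-off. The PCT values are
# synthetic illustrations, not the study's data.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
# 1 = proven nosocomial infection (n = 47), 0 = suspected but unconfirmed (n = 23)
infected = np.r_[np.ones(47, dtype=int), np.zeros(23, dtype=int)]
# Synthetic PCT on D0 (ng/mL); infected patients drawn from a higher distribution
pct_d0 = np.r_[rng.lognormal(-0.3, 1.0, 47), rng.lognormal(-1.3, 0.8, 23)]

auc = roc_auc_score(infected, pct_d0)

cutoff = 0.44  # ng/mL, the threshold reported in the abstract
pred = pct_d0 >= cutoff
sensitivity = (pred & (infected == 1)).sum() / (infected == 1).sum()
specificity = (~pred & (infected == 0)).sum() / (infected == 0).sum()
print(f"AUROC={auc:.2f}  sensitivity={sensitivity:.1%}  specificity={specificity:.1%}")
```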

    Role of orexin A signaling in dietary palmitic acid-activated microglial cells

    Excess dietary saturated fatty acids such as palmitic acid (PA) induce peripheral and hypothalamic inflammation. Hypothalamic inflammation, mediated in part by microglial activation, contributes to metabolic dysregulation. In rodents, high-fat-diet-induced microglial activation results in nuclear translocation of nuclear factor-kappa B (NFκB) and increased central pro-inflammatory cytokines tumor necrosis factor alpha (TNF-α) and interleukin-6 (IL-6). The hypothalamic neuropeptide orexin A (OXA, hypocretin 1) is neuroprotective in the brain, and in the cortex OXA can also reduce inflammation and neurodegeneration through a microglia-mediated pathway. Whether hypothalamic orexin neuroprotection mechanisms depend upon microglia is unknown. To address this issue, we evaluated the effects of OXA and PA on the inflammatory response in immortalized murine microglial and hypothalamic neuronal cell lines. We demonstrate for the first time in microglial cells that exposure to PA increases gene expression of the orexin-1 receptor but not the orexin-2 receptor. The pro-inflammatory markers IL-6, TNF-α and inducible nitric oxide synthase are increased in microglial cells following PA exposure, but are reduced by pretreatment with OXA, while the anti-inflammatory marker arginase-1 is increased by OXA. Finally, we show that hypothalamic neurons exposed to conditioned media from PA-challenged microglia have increased cell survival only when the microglia were pretreated with OXA. These data support the concept that OXA may act as an immunomodulatory regulator of microglia, reducing pro-inflammatory cytokines and increasing anti-inflammatory factors to promote a favorable neuronal microenvironment.

    Arduous implementation: Does the Normalisation Process Model explain why it's so difficult to embed decision support technologies for patients in routine clinical practice?

    Background: Decision support technologies (DSTs, also known as decision aids) help patients and professionals take part in collaborative decision-making processes. Trials have shown favorable impacts on patient knowledge, satisfaction, decisional conflict and confidence. However, they have not become routinely embedded in health care settings. Few studies have approached this issue using a theoretical framework. We explained problems of implementing DSTs using the Normalization Process Model, a conceptual model that focuses attention on how complex interventions become routinely embedded in practice. Methods: The Normalization Process Model was used as the basis of conceptual analysis of the outcomes of previous primary research and reviews. Using a virtual working environment we applied the model and its main concepts to examine: the 'workability' of DSTs in professional-patient interactions; how DSTs affect knowledge relations between their users; how DSTs impact on users' skills and performance; and the impact of DSTs on the allocation of organizational resources. Results: Conceptual analysis using the Normalization Process Model provided insight into implementation problems for DSTs in routine settings. Current research focuses mainly on the interactional workability of these technologies, but factors related to divisions of labor in health care, and the organizational contexts in which DSTs are used, are poorly described and understood. Conclusion: The model successfully provided a framework for identifying factors that promote and inhibit the implementation of DSTs in healthcare, and gave us insights into factors influencing the introduction of new technologies into contexts where negotiations are characterized by asymmetries of power and knowledge. Future research and development on the deployment of DSTs needs to take a more holistic approach, giving emphasis to the structural conditions and social norms in which these technologies are enacted.

    Cost-Effective Use of Silver Dressings for the Treatment of Hard-to-Heal Chronic Venous Leg Ulcers

    Aim: To estimate the cost-effectiveness of silver dressings using a health economic model based on time to wound healing in hard-to-heal chronic venous leg ulcers (VLUs). Background: Chronic venous ulceration affects 1–3% of the adult population and typically has a protracted course of healing, resulting in considerable costs to the healthcare system. The pathogenesis of VLUs includes excessive and prolonged inflammation, which is often related to critical colonisation and early infection. The use of silver dressings to control this bioburden and improve wound healing rates remains controversial. Methods: A decision tree was constructed to evaluate the cost-effectiveness of four weeks of treatment with silver versus non-silver dressings in a primary care setting. Three outcomes were defined, 'Healed ulcer', 'Healing ulcer' and 'No improvement', reflecting the relative reduction in ulcer area from baseline to four weeks of treatment. A data set from a recent meta-analysis, based on four RCTs, was applied to the model. Results: Treatment with silver dressings for an initial four weeks gave a total cost saving of £141.57 compared with treatment with non-silver dressings. In addition, patients treated with silver dressings had faster wound closure than those treated with non-silver dressings. Conclusion: The use of silver dressings improves healing time and can lead to overall cost savings. These results can be used to guide healthcare decision makers in evaluating the economic aspects of treatment with silver dressings in hard-to-heal chronic VLUs.
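
    To make the decision-tree mechanics concrete, the sketch below computes the expected four-week cost of each arm as a probability-weighted sum over the three outcomes. All probabilities and branch costs are hypothetical placeholders, not the values from the published model; only the three outcome labels come from the abstract.

```python
# A minimal decision-tree cost sketch: expected cost per treatment arm as a
# probability-weighted sum over the three outcomes named in the abstract.
# All probabilities and costs below are hypothetical placeholders.
OUTCOMES = ("Healed ulcer", "Healing ulcer", "No improvement")

def expected_cost(probs, costs):
    """Probability-weighted four-week cost of one treatment arm."""
    assert abs(sum(probs) - 1.0) < 1e-9, "outcome probabilities must sum to 1"
    return sum(p * c for p, c in zip(probs, costs))

# Hypothetical inputs (GBP): total dressing + nursing cost per outcome branch
silver = expected_cost(probs=(0.30, 0.45, 0.25), costs=(250.0, 420.0, 680.0))
non_silver = expected_cost(probs=(0.20, 0.40, 0.40), costs=(230.0, 450.0, 720.0))

print(f"Expected cost, silver arm:      £{silver:.2f}")
print(f"Expected cost, non-silver arm:  £{non_silver:.2f}")
print(f"Incremental saving with silver: £{non_silver - silver:.2f}")
```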

    Attentional bias and treatment adherence in substitute-prescribed opiate users

    BACKGROUND: Attentional bias (AB) is implicated in the development and maintenance of substance dependence and in treatment outcome. We assessed the effects of attentional bias modification (ABM) and the relationship between AB and treatment adherence in opiate-dependent patients. METHOD: An independent-groups design was used to compare 23 opiate-dependent patients with 21 healthy controls. Participants completed an AB task before either a control task or an ABM task designed to train attention away from substance-related stimuli. AB and craving were assessed before and after ABM to determine any changes. Relationships between treatment adherence ('using on top' of prescribed opiates or not) and AB, craving and psychopathology were also examined. RESULTS: There was no baseline difference in AB between patients and controls, and no significant effect of ABM on AB or substance craving. However, treatment-adherent patients, who did not use illicit opiates on top of their prescribed opiates, showed significantly greater AB away from substance-related stimuli than both participants using on top and controls, and reported significantly lower levels of craving than non-treatment-adherent patients. CONCLUSION: Although we did not find any significant effects of ABM on AB or craving, patients who were treatment adherent differed from both those who were not and from controls in their attentional functioning and substance craving. These findings are the first to suggest that AB may be a within-treatment factor predictive of adherence to pharmacological treatment, and potentially of recovery, in opiate users.
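
    The abstract does not specify how the AB score was computed, but in visual-probe (dot-probe) paradigms it is conventionally the difference in reaction times depending on whether the probe replaces a neutral or a substance-related image. The sketch below shows that assumed scoring; all reaction times are hypothetical.

```python
# An assumed dot-probe-style AB score: mean RT when the probe replaces a
# neutral image minus mean RT when it replaces a substance-related image.
# Positive = attention toward substance cues; negative = attention away
# (the direction reported for the treatment-adherent group).
from statistics import mean

def ab_score(rt_probe_at_neutral_ms, rt_probe_at_substance_ms):
    return mean(rt_probe_at_neutral_ms) - mean(rt_probe_at_substance_ms)

# Hypothetical reaction times (ms) for one participant
neutral_trials = [512, 498, 530, 505, 521]
substance_trials = [540, 555, 528, 549, 537]

score = ab_score(neutral_trials, substance_trials)
print(f"AB score: {score:+.1f} ms")  # negative here, i.e. bias away from cues
```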

    Perspectives of people in Mali toward genetically-modified mosquitoes for malaria control

    Background: Genetically-modified (GM) mosquitoes have been proposed as part of an integrated vector control strategy for malaria. Public acceptance is essential prior to field trials, particularly since mosquitoes are a vector of human disease and genetically modified organisms (GMOs) face strong scepticism in both developed and developing nations. Despite this, very little data on attitudes to GMOs is available for sub-Saharan Africa, where the GM mosquito effort is primarily directed. Here, results are presented of a qualitative survey of public attitudes to GM mosquitoes for malaria control, conducted in rural and urban areas of Mali, West Africa between October 2008 and June 2009. Methods: The sample consisted of 80 individuals: 30 living in rural communities, 30 living in urban suburbs of Bamako, and 20 Western-trained and traditional health professionals working in Bamako and Bandiagara. Questions were asked about the cause of malaria, heredity and selective breeding; these led to questions about genetic alteration and about acceptable conditions for a release of pest-resistant GM corn and malaria-refractory GM mosquitoes. Finally, participants were asked about the decision-making process in their community. Interviews were transcribed and responses were categorized according to general themes. Results: Most participants cited mosquitoes as one of several causes of malaria. The concept of the gene was not widely understood; however, selective breeding was understood, allowing limited communication of the concept of genetic modification. Participants were open to a release of pest-resistant GM corn, often wanting to conduct a trial themselves, and the concept of a trial was reapplied to GM mosquitoes, although less frequently. Participants wanted to see evidence that GM mosquitoes can reduce malaria prevalence without negative consequences for human health and the environment. Several participants preferred a conventional mosquito control programme, but a transgenic release that satisfied certain requirements was usually acceptable. Conclusions: Although there were some dissenters, the majority of participants were pragmatic towards a release of GM mosquitoes. An array of social and cultural issues associated with malaria, mosquitoes and genetic engineering became apparent. If these can be successfully addressed, then social acceptance among the populations surveyed seems promising.

    Identifying and mapping individual plants in a highly diverse high-elevation ecosystem using UAV imagery and deep learning

    The identification and counting of individual plants is essential for environmental monitoring. UAV-based imagery offers ultra-fine spatial resolution and flexibility in data acquisition, and so provides a great opportunity to enhance current in-situ plant field surveying. However, accurate mapping of individual plants from UAV imagery remains challenging given the great variation in the sizes, geometries and distribution of individual plants, even for deep-learning-based semantic segmentation and classification methods. In this research, a novel Scale Sequence Residual U-Net (SS Res U-Net) deep learning method was proposed, which integrates a set of Residual U-Nets with a sequence of input scales that can be derived automatically. The SS Res U-Net classifies individual plants by continuously increasing the patch scale, with features learned at small scales passed gradually to larger scales, thus achieving multi-scale information fusion while retaining fine spatial details of interest. The SS Res U-Net was tested on the identification and mapping of frailejones (all plant species of the subtribe Espeletiinae), the dominant plants in one of the world's most biodiverse high-elevation ecosystems (the páramos), from UAV imagery. Results demonstrate that the SS Res U-Net can self-adapt to variation in objects and consistently achieved the highest classification accuracy (91.67% on average) compared with four state-of-the-art benchmark approaches. In addition, the SS Res U-Net outperformed the benchmarks in terms of both robustness to reductions in training sample size and computational efficiency. SS Res U-Net thus shows great promise for remotely sensed semantic segmentation and classification tasks and, more generally, for machine intelligence. The prospective implementation of this method to identify and map frailejones in the páramos will benefit immensely the monitoring of their populations for conservation assessment and management, among many other applications.
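
    The abstract gives the scale-sequence idea but not the layer-level details, so the PyTorch sketch below is an illustrative reconstruction rather than the published architecture: patches are processed from small to large scale, and features learned at each scale are upsampled and fused into the next. Channel widths, the crop/fusion rule and the residual block design are all assumptions.

```python
# Schematic sketch of a scale-sequence residual network (assumed design,
# not the published SS Res U-Net): small-scale features are upsampled and
# concatenated with progressively larger centre crops of the input.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ResidualBlock(nn.Module):
    """Two 3x3 conv/BN layers with an identity (or 1x1) skip connection."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch),
            nn.ReLU(inplace=True),
            nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch),
        )
        self.skip = nn.Conv2d(in_ch, out_ch, 1) if in_ch != out_ch else nn.Identity()

    def forward(self, x):
        return F.relu(self.body(x) + self.skip(x))

class ScaleSequenceNet(nn.Module):
    """Processes a sequence of centre crops (small -> large patch scale),
    passing each scale's features forward into the next scale."""
    def __init__(self, in_ch=3, feat=32, n_classes=2, scales=(64, 128, 256)):
        super().__init__()
        self.scales = scales
        # The first scale sees raw bands only; later scales also see fused features
        self.blocks = nn.ModuleList(
            ResidualBlock(in_ch if i == 0 else in_ch + feat, feat)
            for i in range(len(scales))
        )
        self.head = nn.Conv2d(feat, n_classes, 1)  # per-pixel class logits

    @staticmethod
    def centre_crop(x, size):
        _, _, h, w = x.shape
        top, left = (h - size) // 2, (w - size) // 2
        return x[:, :, top:top + size, left:left + size]

    def forward(self, x):
        feats = None
        for block, s in zip(self.blocks, self.scales):
            patch = self.centre_crop(x, s)
            if feats is not None:
                # Upsample previous-scale features and fuse with the larger patch
                feats = F.interpolate(feats, size=(s, s), mode="bilinear",
                                      align_corners=False)
                patch = torch.cat([patch, feats], dim=1)
            feats = block(patch)
        return self.head(feats)

net = ScaleSequenceNet()
print(net(torch.randn(1, 3, 256, 256)).shape)  # torch.Size([1, 2, 256, 256])
```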