308 research outputs found

    Outcome analysis of intracorneal ring segments for the treatment of keratoconus based on visual, refractive, and aberrometric impairment

    PURPOSE: To analyze the outcomes of intracorneal ring segment (ICRS) implantation for the treatment of keratoconus based on preoperative visual impairment. DESIGN: Multicenter, retrospective, nonrandomized study. METHODS: A total of 611 eyes of 361 keratoconic patients were evaluated. Subjects were classified according to their preoperative corrected distance visual acuity (CDVA) into 5 different groups: grade I, CDVA of 0.90 or better; grade II, CDVA equal to or better than 0.60 and worse than 0.90; grade III, CDVA equal to or better than 0.40 and worse than 0.60; grade IV, CDVA equal to or better than 0.20 and worse than 0.40; and grade plus, CDVA worse than 0.20. Success and failure indices were defined based on visual, refractive, corneal topographic, and aberrometric data and evaluated in each group 6 months after ICRS implantation. RESULTS: Significant improvement after the procedure was observed in uncorrected distance visual acuity in all grades (P < .05). CDVA significantly decreased in grade I (P < .01) but significantly increased in all other grades (P < .05). A total of 37.9% of patients with preoperative CDVA of 0.6 or better gained 1 or more lines of CDVA, whereas 82.8% of patients with preoperative CDVA of 0.4 or worse gained 1 or more lines of CDVA (P < .01). Spherical equivalent and keratometry readings showed a significant reduction in all grades (P ≀ .02). Corneal higher-order aberrations did not change after the procedure (P ≄ .05). CONCLUSIONS: Based on preoperative visual impairment, ICRS implantation provides significantly better results in patients with a severe form of the disease. A notable loss of CDVA lines can be expected in patients with a milder form of keratoconus.
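    The five-grade classification above is just a set of CDVA thresholds. As a minimal sketch (the function name and the decimal-acuity input are illustrative assumptions, not part of the study), the rule can be expressed as:

```python
def cdva_grade(cdva: float) -> str:
    """Classify preoperative CDVA (decimal acuity) into the study's grades.

    Thresholds follow the abstract: grade I >= 0.90; grade II >= 0.60;
    grade III >= 0.40; grade IV >= 0.20; 'plus' below 0.20.
    Function name and structure are illustrative, not from the study.
    """
    if cdva >= 0.90:
        return "I"
    elif cdva >= 0.60:
        return "II"
    elif cdva >= 0.40:
        return "III"
    elif cdva >= 0.20:
        return "IV"
    else:
        return "plus"
```

    Note that each boundary value (e.g. 0.60) falls into the higher grade, matching the "equal to or better than" wording of the abstract.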

    Stress-Induced Reinstatement of Drug Seeking: 20 Years of Progress

    In human addicts, drug relapse and craving are often provoked by stress. Since 1995, this clinical scenario has been studied using a rat model of stress-induced reinstatement of drug seeking. Here, we first discuss the generality of stress-induced reinstatement across different drugs of abuse, different stressors, and different behavioral procedures. We also discuss the neuropharmacological mechanisms, brain areas, and circuits controlling stress-induced reinstatement of drug seeking. We conclude by discussing results from translational human laboratory studies and clinical trials that were inspired by results from rat studies on stress-induced reinstatement. Our main conclusions are: (1) The phenomenon of stress-induced reinstatement, first shown with an intermittent footshock stressor in rats trained to self-administer heroin, generalizes to other abused drugs, including cocaine, methamphetamine, nicotine, and alcohol, and is also observed in the conditioned place preference model in rats and mice. This phenomenon, however, is stressor specific, and not all stressors induce reinstatement of drug seeking. (2) Neuropharmacological studies indicate the involvement of corticotropin-releasing factor (CRF), noradrenaline, dopamine, glutamate, kappa/dynorphin, and several other peptide and neurotransmitter systems in stress-induced reinstatement. Neuropharmacology and circuitry studies indicate the involvement of CRF and noradrenaline transmission in the bed nucleus of the stria terminalis and central amygdala, and of dopamine, CRF, kappa/dynorphin, and glutamate transmission in other components of the mesocorticolimbic dopamine system (ventral tegmental area, medial prefrontal cortex, orbitofrontal cortex, and nucleus accumbens). (3) Translational human laboratory studies and a recent clinical trial show the efficacy of alpha-2 adrenoceptor agonists in decreasing stress-induced drug craving and stress-induced initial heroin lapse.

    Effectiveness and economic analysis of the whole cell/recombinant B subunit (WC/rbs) inactivated oral cholera vaccine in the prevention of traveller's diarrhoea

    Background: There is ongoing debate about the indication of the oral whole-cell/recombinant B-subunit cholera vaccine (WC/rBS) for traveller's diarrhoea, yet no cost-benefit analysis based on real data has been published. Methods: A cost-effectiveness and cost-benefit study of the oral cholera vaccine (WC/rBS, Dukoral¼) for the prevention of traveller's diarrhoea (TD) was performed in subjects travelling to cholera risk areas. The effectiveness of the WC/rBS vaccine in the prevention of TD was analyzed in 362 travellers attending two International Vaccination Centres in Spain between May and September 2005. Results: The overall vaccine efficacy against TD was 42.6%. Direct healthcare-related costs as well as indirect costs (lost vacation days) resulting from the disease were considered. Preventive vaccination against TD resulted in a mean saving of €79.26 per traveller. Conclusion: According to the cost-benefit analysis performed, the recommendation for WC/rBS vaccination in subjects travelling to zones at risk of TD is beneficial for the traveller, regardless of trip duration and visited continent.

    Can computerized clinical decision support systems improve practitioners' diagnostic test ordering behavior? A decision-maker-researcher partnership systematic review

    Background: Underuse and overuse of diagnostic tests have important implications for health outcomes and costs. Decision support technology purports to optimize the use of diagnostic tests in clinical practice. The objective of this review was to assess whether computerized clinical decision support systems (CCDSSs) are effective at improving the ordering of tests for diagnosis, monitoring of disease, or monitoring of treatment. The outcome of interest was the effect on the diagnostic test-ordering behavior of practitioners. Methods: We conducted a decision-maker-researcher partnership systematic review. We searched MEDLINE, EMBASE, Ovid's EBM Reviews database, Inspec, and reference lists for eligible articles published up to January 2010. We included randomized controlled trials comparing the use of CCDSSs to usual practice or non-CCDSS controls in clinical care settings. Trials were eligible if at least one component of the CCDSS gave suggestions for ordering or performing a diagnostic procedure. We considered studies 'positive' if they showed a statistically significant improvement in at least 50% of test-ordering outcomes. Results: Thirty-five studies were identified, with significantly higher methodological quality in those published after the year 2000 (p = 0.002). Thirty-three trials reported evaluable data on diagnostic test ordering, and 55% (18/33) of CCDSSs improved testing behavior overall, including 83% (5/6) for diagnosis, 63% (5/8) for treatment monitoring, 35% (6/17) for disease monitoring, and 100% (3/3) for other purposes. Four of the systems explicitly attempted to reduce test-ordering rates, and all succeeded. Factors of particular interest to decision makers, including costs, user satisfaction, and impact on workflow, were rarely investigated or reported. Conclusions: Some CCDSSs can modify practitioner test-ordering behavior. To better inform development and implementation efforts, studies should describe in more detail potentially important factors such as system design, user interface, local context, and implementation strategy, and should evaluate impact on user satisfaction, workflow, costs, and unintended consequences.
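    The review's decision rule for labelling a trial 'positive' (a statistically significant improvement in at least 50% of test-ordering outcomes) is easy to state precisely in code. A minimal sketch, assuming each outcome is reduced to a boolean "significantly improved" flag (the function name and data shape are illustrative, not from the review protocol):

```python
def study_is_positive(outcomes: list[bool]) -> bool:
    """Apply the review's decision rule to one trial.

    `outcomes` holds one boolean per test-ordering outcome, True when
    that outcome showed a statistically significant improvement. The
    trial is 'positive' when at least half of its outcomes improved.
    """
    if not outcomes:
        raise ValueError("a trial must report at least one outcome")
    return sum(outcomes) / len(outcomes) >= 0.5
```

    The `>=` keeps a trial with exactly half of its outcomes improved on the 'positive' side, matching the "at least 50%" wording.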

    Acute Pain and a Motivational Pathway in Adult Rats: Influence of Early Life Pain Experience

    The importance of neonatal experience for behaviour in later life is increasingly recognised. The overlap between pain and reward pathways led us to hypothesise that neonatal pain experience influences reward-related pathways and behaviours in adulthood.

    Re-evaluation of blood mercury, lead and cadmium concentrations in the Inuit population of Nunavik (Québec): a cross-sectional study

    Background: Arctic populations are exposed to mercury, lead, and cadmium through their traditional diet. Studies have shown, however, that cadmium exposure is most often attributable to tobacco smoking. The aim of this study was to examine trends in mercury, lead, and cadmium exposure between 1992 and 2004 in the Inuit population of Nunavik (Northern Québec, Canada) using data obtained from two broad-scale health surveys, and to identify sources of exposure in 2004. Methods: In 2004, 917 adults aged between 18 and 74 were recruited in the 14 communities of Nunavik to participate in a broad-scale health survey. Blood samples were collected and analysed for metals by inductively coupled plasma mass spectrometry, and dietary and lifestyle characteristics were documented by questionnaires. Results were compared with data obtained in 1992, when 492 people were recruited for a similar survey in the same population. Results: The mean blood concentration of mercury was 51.2 nmol/L, representing a 32% decrease (p < 0.001) between 1992 and 2004. Mercury blood concentrations were mainly explained by age (partial rÂČ = 0.20; p < 0.0001), and the most important source of exposure to mercury was marine mammal meat consumption (partial rÂČ = 0.04; p < 0.0001). In 2004, the mean blood concentration of lead was 0.19 ÎŒmol/L, a 55% decrease since 1992. No strong associations were observed with any dietary source, and lead concentrations were mainly explained by age (partial rÂČ = 0.20; p < 0.001). Blood cadmium concentrations showed a 22% decrease (p < 0.001) between 1992 and 2004. Once stratified according to tobacco use, means varied between 5.3 nmol/L in never-smokers and 40.4 nmol/L in smokers. Blood cadmium concentrations were mainly associated with tobacco smoking (partial rÂČ = 0.56; p < 0.0001), while consumption of caribou liver and kidney remained a minor source of cadmium exposure among never-smokers. Conclusion: Important decreases in mercury, lead, and cadmium exposure were observed. The mercury decrease could be explained by dietary changes, and the ban on the use of lead cartridges likely contributed to the decrease in lead exposure. Blood cadmium concentrations remain high, underscoring the need for intensive tobacco smoking prevention campaigns in the Nunavik population.

    Computerized clinical decision support systems for chronic disease management: A decision-maker-researcher partnership systematic review

    Background: The use of computerized clinical decision support systems (CCDSSs) may improve chronic disease management, which requires recurrent visits to multiple health professionals, ongoing disease and treatment monitoring, and patient behavior modification. The objective of this review was to determine whether CCDSSs improve the processes of chronic care (such as diagnosis, treatment, and monitoring of disease) and associated patient outcomes (such as effects on biomarkers and clinical exacerbations). Methods: We conducted a decision-maker-researcher partnership systematic review. We searched MEDLINE, EMBASE, Ovid's EBM Reviews database, Inspec, and reference lists for potentially eligible articles published up to January 2010. We included randomized controlled trials that compared the use of CCDSSs to usual practice or non-CCDSS controls. Trials were eligible if at least one component of the CCDSS was designed to support chronic disease management. We considered studies 'positive' if they showed a statistically significant improvement in at least 50% of relevant outcomes. Results: Of 55 included trials, 87% (n = 48) measured system impact on the process of care, and 52% (n = 25) of those demonstrated statistically significant improvements. Sixty-five percent (36/55) of trials measured impact on patient outcomes, typically non-major (surrogate) outcomes, and 31% (n = 11) of those demonstrated benefits. Factors of interest to decision makers, such as cost, user satisfaction, system interface and feature sets, unique design and deployment characteristics, and effects on user workflow, were rarely investigated or reported. Conclusions: A small majority (just over half) of CCDSSs improved care processes in chronic disease management, and some improved patient health. Policy makers, healthcare administrators, and practitioners should be aware that the evidence of CCDSS effectiveness is limited, especially with respect to the small number and size of studies measuring patient outcomes.
    • 

    corecore