345 research outputs found

    Bupropion for the treatment of fluoxetine non-responsive trichotillomania: a case report

    Introduction: Trichotillomania, classified as an impulse control disorder in the Diagnostic and Statistical Manual of Mental Disorders, is characterized by the recurrent pulling out of one's hair, resulting in noticeable hair loss. The condition has a varied etiology. Selective serotonin reuptake inhibitors are considered the treatment of choice; however, some patients fail to respond to this class of drugs. A few older reports suggest possible benefit from treatment with bupropion. Case presentation: A 23-year-old Asian woman with fluoxetine non-responsive trichotillomania was treated with sustained-release bupropion (up to 450 mg/day) and cognitive behavior therapy. She demonstrated clinically significant improvement on the Clinical Global Impression - Improvement scale by week 13. The improvement persisted throughout the 12-month follow-up period. Conclusions: This case report may be of interest to psychiatrists and dermatologists. Apart from the serotonergic pathway, others, such as the mesolimbic pathway, also appear to be involved in the causation of trichotillomania. Bupropion may be considered as an alternative pharmacological treatment for patients who do not respond to selective serotonin reuptake inhibitors. However, this initial finding needs to be confirmed by well-designed, double-blind, placebo-controlled trials.

    Experiments on Multidimensional Solitons

    This article presents an overview of experimental efforts in recent years related to multidimensional solitons in Bose-Einstein condensates. We discuss the techniques used to generate and observe multidimensional nonlinear waves in Bose-Einstein condensates with repulsive interactions. We further summarize observations of planar soliton fronts undergoing the snake instability, the formation of vortex rings, and the emergence of hybrid structures. Comment: review paper, to appear as Chapter 5b in "Emergent Nonlinear Phenomena in Bose-Einstein Condensates: Theory and Experiment," edited by P. G. Kevrekidis, D. J. Frantzeskakis, and R. Carretero-Gonzalez (Springer-Verlag).

    Dispelling urban myths about default uncertainty factors in chemical risk assessment - Sufficient protection against mixture effects?

    Assessing the detrimental health effects of chemicals requires the extrapolation of experimental data in animals to human populations. This is achieved by applying a default uncertainty factor of 100 to doses not found to be associated with observable effects in laboratory animals. It is commonly assumed that the toxicokinetic and toxicodynamic sub-components of this default uncertainty factor represent worst-case scenarios and that the multiplication of those components yields conservative estimates of safe levels for humans. It is sometimes claimed that this conservatism also offers adequate protection from mixture effects. By analysing the evolution of uncertainty factors from a historical perspective, we show that the default factor and its sub-components are intended to represent adequate rather than worst-case scenarios. The intention of using assessment factors for mixture effects was abandoned thirty years ago. It is also often ignored that the conservatism (or otherwise) of uncertainty factors can only be considered in relation to a defined level of protection. A protection equivalent to an effect magnitude of 0.001-0.0001% over background incidence is generally considered acceptable. However, it is impossible to say whether this level of protection is in fact realised with the tolerable doses that are derived by employing uncertainty factors. Accordingly, it is difficult to assess whether uncertainty factors overestimate or underestimate the sensitivity differences in human populations. It is also often not appreciated that the outcome of probabilistic approaches to the multiplication of sub-factors depends on the choice of probability distributions. Therefore, the idea that default uncertainty factors are overly conservative worst-case scenarios which can both account for the lack of statistical power in animal experiments and protect against potential mixture effects is ill-founded. We contend that precautionary regulation should provide an incentive to generate better data and recommend adopting a pragmatic, but scientifically better-founded, approach to mixture risk assessment.
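    As a rough illustration of the arithmetic discussed above (a minimal sketch; the NOAEL, sub-factor sizes, and lognormal spreads below are invented for the example, not values from the article), the deterministic division by 100 can be contrasted with a probabilistic multiplication of sub-factors, whose upper percentiles depend strongly on the assumed distributions:

```python
# Hypothetical illustration of default uncertainty-factor arithmetic.
# All numbers are invented for the sketch, not taken from the article.
import numpy as np

noael_mg_per_kg = 50.0      # hypothetical no-observed-adverse-effect level from an animal study
uf_interspecies = 10.0      # default animal-to-human sub-factor
uf_intraspecies = 10.0      # default human-variability sub-factor
tolerable_dose = noael_mg_per_kg / (uf_interspecies * uf_intraspecies)
print(f"Deterministic tolerable dose: {tolerable_dose:.2f} mg/kg/day")

# Probabilistic multiplication of sub-factors: the combined factor's upper
# percentile shifts markedly with the assumed lognormal spread, which is the
# distribution-dependence point made in the abstract.
rng = np.random.default_rng(0)
for sigma in (0.4, 0.8):    # hypothetical spreads on the log scale
    combined = (np.exp(rng.normal(np.log(10), sigma, 100_000))
                * np.exp(rng.normal(np.log(10), sigma, 100_000)))
    print(f"sigma={sigma}: 95th percentile of combined factor = {np.percentile(combined, 95):.0f}")
```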

    Does 'bigger' mean 'better'? Pitfalls and shortcuts associated with big data for social research

    'Big data is here to stay.' This key statement has a double value: it is an assumption as well as the reason why a theoretical reflection is needed. Big data is gaining visibility and success in the social sciences, even overcoming the division between the humanities and computer science. This contribution first outlines some considerations on the presence, and the likely persistence, of big data as a socio-technical assemblage, and then develops the intriguing opportunities for social research that arise from this interaction between practices and technological development. However, despite the promissory rhetoric fostered by several scholars since the birth of big data as a labelled concept, some risks are just around the corner. Claims about the methodological power of ever-bigger datasets, and about increasing speed in analysis and data collection, are creating a real hype in social research, and particular attention is needed to avoid some pitfalls. These risks are analysed with respect to the validity of research results obtained through big data. After this pars destruens, the contribution concludes with a pars construens: building on the preceding critiques, a mixed-methods research design is described as a general proposal, with the objective of stimulating a debate on the integration of big data into complex research projects.

    Situational Awareness of Influenza Activity Based on Multiple Streams of Surveillance Data Using Multivariate Dynamic Linear Model

    BACKGROUND: Multiple sources of influenza surveillance data are becoming more available; however, integration of these data streams for situational awareness of influenza activity is less explored. METHODS AND RESULTS: We applied multivariate time-series methods to sentinel outpatient and school absenteeism surveillance data in Hong Kong during 2004-2009. School absenteeism data and outpatient surveillance data experienced interruptions due to school holidays and changes in public health guidelines during the pandemic, including school closures and the establishment of special designated flu clinics, which in turn provided 'drop-in' fever counts surveillance data. A multivariate dynamic linear model was used to monitor influenza activity throughout epidemics based on all available data. The inferred level followed influenza activity closely at different times, while the inferred trend was less reliable when influenza activity was low. Correlations between the inferred level and trend from the multivariate model and reference influenza activity, measured by the product of weekly laboratory influenza detection rates and weekly general practitioner influenza-like illness consultation rates, were calculated and compared with those from univariate models. Over the whole study period, there was a significantly higher correlation (rho = 0.82, p ≤ 0.02) for the inferred trend based on the multivariate model compared to the univariate models, while the inferred trend from the multivariate model performed as well as the best univariate model in the pre-pandemic and pandemic periods. The inferred trend and level from the multivariate model were able to match, if not outperform, the best univariate model despite missing data and the drop-in and drop-out of different surveillance data streams. An overall influenza index combining level and trend was constructed to demonstrate another potential use of the method. CONCLUSIONS: Our results demonstrate the potential use of multiple streams of influenza surveillance data to promote situational awareness about the level and trend of seasonal and pandemic influenza activity.
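    As a minimal sketch (not the authors' code) of how a dynamic linear model can fuse several surveillance streams with gaps, the example below assumes a shared level-and-trend state observed by three hypothetical, standardized streams; missing observations are simply skipped in the Kalman update:

```python
# Sketch of a multivariate dynamic linear model: a common [level, trend] state
# observed through several surveillance streams, with NaN marking missing data.
import numpy as np

def kalman_filter(y, obs_var, state_var=0.05):
    """y: (T, k) array of k streams, NaN where a stream is unavailable."""
    F = np.array([[1.0, 1.0], [0.0, 1.0]])   # local-linear-trend transition
    Q = state_var * np.eye(2)                # state evolution noise
    T, k = y.shape
    x, P = np.zeros(2), np.eye(2)            # diffuse-ish initial state
    levels, trends = [], []
    for t in range(T):
        x, P = F @ x, F @ P @ F.T + Q        # predict step
        for j in range(k):                   # update with each available stream
            if np.isnan(y[t, j]):
                continue
            H = np.array([1.0, 0.0])         # each stream observes the common level
            S = H @ P @ H + obs_var[j]
            K = P @ H / S
            x = x + K * (y[t, j] - H @ x)
            P = P - np.outer(K, H @ P)
        levels.append(x[0]); trends.append(x[1])
    return np.array(levels), np.array(trends)

# Hypothetical usage: three standardized streams (e.g. ILI rate, absenteeism, fever counts),
# with the absenteeism stream dropping out during a simulated school holiday.
rng = np.random.default_rng(1)
true_level = np.sin(np.linspace(0, 6, 120))
y = np.stack([true_level + rng.normal(0, s, 120) for s in (0.2, 0.5, 0.3)], axis=1)
y[40:60, 1] = np.nan
level, trend = kalman_filter(y, obs_var=np.array([0.04, 0.25, 0.09]))
```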

    Theory of Multidimensional Solitons

    We review a number of topics germane to higher-dimensional solitons in Bose-Einstein condensates. For dark solitons, we discuss dark band and planar solitons; ring dark solitons and spherical shell solitons; solitary waves in restricted geometries; vortex rings and rarefaction pulses; and multi-component Bose-Einstein condensates. For bright solitons, we discuss instability, stability, and metastability; bright soliton engineering, including pulsed atom lasers; solitons in a thermal bath; soliton-soliton interactions; and bright ring solitons and quantum vortices. A thorough reference list is included. Comment: review paper, to appear as Chapter 5a in "Emergent Nonlinear Phenomena in Bose-Einstein Condensates: Theory and Experiment," edited by P. G. Kevrekidis, D. J. Frantzeskakis, and R. Carretero-Gonzalez (Springer-Verlag).

    Systematic review and meta-analysis of the diagnostic accuracy of ultrasonography for deep vein thrombosis

    Background: Ultrasound (US) has largely replaced contrast venography as the definitive diagnostic test for deep vein thrombosis (DVT). We aimed to derive a definitive estimate of the diagnostic accuracy of US for clinically suspected DVT and identify study-level factors that might predict accuracy. Methods: We undertook a systematic review, meta-analysis and meta-regression of diagnostic cohort studies that compared US to contrast venography in patients with suspected DVT. We searched Medline, EMBASE, CINAHL, Web of Science, Cochrane Database of Systematic Reviews, Cochrane Controlled Trials Register, Database of Reviews of Effectiveness, the ACP Journal Club, and citation lists (1966 to April 2004). Random-effects meta-analysis was used to derive pooled estimates of sensitivity and specificity. Random-effects meta-regression was used to identify study-level covariates that predicted diagnostic performance. Results: We identified 100 cohorts comparing US to venography in patients with suspected DVT. Overall sensitivity (95% confidence interval) was 94.2% (93.2 to 95.0) for proximal DVT and 63.5% (59.8 to 67.0) for distal DVT, and specificity was 93.8% (93.1 to 94.4). Duplex US had pooled sensitivity of 96.5% (95.1 to 97.6) for proximal DVT and 71.2% (64.6 to 77.2) for distal DVT, and specificity of 94.0% (92.8 to 95.1). Triplex US had pooled sensitivity of 96.4% (94.4 to 97.1) for proximal DVT and 75.2% (67.7 to 81.6) for distal DVT, and specificity of 94.3% (92.5 to 95.8). Compression US alone had pooled sensitivity of 93.8% (92.0 to 95.3) for proximal DVT and 56.8% (49.0 to 66.4) for distal DVT, and specificity of 97.8% (97.0 to 98.4). Sensitivity was higher in more recently published studies and in cohorts with higher prevalence of DVT and more proximal DVT, and was lower in cohorts that reported interpretation by a radiologist. Specificity was higher in cohorts that excluded patients with previous DVT. No studies were identified that compared repeat US to venography in all patients. Repeat US appears to have a positive yield of 1.3%, with 89% of these being confirmed by venography. Conclusion: Combined colour Doppler US techniques have optimal sensitivity, while compression US has optimal specificity for DVT. However, all estimates are subject to substantial unexplained heterogeneity. The role of repeat scanning is very uncertain and based upon limited data.
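    For readers unfamiliar with the pooling step, here is a minimal sketch of DerSimonian-Laird random-effects pooling of sensitivities on the logit scale; the study counts are hypothetical, and the review's own analysis may differ in detail:

```python
# Sketch of random-effects (DerSimonian-Laird) pooling of sensitivities.
# The per-study counts below are hypothetical, not data from the review.
import numpy as np

def pool_sensitivity(tp, n):
    """tp: true positives per study, n: venography-positive patients per study."""
    tp, n = np.asarray(tp, float), np.asarray(n, float)
    p = (tp + 0.5) / (n + 1.0)                      # continuity-corrected proportions
    y = np.log(p / (1 - p))                         # logit sensitivity per study
    v = 1 / (tp + 0.5) + 1 / (n - tp + 0.5)         # approximate within-study variance
    w = 1 / v
    y_fixed = np.sum(w * y) / np.sum(w)
    Q = np.sum(w * (y - y_fixed) ** 2)              # Cochran's heterogeneity statistic
    tau2 = max(0.0, (Q - (len(y) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    w_re = 1 / (v + tau2)                           # random-effects weights
    y_re = np.sum(w_re * y) / np.sum(w_re)
    se = np.sqrt(1 / np.sum(w_re))
    expit = lambda x: 1 / (1 + np.exp(-x))
    return expit(y_re), (expit(y_re - 1.96 * se), expit(y_re + 1.96 * se)), tau2

sens, ci, tau2 = pool_sensitivity(tp=[45, 88, 30, 120], n=[48, 95, 33, 126])
print(f"Pooled sensitivity {sens:.3f}, 95% CI {ci[0]:.3f} to {ci[1]:.3f}, tau^2 = {tau2:.3f}")
```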

    Methylphenidate Decreased the Amount of Glucose Needed by the Brain to Perform a Cognitive Task

    The use of stimulants (methylphenidate and amphetamine) as cognitive enhancers by the general public is increasing and is controversial. It is still unclear how they work or why they improve performance in some individuals but impair it in others. To test the hypothesis that stimulants enhance the signal-to-noise ratio of neuronal activity and thereby reduce cerebral activity by increasing efficiency, we measured the effects of methylphenidate on brain glucose utilization in healthy adults. We measured brain glucose metabolism (using Positron Emission Tomography and 2-deoxy-2[18F]fluoro-D-glucose) in 23 healthy adults who were tested at baseline and while performing an accuracy-controlled cognitive task (numerical calculations) given with and without methylphenidate (20 mg, oral). Sixteen subjects underwent a fourth scan with methylphenidate but without cognitive stimulation. Compared to placebo, methylphenidate significantly reduced the amount of glucose utilized by the brain when performing the cognitive task, but methylphenidate did not affect brain metabolism when given without cognitive stimulation. Whole-brain metabolism increased 21% when the cognitive task was given with placebo, whereas with methylphenidate it increased 11% (50% less). This reflected a decrease both in the magnitude of activation and in the extent of the regions activated by the task. Methylphenidate's reduction of the metabolic increases in regions from the default network (implicated in mind-wandering) was associated with improvement in performance only in subjects who activated these regions when the cognitive task was given with placebo. These results corroborate prior findings that stimulant medications reduce the magnitude of regional activation to a task and, in addition, document a "focusing" of the activation. This effect may be beneficial when neuronal resources are diverted (i.e., mind-wandering) or impaired (i.e., attention deficit hyperactivity disorder), but it could be detrimental when brain activity is already optimally focused. This would explain why methylphenidate has beneficial effects in some individuals and contexts and detrimental effects in others.

    Strategies for Treating Latent Multiple-Drug Resistant Tuberculosis: A Decision Analysis

    BACKGROUND: The optimal treatment for latent multiple-drug resistant tuberculosis infection remains unclear. In anticipation of future clinical trials, we modeled the expected performance of six potential regimens for treatment of latent multiple-drug resistant tuberculosis. METHODS: We used a computerized Markov model to analyze the total cost of treatment for six different regimens: pyrazinamide/ethambutol, moxifloxacin monotherapy, moxifloxacin/pyrazinamide, moxifloxacin/ethambutol, moxifloxacin/ethionamide, and moxifloxacin/PA-824. Efficacy estimates were extrapolated from mouse models and examined over a wide range of assumptions. RESULTS: In the base case, moxifloxacin monotherapy was the lowest-cost strategy, but moxifloxacin/ethambutol was cost-effective at an incremental cost-effectiveness ratio of $21,252 per quality-adjusted life-year. Both pyrazinamide-containing regimens were dominated due to their toxicity. A hypothetical regimen of low toxicity and even modest efficacy was cost-effective compared to "no treatment." CONCLUSION: In our model, moxifloxacin/ethambutol was the preferred treatment strategy under a wide range of assumptions; pyrazinamide-containing regimens fared poorly because of high rates of toxicity. Although more data are needed on the efficacy of treatments for latent MDR-TB infection, data on toxicity and treatment discontinuation, which are easier to obtain, could have a substantial impact on public health practice.
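    As a hedged illustration of the incremental cost-effectiveness comparison described above (the strategy names are reused for readability, but the costs and QALYs below are invented, not the study's model outputs): a strategy is dominated when it costs more yet yields fewer QALYs than a cheaper option, and otherwise its ICER is computed against the next cheaper non-dominated strategy.

```python
# Simplified ICER/dominance sketch with hypothetical inputs (simple dominance
# only; a full analysis would also handle extended dominance).
def icers(strategies):
    """strategies: list of (name, cost, qaly) tuples."""
    s = sorted(strategies, key=lambda t: t[1])          # order by increasing cost
    frontier, out = [s[0]], [(s[0][0], None)]           # cheapest option is the reference
    for name, cost, qaly in s[1:]:
        _, base_cost, base_qaly = frontier[-1]
        if qaly <= base_qaly:                           # costs more, no QALY gain: dominated
            out.append((name, "dominated"))
            continue
        out.append((name, (cost - base_cost) / (qaly - base_qaly)))
        frontier.append((name, cost, qaly))
    return out

example = [("no treatment", 0, 20.00),
           ("moxifloxacin alone", 1500, 20.10),
           ("moxifloxacin/ethambutol", 2600, 20.15),
           ("pyrazinamide/ethambutol", 3200, 20.05)]
for name, icer in icers(example):
    label = "reference" if icer is None else icer if isinstance(icer, str) else f"${icer:,.0f}/QALY"
    print(f"{name}: {label}")
```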