
    Cerebellar Integrity in the Amyotrophic Lateral Sclerosis - Frontotemporal Dementia Continuum

    Amyotrophic lateral sclerosis (ALS) and behavioural variant frontotemporal dementia (bvFTD) are multisystem neurodegenerative disorders that manifest overlapping cognitive, neuropsychiatric and motor features. The cerebellum has long been known to be crucial for intact motor function, although emerging evidence over the past decade has attributed cognitive and neuropsychiatric processes to this structure. The current study set out i) to establish the integrity of cerebellar subregions in the amyotrophic lateral sclerosis-behavioural variant frontotemporal dementia spectrum (ALS-bvFTD) and ii) to determine whether specific regions of cerebellar atrophy are associated with cognitive, neuropsychiatric and motor symptoms in these patients. Seventy-eight patients diagnosed with ALS, ALS-bvFTD or bvFTD, most without C9ORF72 gene abnormalities, and healthy controls were investigated. Participants underwent cognitive, neuropsychiatric and functional evaluation as well as structural imaging using voxel-based morphometry (VBM) to examine the grey matter subregions of the cerebellar lobules, vermis and crus. VBM analyses revealed: i) significant grey matter atrophy in the cerebellum across the whole ALS-bvFTD continuum; ii) atrophy predominantly of the superior cerebellum and crus in bvFTD patients and of the inferior cerebellum and vermis in ALS patients, while ALS-bvFTD patients showed both patterns of atrophy. Post-hoc covariance analyses revealed that cognitive and neuropsychiatric symptoms were particularly associated with atrophy of the crus and superior lobule, while motor symptoms were more associated with atrophy of the inferior lobules. Taken together, these findings indicate an important role of the cerebellum in the ALS-bvFTD disease spectrum, with all three clinical phenotypes demonstrating specific patterns of subregional atrophy that are associated with different symptomatology.

    Treatment of hemangiomas in children using a Nd:YAG laser in conjunction with ice cooling of the epidermis: techniques and results

    BACKGROUND: Hemangiomas are the most common type of congenital anomaly in childhood. Although many resolve spontaneously, intervention is required when their growth could damage vital adjacent structures. Various therapeutic approaches to childhood hemangiomas with different types of laser have been described previously. The objective of this study was to determine whether cooling of the epidermis during irradiation of hemangiomas with a Nd:YAG laser prevents thermal damage and decreases the number of sessions required to treat these lesions. METHODS: Between 1993 and 2001, 110 patients aged 3 months to 4 years with cutaneous hemangiomas were treated with a Nd:YAG laser. The lesion was cooled with ice prior to, during, and after irradiation; during each session the laser beam passed through the pieces of ice. The laser power was between 35 and 45 W with a pulse length of 2–10 seconds. RESULTS: After 6 months of follow-up from the first session of laser treatment, total resolution was obtained in 72 (65.5%) patients. A second or third session followed in 30 of the 38 patients in whom the initial results were good, moderate, or poor; the parents of the remaining eight children refused a second session and these patients were excluded from the study. Complications were seen in nine (8.8%) patients: one patient had postoperative bleeding which stopped spontaneously, atrophic scars occurred in six (5.8%) patients, and hypertrophic scars in two (1.9%) patients. CONCLUSIONS: Nd:YAG laser irradiation in conjunction with ice protection of the epidermis produces good cosmetic results in the treatment of cutaneous hemangiomas in children and decreases the number of sessions required to treat these lesions.

    Getting it right when budgets are tight: Using optimal expansion pathways to prioritize responses to concentrated and mixed HIV epidemics.

    BACKGROUND: Prioritizing investments across health interventions is complicated by the nonlinear relationship between intervention coverage and epidemiological outcomes. It can be difficult for countries to know which interventions to prioritize for greatest epidemiological impact, particularly when budgets are uncertain. METHODS: We examined four case studies of HIV epidemics in diverse settings, each with different characteristics. These case studies were based on public data available for Belarus, Peru, Togo, and Myanmar. The Optima HIV model and software package was used to estimate the optimal distribution of resources across interventions for a range of budget envelopes. We constructed "investment staircases", a useful tool for understanding investment priorities, and used them to estimate the best attainable cost-effectiveness of the response at each investment level. FINDINGS: We find that when budgets are very limited, the optimal HIV response consists of a smaller number of 'core' interventions. As budgets increase, those core interventions should first be scaled up and then new interventions introduced. We estimate that the cost-effectiveness of HIV programming decreases as investment levels increase, but that the overall cost-effectiveness remains below GDP per capita. SIGNIFICANCE: It is important for HIV programming to respond effectively to the overall level of funding available. The analytic tools presented here can help program planners understand the most cost-effective HIV responses and plan for an uncertain future.
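    To illustrate the idea behind an "investment staircase", the sketch below sweeps a range of budget envelopes, allocates each budget greedily across a set of hypothetical interventions with diminishing returns, and reports the resulting cost-effectiveness at each level. The interventions, unit costs and impact function are invented placeholders, not Optima HIV data or its actual algorithm.

```python
import math

# Hypothetical interventions: (name, cost per person reached,
# infections averted per person reached at low coverage).
INTERVENTIONS = [
    ("ART", 120.0, 0.030),
    ("PMTCT", 200.0, 0.025),
    ("Key population prevention", 80.0, 0.015),
    ("Condom programmes", 15.0, 0.002),
]

def infections_averted(spend, unit_cost, effect, saturation=5e6):
    """Toy diminishing-returns impact of spending on one intervention."""
    people_reached = spend / unit_cost
    return effect * saturation * (1 - math.exp(-people_reached / saturation))

def allocate(budget, step=100_000.0):
    """Greedy allocation: repeatedly give the next increment of spending to the
    intervention with the largest marginal impact, until the budget is spent."""
    spend = {name: 0.0 for name, _, _ in INTERVENTIONS}
    remaining = budget
    while remaining >= step:
        gains = {
            name: infections_averted(spend[name] + step, cost, eff)
                  - infections_averted(spend[name], cost, eff)
            for name, cost, eff in INTERVENTIONS
        }
        best = max(gains, key=gains.get)
        spend[best] += step
        remaining -= step
    impact = sum(infections_averted(spend[n], c, e) for n, c, e in INTERVENTIONS)
    return spend, impact

# One "stair" per budget envelope: optimal mix and cost per infection averted.
for budget in (1e6, 5e6, 10e6, 20e6):
    spend, impact = allocate(budget)
    print(f"Budget ${budget:,.0f}: cost per infection averted "
          f"${budget / impact:,.0f}; allocation {spend}")
```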

    Comparative analysis of the lambda-interferons IL-28A and IL-29 regarding their transcriptome and their antiviral properties against hepatitis C virus.

    Specific differences in signaling and antiviral properties between the different lambda-interferons, a novel group of interferons composed of IL-28A, IL-28B and IL-29, are currently unknown. This is the first study to comparatively investigate the transcriptome and the antiviral properties of the lambda-interferons IL-28A and IL-29. Expression studies were performed by microarray analysis, quantitative PCR (qPCR), reporter gene assays and immunoluminometric assays. Signaling was analyzed by Western blot. HCV replication was measured in Huh-7 cells expressing a subgenomic HCV replicon. All hepatic cell lines investigated, as well as primary hepatocytes, expressed both IFN-λ receptor subunits IL-10R2 and IFN-λR1. Both IL-28A and IL-29 activated STAT1 signaling. As revealed by microarray analysis, similar genes were induced by both cytokines in Huh-7 cells (IL-28A: 117 genes; IL-29: 111 genes), many of them playing a role in antiviral immunity. However, only IL-28A was able to significantly down-regulate gene expression (n = 272 down-regulated genes). Both cytokines significantly decreased HCV replication in Huh-7 cells. In comparison to liver biopsies of patients with non-viral liver disease, liver biopsies of patients with HCV showed significantly increased mRNA expression of IL-28A and IL-29. Moreover, IL-28A serum protein levels were elevated in HCV patients. In a murine model of viral hepatitis, IL-28 expression was significantly increased. IL-28A and IL-29 are up-regulated in HCV patients and are similarly effective in inducing antiviral genes and inhibiting HCV replication. In contrast to IL-29, IL-28A is a potent gene repressor. Both IFN-λs may have therapeutic potential in the treatment of chronic HCV.

    Strong Host-Feeding Preferences of the Vector Triatoma infestans Modified by Vector Density: Implications for the Epidemiology of Chagas Disease

    Chagas disease is a complex zoonosis with more than 150 mammalian host species, nearly a dozen blood-sucking triatomine species as main vectors, and 9–11 million people infected with Trypanosoma cruzi (its causal agent) in the Americas. Triatoma infestans, a highly domesticated species and one of the main vectors, feeds more often on domestic animals than on humans in northern Argentina. The question of whether there are host-feeding preferences among dogs, cats, and chickens is crucial to estimating transmission risks and predicting the effects of control tactics targeting these hosts. This article reports the first host-choice experiments with triatomine bugs conducted in small huts under natural conditions. The results demonstrate that T. infestans consistently preferred dogs to chickens or cats, with host shifts occurring more frequently at higher vector densities. Combined with earlier findings showing that dogs have high infection rates, are highly infectious, and have high contact rates with humans and domestic bugs, our results reinforce the role of dogs as the key reservoirs of T. cruzi. The strong bug preference for dogs can be exploited by treating dogs with topical lotions or insecticide-impregnated collars to turn them into baited lethal traps, or by using them as transmission or infestation sentinels.

    Anticancer Gene Transfer for Cancer Gene Therapy

    Gene therapy vectors are among the approaches currently used to treat malignant tumors. Such vectors carry a specific therapeutic transgene that causes death in cancer cells. In early attempts at gene therapy, therapeutic transgenes were delivered by non-specific vectors, which induced toxicity in normal cells in addition to cancer cells. Recently, novel cancer-specific viral vectors have been developed that target cancer cells while leaving normal cells unharmed. Here we review such cancer-specific gene therapy systems currently used in the treatment of cancer and discuss the major challenges and future directions in this field.

    The Brain Matures with Stronger Functional Connectivity and Decreased Randomness of Its Network

    We investigated the development of the brain's functional connectivity throughout the life span (ages 5 through 71 years) by measuring EEG activity in a large population-based sample. Connectivity was established with Synchronization Likelihood. Relative randomness of the connectivity patterns was established with Watts and Strogatz's (1998) graph parameters C (local clustering) and L (global path length) for alpha (∼10 Hz), beta (∼20 Hz), and theta (∼4 Hz) oscillation networks. From childhood to adolescence, large increases in connectivity in the alpha, theta and beta frequency bands were found that continued at a slower pace into adulthood (peaking at ∼50 yrs). Connectivity changes were accompanied by increases in L and C, reflecting decreases in network randomness or increased order (peak levels reached at ∼18 yrs). Older age (55+) was associated with weakened connectivity. Semi-automatically segmented T1-weighted MRI images of 104 young adults revealed that connectivity was significantly correlated with cerebral white matter volume (alpha oscillations: r = .33, p < .01; theta: r = .22, p < .05), while path length was related to both white matter (alpha: max. r = .38, p < .001) and gray matter (alpha: max. r = .36, p < .001; theta: max. r = .36, p < .001) volumes. In conclusion, EEG connectivity and graph theoretical network analysis may be used to trace structural and functional development of the brain.
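    As a concrete illustration of the graph measures mentioned above, the sketch below thresholds a (randomly generated) connectivity matrix into a binary network and computes the Watts-Strogatz clustering coefficient C and characteristic path length L with networkx, comparing them against a size-matched random graph. The data, channel count and threshold are assumptions for illustration only and do not reproduce the study's Synchronization Likelihood pipeline.

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)

# Placeholder "connectivity" matrix for 19 EEG channels (symmetric, zero diagonal).
n_channels = 19
sl = rng.uniform(0.0, 1.0, size=(n_channels, n_channels))
sl = (sl + sl.T) / 2
np.fill_diagonal(sl, 0.0)

# Threshold to a binary adjacency matrix (arbitrary, assumed cut-off).
graph = nx.from_numpy_array((sl > 0.5).astype(int))

def path_length(g):
    """Characteristic path length L over the largest connected component."""
    giant = g.subgraph(max(nx.connected_components(g), key=len))
    return nx.average_shortest_path_length(giant)

C = nx.average_clustering(graph)   # local clustering
L = path_length(graph)             # global path length

# Randomness is judged relative to a random graph of the same size and density.
random_ref = nx.gnm_random_graph(graph.number_of_nodes(), graph.number_of_edges(), seed=1)
C_rand = nx.average_clustering(random_ref)
L_rand = path_length(random_ref)

print(f"C = {C:.3f} (random reference {C_rand:.3f})")
print(f"L = {L:.3f} (random reference {L_rand:.3f})")
```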

    How should HIV resources be allocated? Lessons learnt from applying Optima HIV in 23 countries.

    INTRODUCTION: With limited funds available, meeting global health targets requires countries both to mobilize and to prioritize their health spending. Within this context, countries have recognized the importance of allocating funds for HIV as efficiently as possible to maximize impact. Over the past six years, the governments of 23 countries in Africa, Asia, Eastern Europe and Latin America have used the Optima HIV tool to estimate the optimal allocation of HIV resources. METHODS: Each study commenced with a request by the national government for technical assistance in conducting an HIV allocative efficiency study using Optima HIV. Each study team validated the required data, calibrated the Optima HIV epidemic model to produce HIV epidemic projections, agreed on cost functions for interventions, and used the model to calculate the optimal allocation of available funds to best address national strategic plan targets. From a review and analysis of these 23 country studies, we extract common themes around the optimal allocation of HIV funding in different epidemiological contexts. RESULTS AND DISCUSSION: The optimal distribution of HIV resources depends on the amount of funding available and the characteristics of each country's epidemic, response and targets. Universally, the modelling results indicated that scaling up treatment coverage is an efficient use of resources. There is scope for efficiency gains by targeting the HIV response towards the populations and geographical regions where HIV incidence is highest. Across a range of countries, the model results indicate that a more efficient allocation of HIV resources could reduce cumulative new HIV infections by an average of 18% by 2020 and 25% by 2030, along with an approximately 25% reduction in deaths over both timelines. However, in most countries this would still not be sufficient to meet the targets of the national strategic plan, with modelling results indicating that budget increases of up to 185% would be required. CONCLUSIONS: Greater epidemiological impact would be possible through better targeting of existing resources, but additional resources would still be required to meet targets. Allocative efficiency models have proven valuable in improving the HIV planning and budgeting process.
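    The allocation step described above can be thought of as a constrained optimization: choose how to split a fixed budget across programmes so that projected new infections are minimized. The sketch below solves a toy version of that problem with a generic optimizer; the programme cost functions, baseline figures and budget are hypothetical and are not taken from Optima HIV.

```python
import numpy as np
from scipy.optimize import minimize

BUDGET = 10_000_000.0           # total funds to allocate (hypothetical)
BASELINE_INFECTIONS = 20_000.0  # projected new infections with zero spending

# Hypothetical saturating cost functions: infections averted by each programme
# as a function of spending on it (diminishing returns).
MAX_EFFECT = np.array([3000.0, 1500.0, 800.0])   # ceiling of infections averted
HALF_SPEND = np.array([2e6, 1e6, 4e6])           # spend that achieves half the ceiling

def projected_new_infections(spend):
    averted = MAX_EFFECT * spend / (spend + HALF_SPEND)
    return BASELINE_INFECTIONS - averted.sum()

result = minimize(
    projected_new_infections,
    x0=np.full(3, BUDGET / 3),                   # start from an even split
    bounds=[(0.0, BUDGET)] * 3,
    constraints=[{"type": "eq", "fun": lambda x: x.sum() - BUDGET}],
    method="SLSQP",
)

print("Optimal spend per programme:", np.round(result.x, -3))
print("Projected new infections:", round(projected_new_infections(result.x)))
```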