
    Potential biomarkers of haemophilic arthropathy: correlations with compatible additive magnetic resonance imaging scores

    Introduction: Although biomarkers are useful diagnostic tools to assess joint damage in osteoarthritis and rheumatoid arthritis, few data exist for biomarkers of haemophilic arthropathy. Aim: To evaluate the association between biomarkers and compatible additive magnetic resonance imaging (MRI) scores in patients with severe haemophilia A. Methods: Patients aged 12–35 years with no history of factor VIII (FVIII) inhibitors were enrolled in a controlled, cross-sectional, multinational investigation. Patients received primary or secondary prophylaxis or on-demand treatment with FVIII and underwent MRI on four joints (two ankles, two knees). Soluble biomarkers of cartilage and bone degradation, inflammation, and angiogenesis were assessed (serum levels of C-terminal telopeptides of type I collagen [CTX-I], cartilage oligomeric matrix protein [COMP], chondroitin-sulphate aggrecan turnover 846 epitope [CS846], tissue inhibitor of metalloproteinase 1 [TIMP-1]; plasma levels of vascular endothelial growth factor [VEGF], matrix metalloproteinases 3 and 9 [MMP3, MMP9]). Relationships between biomarkers and MRI scores were evaluated using Spearman rank correlation. Results: Biomarkers were assessed in 117 of 118 per-protocol patients. Mean and median CTX-I, COMP, TIMP-1, MMP3, MMP9, and VEGF values were within normal ranges (reference range not available for CS846 in healthy volunteers). No correlations between biomarkers and MRI scores were found, with the exception of CS846, which showed significant correlation in a subgroup of 22 on-demand patients (r = 0.436; P = 0.04). Conclusions: Compatible additive MRI scores showed no clear correlations with any of the potential biomarkers for haemophilic arthropathy in the overall population. CS846 levels were significantly correlated with MRI scores in patients treated on demand.
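    As a minimal, illustrative sketch of the correlation analysis described above (not the study's own code), the following Python snippet computes Spearman rank correlations between biomarker levels and a per-patient MRI score; the file name, column names and the "regimen" label are assumptions.

        import pandas as pd
        from scipy.stats import spearmanr

        # Hypothetical per-patient table: biomarker levels plus the compatible additive
        # MRI score over the four index joints; all column names are illustrative.
        df = pd.read_csv("biomarkers_mri.csv")

        # Spearman rank correlation between each biomarker and the MRI score.
        for marker in ["ctx_i", "comp", "cs846", "timp1", "vegf", "mmp3", "mmp9"]:
            rho, p = spearmanr(df[marker], df["mri_score"], nan_policy="omit")
            print(f"{marker}: rho = {rho:.3f}, p = {p:.3f}")

        # Subgroup analysis, e.g. patients treated on demand.
        od = df[df["regimen"] == "on-demand"]
        rho, p = spearmanr(od["cs846"], od["mri_score"], nan_policy="omit")
        print(f"CS846, on-demand subgroup: rho = {rho:.3f}, p = {p:.3f}")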

    Does rapid HIV disease progression prior to combination antiretroviral therapy hinder optimal CD4+ T-cell recovery once HIV-1 suppression is achieved?

    Objective: This article compares trends in CD4+ T-cell recovery and the proportions achieving optimal restoration (≥500 cells/µl) after viral suppression following combination antiretroviral therapy (cART) initiation between rapid and nonrapid progressors. Methods: We included HIV-1 seroconverters achieving viral suppression within 6 months of cART. Rapid progressors were individuals with at least one CD4+ T-cell count below 200 cells/µl within 12 months of seroconversion and before cART. We used piecewise linear mixed models for CD4+ T-cell recovery and logistic regression for optimal restoration. Results: Of 4024 individuals, 294 (7.3%) were classified as rapid progressors. At the same CD4+ T-cell count at cART start (baseline), rapid progressors experienced faster CD4+ T-cell increases than nonrapid progressors in the first month [difference (95% confidence interval) in mean increase/month (square root scale): 1.82 (1.61; 2.04)], which reversed to slightly slower increases in months 1–18 [-0.05 (-0.06; -0.03)] and no significant differences in months 18–60 [-0.003 (-0.01; 0.01)]. The percentage achieving optimal restoration was significantly lower for rapid progressors than nonrapid progressors at months 12 (29.2 vs. 62.5%) and 36 (47.1 vs. 72.4%) but not at month 60 (70.4 vs. 71.8%). These differences disappeared after adjusting for baseline CD4+ T-cell count: odds ratio (95% confidence interval) 0.86 (0.61; 1.20), 0.90 (0.38; 2.17) and 1.56 (0.55; 4.46) at months 12, 36 and 60, respectively. Conclusion: Among people on suppressive antiretroviral therapy, rapid progressors experience faster initial increases in CD4+ T-cell counts than nonrapid progressors, but are less likely to achieve optimal restoration during the first 36 months after cART, mainly because of lower CD4+ T-cell counts at cART initiation.
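    As a hedged sketch of the logistic-regression step for optimal restoration (not the study's implementation), the snippet below assumes a hypothetical per-person table at a single time point; the file name, the optimal/rapid/baseline_cd4 columns and the use of statsmodels are illustrative choices.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        # Hypothetical data at a fixed time point after cART (e.g. month 12):
        # optimal = CD4+ count >= 500 cells/ul (0/1), rapid = rapid progressor
        # before cART (0/1), baseline_cd4 = CD4+ count at cART initiation.
        df = pd.read_csv("cd4_recovery_month12.csv")

        # Unadjusted odds of optimal restoration for rapid vs nonrapid progressors.
        unadjusted = smf.logit("optimal ~ rapid", data=df).fit()

        # Adjusted for CD4+ T-cell count at cART initiation, as in the abstract.
        adjusted = smf.logit("optimal ~ rapid + baseline_cd4", data=df).fit()

        or_rapid = np.exp(adjusted.params["rapid"])
        ci_low, ci_high = np.exp(adjusted.conf_int().loc["rapid"])
        print(f"adjusted OR {or_rapid:.2f} (95% CI {ci_low:.2f}; {ci_high:.2f})")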

    Effect and process evaluation of a kindergarten-based, family-involved cluster randomised controlled trial in six European countries on four- to six-year-old children's steps per day: The ToyBox-study

    Background: The ToyBox-intervention is a theory- and evidence-based intervention delivered in kindergartens to improve four- to six-year-old children's energy balance-related behaviours and prevent obesity. The current study aimed to (1) examine the effect of the ToyBox-intervention on increasing European four- to six-year-old children's steps per day, and (2) examine whether a higher process evaluation score from teachers and parents was related to a more favourable effect on steps per day. Methods: A sample of 2438 four- to six-year-old children (51.9% boys, mean age 4.75 ± 0.43 years) from six European countries (Belgium, Bulgaria, Germany, Greece, Poland and Spain) wore a motion sensor (pedometer or accelerometer) for a minimum of two weekdays and one weekend day at both baseline and follow-up to objectively measure their steps per day. Kindergarten teachers implemented the physical activity component of the ToyBox-intervention for 6 weeks in total, with a focus on (1) environmental changes in the classroom, (2) the child performing the actual behaviour and (3) classroom activities. Children's parents received newsletters, tip cards and posters. To assess intervention effects, multilevel repeated measures analyses were conducted for the total sample and for the six intervention countries separately. In addition, process evaluation questionnaires were used to calculate a total process evaluation score (with implementation and satisfaction as parts of the overall score) for teachers and parents, which was then linked with the physical activity outcomes. Results: No significant intervention effects on four- to six-year-old children's steps per weekday, steps per weekend day or steps per average day were found, either in the total sample or in the country-specific samples (all p>0.05). In general, the intervention effects on steps per day were least favourable in children whose teachers had a low process evaluation score and most favourable in children whose teachers had a high process evaluation score. No differences in intervention effects were found for low, medium or high parents' process evaluation scores. Conclusion: The physical activity component of the ToyBox-intervention had no overall effect on four- to six-year-old children's steps per day. However, the process evaluation scores showed that favourable effects on children's steps per day occurred when kindergarten teachers implemented the physical activity component as planned and were satisfied with it. Strategies to motivate, actively involve and engage the kindergarten teachers and parents/caregivers are needed to induce larger effects.
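    The snippet below is a simplified illustration, under assumed names, of how a multilevel repeated-measures analysis of steps per day can be set up; it uses a single random intercept per kindergarten and is not the ToyBox analysis itself.

        import pandas as pd
        import statsmodels.formula.api as smf

        # Hypothetical long-format data, one row per child per measurement wave:
        # steps, time (0 = baseline, 1 = follow-up), condition (0 = control,
        # 1 = intervention), kindergarten (cluster identifier).
        df = pd.read_csv("toybox_steps_long.csv")

        # Repeated-measures mixed model with a random intercept per kindergarten;
        # the intervention effect is read from the time x condition interaction.
        # (The trial's models may have included further levels, e.g. child.)
        model = smf.mixedlm("steps ~ time * condition", data=df, groups=df["kindergarten"])
        result = model.fit()
        print(result.summary())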

    Role of Basal Ganglia Circuits in Resisting Interference by Distracters: A swLORETA Study

    BACKGROUND: The selection of task-relevant information requires both the focalization of attention on the task and resistance to interference from irrelevant stimuli. Both mechanisms rely on a dorsal frontoparietal network, while focalization additionally involves a ventral frontoparietal network. The role of subcortical structures in attention is less clear, despite the fact that the striatum interacts significantly with the frontal cortex via frontostriatal loops. One means of investigating the basal ganglia's contributions to attention is to examine the features of P300 components (i.e. amplitude, latency, and generators) in patients with basal ganglia damage (such as in Parkinson's disease (PD), in which attention is often impaired). Three-stimulus oddball paradigms can be used to study distracter-elicited and target-elicited P300 subcomponents. METHODOLOGY/PRINCIPAL FINDINGS: In order to compare distracter- and target-elicited P300 components, high-density (128-channel) electroencephalograms were recorded during a three-stimulus visual oddball paradigm in 15 patients with early PD and 15 matched healthy controls. For each subject, the P300 sources were localized using standardized weighted low-resolution electromagnetic tomography (swLORETA). Comparative analyses (one-sample and two-sample t-tests) were performed using SPM5® software. The swLORETA analyses showed that PD patients displayed fewer dorsolateral prefrontal (DLPF) distracter-P300 generators, with no significant differences in target-elicited P300 sources; this suggests dysfunction of the DLPF cortex when the executive frontostriatal loop is disrupted by basal ganglia damage. CONCLUSIONS/SIGNIFICANCE: Our results suggest that the cortical frontoparietal attention networks (mainly the dorsal one) are modulated by the basal ganglia. Disruption of this network in PD impairs resistance to distracters, which results in attention disorders.
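    The group contrasts were computed voxel-wise in SPM5 on the swLORETA solutions; purely as a simplified illustration of such a contrast, the sketch below compares distracter-P300 current density in a single hypothetical DLPF region of interest between patients and controls with Welch's two-sample t-test (the file names and the region-of-interest summary are assumptions).

        import numpy as np
        from scipy.stats import ttest_ind

        # Hypothetical per-subject distracter-P300 current density averaged over a
        # dorsolateral prefrontal region of interest (one value per subject); the
        # study's actual comparison was voxel-wise in SPM5.
        pd_patients = np.loadtxt("pd_dlpf_distracter_p300.txt")   # 15 values
        controls = np.loadtxt("ctrl_dlpf_distracter_p300.txt")    # 15 values

        t, p = ttest_ind(pd_patients, controls, equal_var=False)  # Welch's t-test
        print(f"PD vs controls, DLPF distracter-P300: t = {t:.2f}, p = {p:.3f}")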

    Climate change and epilepsy: insights from clinical and basic science studies

    Climate change is with us. As professionals who place value on evidence-based practice, we cannot ignore it. The current pandemic of the novel coronavirus, SARS-CoV-2, has demonstrated how global crises can arise suddenly and have a significant impact on public health. Global warming, a chronic process punctuated by acute episodes of extreme weather events, is an insidious global health crisis needing at least as much attention. Many neurological diseases are complex chronic conditions influenced at many levels by changes in the environment. This review aimed to collate and evaluate reports from clinical and basic science about the relationship between climate change and epilepsy. The keywords climate change, seasonal variation, temperature, humidity, thermoregulation, biorhythm, gene, circadian rhythm, heat, and weather were used to search the published evidence. A number of climatic variables are associated with increased seizure frequency in people with epilepsy. A climate change-induced increase in seizure precipitants such as fevers, stress, and sleep deprivation (e.g. as a result of more frequent extreme weather events) or in vector-borne infections may trigger or exacerbate seizures, lead to deterioration of seizure control, and affect neurological, cerebrovascular, or cardiovascular comorbidities and the risk of sudden unexpected death in epilepsy. Risks are likely to be modified by many factors, ranging from individual genetic variation and temperature-dependent channel function to housing quality and global supply chains. According to the results of the limited number of experimental studies with animal models of seizures or epilepsy, different seizure types appear to have distinct susceptibility to seasonal influences. Increased body temperature, whether in the context of fever or not, has a critical role in seizure threshold and seizure-related brain damage. Links between climate change and epilepsy are likely to be multifactorial, complex, and often indirect, which makes predictions difficult. We need more data on possible climate-driven altered risks for seizures, epilepsy, and epileptogenesis, and on the underlying mechanisms at the systems, cellular, and molecular levels, to better understand the impact of climate change on epilepsy. Further focussed data would help us develop evidence-based mitigation methods to better protect people with epilepsy from the effects of climate change.

    Circulating microRNAs in sera correlate with soluble biomarkers of immune activation but do not predict mortality in ART-treated individuals with HIV-1 infection: A case-control study

    Introduction: The use of anti-retroviral therapy (ART) has dramatically reduced HIV-1 associated morbidity and mortality. However, HIV-1 infected individuals have increased rates of morbidity and mortality compared to the non-HIV-1 infected population, and this appears to be related to end-organ diseases collectively referred to as Serious Non-AIDS Events (SNAEs). Circulating miRNAs are reported as promising biomarkers for a number of human disease conditions, including those that constitute SNAEs. Our study sought to investigate the potential of selected miRNAs in predicting mortality in HIV-1 infected, ART-treated individuals. Materials and Methods: A set of miRNAs was chosen based on published associations with human disease conditions that constitute SNAEs. This case-control study compared 126 cases (individuals who died whilst on therapy) and 247 matched controls (individuals who remained alive). Cases and controls were ART-treated participants of two pivotal HIV-1 trials. The relative abundance of each miRNA in serum was measured by RT-qPCR. Associations with mortality (all-cause, cardiovascular and malignancy) were assessed by logistic regression analysis. Correlations between miRNAs and CD4+ T cell count, hs-CRP, IL-6 and D-dimer were also assessed. Results: None of the selected miRNAs was associated with all-cause, cardiovascular or malignancy mortality. The levels of three miRNAs (miR-21, miR-122 and miR-200a) correlated with IL-6, while miR-21 also correlated with D-dimer. Additionally, the abundance of miR-31, miR-150 and miR-223 correlated with baseline CD4+ T cell count, while the same three miRNAs plus miR-145 correlated with nadir CD4+ T cell count. Discussion: No associations with mortality were found for any circulating miRNA studied. These results cast doubt on the effectiveness of circulating miRNAs as early predictors of mortality or of the major underlying diseases that contribute to mortality in participants treated for HIV-1 infection.
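    As a rough sketch of the association and correlation analyses (not the study's pipeline), the snippet below assumes a hypothetical per-participant table; the column names, the unconditional logistic model and the Spearman correlation are illustrative choices.

        import pandas as pd
        import statsmodels.formula.api as smf
        from scipy.stats import spearmanr

        # Hypothetical data: case = died on therapy (1) vs matched control (0),
        # mir21 = relative miRNA abundance from RT-qPCR, il6 = IL-6 level.
        df = pd.read_csv("mirna_case_control.csv")

        # Association of a circulating miRNA with all-cause mortality (a matched
        # design could instead use conditional logistic regression; simplified here).
        fit = smf.logit("case ~ mir21", data=df).fit()
        print(fit.summary())

        # Correlation of miRNA abundance with a soluble marker of immune activation.
        rho, p = spearmanr(df["mir21"], df["il6"], nan_policy="omit")
        print(f"miR-21 vs IL-6: rho = {rho:.3f}, p = {p:.3f}")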

    The concordance of the limiting antigen and the Bio-Rad avidity assays in persons from Estonia infected mainly with HIV-1 CRF06_cpx

    BACKGROUND: Serological assays to determine HIV incidence have contributed to estimates of HIV incidence, monitoring of HIV spread, and evaluation of prevention strategies. Two frequently used incidence assays are the Sedia HIV-1 LAg-Avidity EIA (LAg) and the Bio-Rad avidity incidence (BRAI) assay, with mean durations of recent infection (MDRI) of 130 and 240 days for subtype B infections, respectively. Little is known about how these assays perform with recombinant HIV-1 strains. We evaluated the concordance of these assays in a population infected mainly with HIV-1 CRF06_cpx. MATERIAL/METHODS: Remnant serum samples (n = 288) collected from confirmed, newly diagnosed HIV-positive persons from Estonia in 2013 were tested. Demographic and clinical data were extracted from clinical databases. LAg was performed according to the manufacturer’s protocol, and BRAI testing was done using a validated protocol. Samples with LAg-pending or BRAI-invalid results were reclassified as recent if they were from persons with viral loads <1000 copies/mL or as long-term if the persons presented with AIDS. RESULTS: In total, 325 new HIV infections were diagnosed in Estonia in 2013. Of those, 276 persons were tested with both LAg and BRAI. Using assay results only, the recency rate was 44% by LAg and 70% by BRAI. The majority of samples (92%) recent by LAg were recent by BRAI. Similarly, 89% of samples long-term by BRAI were long-term by LAg. After clinical information was included in the analysis, the recency rate was 44% for LAg and 62% for BRAI. The majority of samples (86%) recent by LAg were recent by BRAI, and 91% of long-term infections by BRAI were long-term by LAg. CONCLUSIONS: Comparison of LAg and BRAI results in this mostly CRF06_cpx-infected population showed good concordance for incidence classification. Our finding of a higher recency rate with BRAI in this population is likely related to the longer MDRI of this assay.
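    A minimal sketch of the recency classification and concordance step follows; the column names and assay cutoffs are placeholders for illustration (the study used the manufacturer's protocol for LAg and a validated protocol for BRAI), and the handling of LAg-pending or BRAI-invalid flags is omitted.

        import pandas as pd

        # Hypothetical per-sample data: lag_odn (LAg normalized OD), brai_ai
        # (Bio-Rad avidity index, %), viral_load (copies/mL), aids (0/1).
        df = pd.read_csv("recency_samples.csv")

        LAG_RECENT_MAX_ODN = 1.5   # assumed LAg cutoff for "recent"
        BRAI_RECENT_MAX_AI = 30.0  # assumed BRAI cutoff for "recent"

        df["lag_recent"] = df["lag_odn"] <= LAG_RECENT_MAX_ODN
        df["brai_recent"] = df["brai_ai"] < BRAI_RECENT_MAX_AI

        # Clinical reclassification described in the abstract (for pending/invalid
        # results): recent if viral load < 1000 copies/mL, long-term if AIDS.

        print("LAg recency rate: ", df["lag_recent"].mean())
        print("BRAI recency rate:", df["brai_recent"].mean())
        print("Recent by LAg also recent by BRAI:",
              df.loc[df["lag_recent"], "brai_recent"].mean())
        print("Long-term by BRAI also long-term by LAg:",
              (~df.loc[~df["brai_recent"], "lag_recent"]).mean())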