62 research outputs found

    Medical and Paramedical Care of Patients With Cerebellar Ataxia During the COVID-19 Outbreak: Seven Practical Recommendations of the COVID 19 Cerebellum Task Force

    The severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2), the cause of the current pandemic coronavirus disease 2019 (COVID-19), primarily targets the respiratory system. Some patients also experience neurological signs and symptoms ranging from anosmia, ageusia, headache, nausea, and vomiting to confusion, encephalitis, and stroke. Approximately 36% of those with severe COVID-19 experience neurological complications. The virus may enter the central nervous system through the olfactory nerve in the nasal cavity and damage neurons in the brainstem nuclei involved in the regulation of respiration. Patients with cerebellar ataxia (CA) are particularly vulnerable to severe outcomes if they contract COVID-19 because of the complexity of their disease, the presence of comorbidities, and their use of immunosuppressive therapies. Most CA patients burdened by progressive neurologic deficits have substantially impaired mobility and other essential functions, for which they rely heavily on ambulatory services, including rehabilitation and psychosocial care. Cessation of these interventions because of isolation restrictions places the CA patient population at risk of further deterioration. This international panel of ataxia experts provides recommendations for neurologists caring for patients with CA, emphasizing a proactive approach designed to maintain their autonomy and well-being: continue long-term medications, promote rehabilitation efforts, use virtual visits for regular contact with healthcare providers, and pay attention to emotional and psychosocial health. Neurologists should play an active role in decision-making in those CA cases requiring escalation to intensive care and resuscitation. Multidisciplinary collaboration between care teams is always important, and never more so than in the context of the current pandemic.

    The PHR proteins: intracellular signaling hubs in neuronal development and axon degeneration


    Development and Validation of a Risk Score for Chronic Kidney Disease in HIV Infection Using Prospective Cohort Data from the D:A:D Study

    Ristola M. is a member of the following working groups: D:A:D Study Group; Royal Free Hospital Clinical Cohort; INSIGHT Study Group; SMART Study Group; ESPRIT Study Group.

    Background: Chronic kidney disease (CKD) is a major health issue for HIV-positive individuals, associated with increased morbidity and mortality. Development and implementation of a risk score model for CKD would allow comparison of the risks and benefits of adding potentially nephrotoxic antiretrovirals to a treatment regimen and would identify those at greatest risk of CKD. The aims of this study were to develop a simple, externally validated, and widely applicable long-term risk score model for CKD in HIV-positive individuals that can guide decision making in clinical practice.

    Methods and Findings: A total of 17,954 HIV-positive individuals from the Data Collection on Adverse Events of Anti-HIV Drugs (D:A:D) study with >=3 estimated glomerular filtration rate (eGFR) values after 1 January 2004 were included. Baseline was defined as the first eGFR >60 ml/min/1.73 m2 after 1 January 2004; individuals with exposure to tenofovir, atazanavir, atazanavir/ritonavir, lopinavir/ritonavir, or other boosted protease inhibitors before baseline were excluded. CKD was defined as confirmed (>3 mo apart) eGFR <=60 ml/min/1.73 m2. In the D:A:D study, 641 individuals developed CKD during 103,185 person-years of follow-up (PYFU; incidence 6.2/1,000 PYFU, 95% CI 5.7-6.7; median follow-up 6.1 y, range 0.3-9.1 y). Older age, intravenous drug use, hepatitis C coinfection, lower baseline eGFR, female gender, lower CD4 count nadir, hypertension, diabetes, and cardiovascular disease (CVD) predicted CKD. The adjusted incidence rate ratios of these nine categorical variables were scaled and summed to create the risk score. The median risk score at baseline was -2 (interquartile range -4 to 2). There was a 1:393 chance of developing CKD in the next 5 y in the low risk group (risk score <0); the risk was substantially higher in the medium (risk score 0-4) and high (risk score >=5; 505 events) risk groups. Number needed to harm (NNTH) at 5 y when starting unboosted atazanavir or lopinavir/ritonavir among those with a low risk score was 1,702 (95% CI 1,166-3,367); NNTH was 202 (95% CI 159-278) and 21 (95% CI 19-23), respectively, for those with a medium and high risk score. NNTH was 739 (95% CI 506-1,462), 88 (95% CI 69-121), and 9 (95% CI 8-10) for those with a low, medium, and high risk score, respectively, starting tenofovir, atazanavir/ritonavir, or another boosted protease inhibitor. The Royal Free Hospital Clinic Cohort included 2,548 individuals, of whom 94 (3.7%) developed CKD during 18,376 PYFU (median follow-up 7.4 y, range 0.3-12.7 y). Of 2,013 individuals included from the SMART/ESPRIT control arms, 32 (1.6%) developed CKD during 8,452 PYFU (median follow-up 4.1 y, range 0.6-8.1 y). External validation showed that the risk score predicted well in these cohorts. Limitations of this study included limited data on race and no information on proteinuria.

    Conclusions: Both traditional and HIV-related risk factors were predictive of CKD. These factors were used to develop an externally validated risk score for CKD in HIV infection that has direct clinical relevance, helping patients and clinicians to weigh the benefits of certain antiretrovirals against the risk of CKD and to identify those at greatest risk of CKD.

    Peer reviewed.
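    The mechanics of such a points-based score are easy to illustrate. The sketch below shows, in Python, how a score of this kind is summed over categorical predictors and mapped to risk groups, and how an NNTH follows from an absolute risk difference. All point values, cut-offs used inside the function, and risks in the demo are hypothetical placeholders chosen only to show the mechanics; they are not the published D:A:D coefficients, which should be taken from the paper itself.

    ```python
    # Minimal sketch of a points-based risk score, assuming HYPOTHETICAL
    # point values (the real D:A:D points are scaled adjusted incidence
    # rate ratios and must be read from the published paper).

    def ckd_risk_score(age, female, idu, hcv_coinfected, baseline_egfr,
                       cd4_nadir, hypertension, diabetes, cvd):
        """Sum integer points over the nine categorical predictors."""
        score = 0
        score += 4 if age >= 50 else 0             # older age (hypothetical)
        score += 1 if female else 0                # female gender
        score += 2 if idu else 0                   # intravenous drug use
        score += 1 if hcv_coinfected else 0        # hepatitis C coinfection
        score += -2 if baseline_egfr >= 90 else 2  # higher baseline eGFR is
                                                   # protective, which is how
                                                   # scores can be negative
        score += 2 if cd4_nadir < 200 else 0       # lower CD4 count nadir
        score += 1 if hypertension else 0
        score += 2 if diabetes else 0
        score += 1 if cvd else 0
        return score

    def risk_group(score):
        """Map a summed score onto low/medium/high groups."""
        if score < 0:
            return "low"
        if score <= 4:
            return "medium"
        return "high"

    def nnth(risk_with_drug, risk_without_drug):
        """Number needed to harm: 1 / absolute risk increase over 5 y."""
        return 1.0 / (risk_with_drug - risk_without_drug)

    if __name__ == "__main__":
        s = ckd_risk_score(age=55, female=False, idu=False,
                           hcv_coinfected=False, baseline_egfr=95,
                           cd4_nadir=350, hypertension=True,
                           diabetes=False, cvd=False)
        print(s, risk_group(s))           # e.g. 3 -> medium
        # Illustrative only: 0.6% 5-y risk on drug vs. 0.2% off drug
        print(round(nnth(0.006, 0.002)))  # -> 250 treated per extra CKD case
    ```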

    Misteri agama dan refleksi filsafat [The mystery of religion and philosophical reflection]

    232 pages; 20 cm

    Transcendence and Immanence as Theological Categories


    Marx's social critique of culture / Dupré


    Basaltic volcanism and mass extinction at the Permo-Triassic boundary: Environmental impact and modeling of the global carbon cycle

    The Siberian Traps represent one of the most voluminous continental flood basalt provinces on Earth. The mass extinction at the end of the Permian was the most severe in the history of life. In the present paper, these two major concurrent events are analysed, and a geochemical model coupled with an energy-balance model is used to calculate their environmental impact on atmospheric CO2, oceanic δ13C, and marine anoxia. The latitudinal temperature gradient is reduced relative to today, resulting in warmer temperatures at high latitudes. The warmer climate and the presence of fresh basaltic provinces increase the weatherability of the continental surfaces, resulting in enhanced consumption of atmospheric CO2 through weathering. First, the eruption of the Siberian Traps is accompanied by a massive volume of 13C-depleted CO2 degassed from the mantle and added to the ocean through silicate weathering, thus lowering marine δ13C. Second, the rapid collapse in productivity induces a strong decrease in global organic carbon burial. This too tends to increase the proportion of light carbon in the ocean. These two effects can explain the low δ13C values across the PT boundary, and methane release need not be invoked to explain the δ13C fluctuations. It is proposed that the phosphorus cycle, which drives primary production in the model, plays an important role in the recovery of productivity and the δ13C variations.
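    Both isotope effects follow from the standard carbon-isotope mass balance. The sketch below is a back-of-the-envelope illustration of that textbook reasoning, not the coupled geochemical/energy-balance model used in the paper; the reservoir size, isotopic compositions, and fractionation factor are rough generic assumptions chosen only to make the two mechanisms concrete.

    ```python
    # Back-of-the-envelope carbon-isotope mass balance. All constants are
    # ASSUMED textbook-style values for illustration, not the paper's model.

    OCEAN_DIC_GT = 38_000.0  # assumed ocean dissolved inorganic carbon, Gt C
    DELTA_OCEAN = 0.0        # assumed pre-event ocean d13C, permil
    DELTA_MANTLE = -5.0      # assumed d13C of mantle-derived CO2, permil
    EPSILON_ORG = 25.0       # assumed carbonate-organic fractionation, permil

    def mixing_delta(added_gt, delta_added=DELTA_MANTLE,
                     reservoir_gt=OCEAN_DIC_GT, delta_reservoir=DELTA_OCEAN):
        """Ocean d13C after adding isotopically light volcanic carbon."""
        total = reservoir_gt + added_gt
        return (reservoir_gt * delta_reservoir + added_gt * delta_added) / total

    def steady_state_carbonate_delta(f_org, delta_input=DELTA_MANTLE,
                                     epsilon=EPSILON_ORG):
        """Steady-state carbonate d13C = delta_input + f_org * epsilon,
        where f_org is the fraction of carbon buried as organic matter."""
        return delta_input + f_org * epsilon

    if __name__ == "__main__":
        # Effect 1: degassing of 13C-depleted mantle CO2 into the ocean
        print(mixing_delta(10_000))                # ~ -1.0 permil
        # Effect 2: productivity collapse cuts the organic burial fraction
        print(steady_state_carbonate_delta(0.30))  # +2.5 permil (pre-event)
        print(steady_state_carbonate_delta(0.10))  # -2.5 permil (post-collapse)
    ```

    Either lever alone moves marine δ13C toward lighter values, which is why, in the paper's argument, the two combined can account for the boundary excursion without invoking methane release.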

    Perifascial plane versus perineural approaches for ultrasound-guided axillary block: go to the simplest?

    Ultrasound-guided axillary block is widely used in daily practice for upper limb orthopedic surgery. A simple, safe, efficacious and time-saving technique is essential to optimize surgical turnover and costs. With this in mind, we compared a standardized perifascial technique with the selective perineural technique in a randomized, single-blinded study.

    Methods: Forty-two patients scheduled for elective hand surgery were randomly assigned to receive 20 mL of 10 mg/mL mepivacaine, either selectively around each of the radial, median, ulnar and musculocutaneous nerves (perineural group) or along the latissimus dorsi and superficial axillary fascia (perifascial group). The primary outcome was procedure performance time in both groups. Secondary outcomes were the number of needle passes, a per-procedure rating of ease of performance on a visual analogue scale from 0 to 10, the success rate, and the incidence of adverse events.

    Results: Performance time was significantly shorter in the perifascial group (3.6 vs. 6.5 min, P<0.001), with fewer needle passes (3 vs. 6, P<0.001) and an easier rated procedure (8.5 vs. 7.6, P=0.02). No vascular punctures or neurologic deficits were reported. Surgical anesthesia (95% in both groups) and complete anesthetic success (perifascial 81% vs. perineural 95%) were similar.

    Conclusions: The ultrasound-guided axillary perifascial block is easier to perform and saves procedural time compared with the classic perineural technique. Given the same anesthetic success rate in both groups, the perifascial plane technique should be considered for daily practice and as the first level of learning for the axillary block.