
    Moderate Antiproteinuric Effect of Add-On Aldosterone Blockade with Eplerenone in Non-Diabetic Chronic Kidney Disease. A Randomized Cross-Over Study

    Reduction of proteinuria and blood pressure (BP) with blockers of the renin-angiotensin system (RAS) slows the progression of chronic kidney disease (CKD). The aldosterone antagonist spironolactone has an antiproteinuric effect, but its use is limited by side effects. The present study evaluated the short-term antiproteinuric effect and safety of the selective aldosterone antagonist eplerenone in non-diabetic CKD. Design: open randomized cross-over trial. Patients: forty patients with non-diabetic CKD and urinary albumin excretion greater than 300 mg/24 hours. Intervention: eight weeks of once-daily administration of add-on 25–50 mg eplerenone to stable standard antihypertensive treatment including RAS blockade. Outcome measures: 24-hour urinary albumin excretion, BP, p-potassium, and creatinine clearance. Results: mean urinary albumin excretion was 22% [CI: 14, 28] lower during treatment with eplerenone (P < 0.001). Mean systolic BP was 4 mmHg [CI: 2, 6] lower (P = 0.002), diastolic BP was 2 mmHg [CI: 0, 4] lower (P = 0.02), and creatinine clearance was 5% [CI: 2, 8] lower (P = 0.005) during eplerenone treatment. After correction for BP and creatinine clearance differences between the study periods, mean urinary albumin excretion was 14% [CI: 4, 24] lower during treatment (P = 0.008). Mean p-potassium was 0.1 mEq/L [CI: 0.1, 0.2] higher during eplerenone treatment (P < 0.001). Eplerenone was well tolerated, and no patients were withdrawn due to hyperkalaemia. Limitations: open label, no wash-out period, and a moderate sample size. Conclusions: in non-diabetic CKD patients, the addition of eplerenone to standard antihypertensive treatment including RAS blockade caused a moderate, BP-independent fall in albuminuria, a minor fall in creatinine clearance, and a 0.1 mEq/L increase in p-potassium.

    Minimal clinically important differences for patient-reported outcome measures of fatigue in patients with COPD after pulmonary rehabilitation

    Fatigue is a burdensome and prevalent symptom in patients with chronic obstructive pulmonary disease (COPD). Pulmonary rehabilitation (PR) improves fatigue; however, interpreting when such improvement is clinically relevant is challenging. Minimal clinically important differences (MCIDs) for instruments assessing fatigue are warranted to better tailor PR and guide clinical decisions. We estimated MCIDs for the functional assessment of chronic illness therapy-fatigue subscale (FACIT-FS), the modified FACIT-FS, and the checklist of individual strength-fatigue subscale (CIS-FS) in patients with COPD after PR.

    Poverty, dirt, infections and non-atopic wheezing in children from a Brazilian urban center

    BACKGROUND: The causation of asthma is poorly understood. Risk factors for atopic and non-atopic asthma may be different. This study aimed to analyze the associations between markers of poverty, dirt and infections and wheezing in atopic and non-atopic children. METHODS: 1445 children were recruited from a population-based cohort in Salvador, Brazil. Wheezing was assessed using the ISAAC questionnaire, and atopy was defined as allergen-specific IgE ≥ 0.70 kU/L. Relevant social factors, environmental exposures and serological markers for childhood infections were investigated as risk factors using multivariate multinomial logistic regression. RESULTS: Common risk factors for wheezing in atopic and non-atopic children, respectively, were parental asthma and respiratory infection in early childhood. No other factor was associated with wheezing in atopic children. Factors associated with wheezing in non-atopics were low maternal educational level (OR 1.49, 95% CI 0.98-2.38), low frequency of room cleaning (OR 2.49, 95% CI 1.27-4.90), presence of rodents in the house (OR 1.48, 95% CI 1.06-2.09), and day care attendance (OR 1.52, 95% CI 1.01-2.29). CONCLUSIONS: Non-atopic wheezing was associated with risk factors indicative of poverty, dirt and infections. Further research is required to more precisely define the mediating exposures and the mechanisms by which they may cause non-atopic wheeze.
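
    The odds ratios above come from a multivariate multinomial logistic regression with a three-level wheeze outcome. As a rough sketch of how such odds ratios and 95% confidence intervals can be obtained, assuming a hypothetical data frame whose outcome coding and exposure column names are illustrative (they are not the study's variables or code):

```python
# Illustrative sketch only, not the study's analysis code: a multinomial
# logistic model with a three-level outcome and binary exposures, returning
# odds ratios with approximate (Wald-type) 95% confidence intervals.
import numpy as np
import pandas as pd
import statsmodels.api as sm

def wheeze_odds_ratios(df: pd.DataFrame):
    # Hypothetical coding: 'wheeze_group' is 0 = no wheeze (reference),
    # 1 = atopic wheeze, 2 = non-atopic wheeze; exposures are 0/1 indicators.
    exposures = ["parental_asthma", "early_respiratory_infection",
                 "low_maternal_education", "rodents_in_house"]
    X = sm.add_constant(df[exposures])
    fit = sm.MNLogit(df["wheeze_group"], X).fit(disp=False)
    odds_ratios = np.exp(fit.params)               # one column per non-reference outcome level
    ci_lower = np.exp(fit.params - 1.96 * fit.bse)
    ci_upper = np.exp(fit.params + 1.96 * fit.bse)
    return odds_ratios, ci_lower, ci_upper
```

    Exponentiating each coefficient gives the odds ratio for that exposure in the corresponding wheeze category relative to no wheeze, which is the form of the estimates quoted above.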

    How a Diverse Research Ecosystem Has Generated New Rehabilitation Technologies: Review of NIDILRR’s Rehabilitation Engineering Research Centers

    Over 50 million United States citizens (1 in 6 people in the US) have a developmental, acquired, or degenerative disability. The average US citizen can expect to live 20% of his or her life with a disability. Rehabilitation technologies play a major role in improving the quality of life for people with a disability, yet widespread and highly challenging needs remain. Within the US, a major effort aimed at the creation and evaluation of rehabilitation technology has been the Rehabilitation Engineering Research Centers (RERCs) sponsored by the National Institute on Disability, Independent Living, and Rehabilitation Research. As envisioned at their conception by a panel of the National Academy of Sciences in 1970, these centers were intended to take a “total approach to rehabilitation”, combining medicine, engineering, and related science to improve the quality of life of individuals with a disability. Here, we review the scope, achievements, and ongoing projects of an unbiased sample of 19 currently active or recently terminated RERCs. Specifically, for each center, we briefly explain the needs it targets, summarize key historical advances, identify emerging innovations, and consider future directions. Our assessment from this review is that the RERC program indeed involves a multidisciplinary approach, with 36 professional fields involved, although 70% of research and development staff are in engineering fields, 23% in clinical fields, and only 7% in basic science fields; significantly, 11% of the professional staff have a disability related to their research. We observe that the RERC program has substantially diversified the scope of its work since the 1970s, addressing more types of disabilities using more technologies and, in particular, often now focusing on information technologies. RERC work also now often views users as integrated into an interdependent society through technologies that people both with and without disabilities co-use (such as the internet, wireless communication, and architecture). In addition, RERC research has evolved to view users as able to improve outcomes through learning, exercise, and plasticity (rather than as static), which can be optimally timed. We provide examples of rehabilitation technology innovation produced by the RERCs that illustrate this increasingly diversifying scope and evolving perspective. We conclude by discussing growth opportunities and possible future directions of the RERC program.

    Upper limb impairments associated with spasticity in neurological disorders

    Background: While upper-extremity movement in individuals with neurological disorders such as stroke and spinal cord injury (SCI) has been studied for many years, the effects of spasticity on arm movement have been poorly quantified. The present study is designed to characterize the nature of impaired arm movements associated with spasticity in these two clinical populations. By comparing impaired voluntary movements between these two groups, we will gain a greater understanding of the effects of the type of spasticity on these movements and, potentially, a better understanding of the underlying impairment mechanisms. Methods: We characterized the kinematics and kinetics of rapid arm movement in SCI and neurologically intact subjects and in both the paretic and non-paretic limbs of stroke subjects. The kinematics of rapid elbow extension over the entire range of motion were quantified by measuring the movement trajectory and its derivatives, i.e. movement velocity and acceleration. The kinetics were quantified by measuring maximum isometric voluntary contractions of elbow flexors and extensors. Movement smoothness was estimated using two different computational techniques. Results: Most kinematic, kinetic, and movement smoothness parameters changed significantly in paretic as compared to normal arms in stroke subjects (p < 0.003). Surprisingly, there were no significant differences in these parameters between SCI and stroke subjects, except for movement smoothness (p ≤ 0.02). Extension was significantly less smooth in the paretic compared to the non-paretic arm in the stroke group (p < 0.003), whereas it was within the normal range in the SCI group. There was also no significant difference in these parameters between the non-paretic arm in stroke subjects and the normal arm in healthy subjects. Conclusion: The findings suggest that although the cause and location of injury are different in spastic stroke and SCI subjects, the impairments in voluntary arm movement were similar in the two spastic groups. Our results also suggest that the non-paretic arm in stroke subjects was not distinguishable from normal and might therefore be used as an appropriate control for studying movement of the paretic arm.
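
    The abstract does not name the two computational techniques used to estimate movement smoothness. Purely as an illustration of the kind of metric often used for this purpose (and not necessarily one of the two applied in this study), a dimensionless normalized-jerk score can be computed from a sampled elbow-angle trajectory; lower values indicate smoother movement:

```python
# Hypothetical illustration of one common smoothness metric (normalized jerk);
# the study's own two techniques are not specified in the abstract.
import numpy as np

def normalized_jerk(angle: np.ndarray, dt: float) -> float:
    """Dimensionless jerk of a 1-D joint-angle trajectory sampled every dt seconds."""
    velocity = np.gradient(angle, dt)
    acceleration = np.gradient(velocity, dt)
    jerk = np.gradient(acceleration, dt)
    duration = dt * (len(angle) - 1)
    amplitude = abs(angle[-1] - angle[0])
    # Integrate squared jerk (rectangle rule) and scale by duration^5 / amplitude^2
    # so the result is unitless and comparable across movement sizes and speeds.
    return float(np.sum(jerk ** 2) * dt * duration ** 5 / amplitude ** 2)
```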

    Course and prognosis of recovery for chronic non-specific low back pain: design, therapy program and baseline data of a prospective cohort study

    Background: There has been increasing focus on factors predicting the development of chronic musculoskeletal disorders. For patients already experiencing chronic non-specific low back pain it is also relevant to investigate which prognostic factors predict recovery. We present the design of a cohort study that aims to determine the course and prognostic factors for recovery in patients with chronic non-specific low back pain. Methods/Design: All participating patients were recruited (Jan 2003-Dec 2008) from the same rehabilitation centre and were evaluated by means of (postal) questionnaires and physical examinations at baseline, during the 2-month therapy program, and at 5 and 12 months after the start of therapy. The therapy protocol at the rehabilitation centre used a bio-psychosocial approach to stimulate patients to adopt adequate (movement) behaviour aimed at physical and functional recovery. The program is part of regular care and consists of 16 sessions of 3 hours each over an 8-week period (48 hours in total), followed by a 3-month self-management program. The primary outcomes are low back pain intensity, disability, quality of life, the patient's global perceived effect of recovery, and participation in work. Baseline characteristics include information on socio-demographics, low back pain, employment status, and additional clinical items such as fatigue, duration of activities, and kinesiophobia (fear of movement). Prognostic variables are determined for recovery at short-term (5 months) and long-term (12 months) follow-up after the start of therapy. Discussion: In a routine clinical setting it is important to provide patients suffering from chronic non-specific low back pain with adequate information about the prognosis of their complaint.

    Chronic kidney disease in children: the global perspective

    In contrast to the increasing availability of information pertaining to the care of children with chronic kidney disease (CKD) from large-scale observational and interventional studies, epidemiological information on the incidence and prevalence of pediatric CKD is currently limited, imprecise, and flawed by methodological differences between the various data sources. There are distinct geographic differences in the reported causes of CKD in children, in part due to environmental, racial, genetic, and cultural (consanguinity) differences. However, a substantial percentage of children develop CKD early in life, with congenital renal disorders such as obstructive uropathy and aplasia/hypoplasia/dysplasia being responsible for almost one half of all cases. The most favored end-stage renal disease (ESRD) treatment modality in children is renal transplantation, but a lack of health care resources and high patient mortality in the developing world limits the global provision of renal replacement therapy (RRT) and influences patient prevalence. Additional efforts to define the epidemiology of pediatric CKD worldwide are necessary if a better understanding of the full extent of the problem, areas for study, and the potential impact of intervention is desired.

    The General Transcriptional Repressor Tup1 Is Required for Dimorphism and Virulence in a Fungal Plant Pathogen

    A critical step in the life cycle of many fungal pathogens is the transition between yeast-like growth and the formation of filamentous structures, a process known as dimorphism. This morphological shift, typically triggered by multiple environmental signals, is tightly controlled by complex genetic pathways to ensure successful pathogenic development. In animal pathogenic fungi, one of the best known regulators of dimorphism is the general transcriptional repressor, Tup1. However, the role of Tup1 in fungal dimorphism is completely unknown in plant pathogens. Here we show that Tup1 plays a key role in orchestrating the yeast to hypha transition in the maize pathogen Ustilago maydis. Deletion of the tup1 gene causes a drastic reduction in the mating and filamentation capacity of the fungus, in turn leading to a reduced virulence phenotype. In U. maydis, these processes are controlled by the a and b mating-type loci, whose expression depends on the Prf1 transcription factor. Interestingly, Δtup1 strains show a critical reduction in the expression of prf1 and that of Prf1 target genes at both loci. Moreover, we observed that Tup1 appears to regulate Prf1 activity by controlling the expression of the prf1 transcriptional activators, rop1 and hap2. Additionally, we describe a putative novel prf1 repressor, named Pac2, which seems to be an important target of Tup1 in the control of dimorphism and virulence. Furthermore, we show that Tup1 is required for full pathogenic development since tup1 deletion mutants are unable to complete the sexual cycle. Our findings establish Tup1 as a key factor coordinating dimorphism in the phytopathogen U. maydis and support a conserved role for Tup1 in the control of hypha-specific genes among animal and plant fungal pathogens.

    Schroth physiotherapeutic scoliosis-specific exercises for adolescent idiopathic scoliosis: how many patients require treatment to prevent one deterioration? – results from a randomized controlled trial - “SOSORT 2017 Award Winner”

    Background: Recent randomized controlled trials (RCTs) support using physiotherapeutic scoliosis-specific exercises (PSSE) for adolescents with idiopathic scoliosis (AIS). All RCTs reported statistically significant results favouring PSSE, but none reported on clinical significance. The number needed to treat (NNT) helps determine whether RCT results are clinically meaningful. The NNT is the number of patients that need to be treated to prevent one bad outcome in a given period; a low NNT suggests that a therapy has positive outcomes in most patients offered the therapy. The objective was to determine how many patients require Schroth PSSE added to standard care (observation or brace treatment) to prevent one progression (NNT) of the Largest Curve (LC) or Sum of Curves (SOC) beyond 5° and 10°, respectively, over a 6-month interval. Methods: This was a secondary analysis of an RCT. Fifty consecutive participants from a scoliosis clinic were randomized to the Schroth PSSE + standard of care group (n = 25) or the standard of care group (n = 25). We included males and females with AIS, age 10–18 years, all curve types, with curves 10°–45°, with or without brace, and all maturity levels. We excluded patients awaiting surgery, having had surgery, having completed brace treatment, and with other scoliosis diagnoses. The local ethics review board approved the study (Pro00011552). The Schroth intervention consisted of weekly 1-h supervised Schroth PSSE sessions and a daily home program delivered over six months in addition to the standard of care. A prescription algorithm was used to determine which exercises patients were to perform. Controls received only standard of care. Cobb angles were measured using a semi-automatic system from posterior-anterior standing radiographs at baseline and 6 months. We calculated the absolute risk reduction (ARR) and relative risk reduction (RRR). The NNT was calculated as NNT = 1/ARR, and the RRR as RRR = ARR/CER, where CER is the control group event rate. Patients with missing values (PSSE group, n = 2; controls, n = 4) were assumed to have had curve progression (worst-case scenario). Results: For LC, NNT = 3.6 (95% CI 2.0–28.2), and for SOC, NNT = 3.1 (95% CI 1.9–14.2). The corresponding ARR was 28% for LC and 32% for SOC. The RRR was 70% for LC and 73% for SOC. Patients with complete follow-up attended 85% of prescribed visits and completed 82.5% of the home program. Assuming zero compliance after dropout, 76% of visits were attended and 73% of the prescribed home exercises were completed. Conclusions: The short-term Schroth PSSE intervention added to standard care provided a large benefit compared to standard care alone. Four patients (for both LC and SOC) require treatment for the additional benefit of a 6-month Schroth intervention to be observed in at least one patient beyond the standard of care. Trial registration: NCT01610908, April 2, 201
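
    The headline numbers follow directly from the definitions quoted in the abstract (ARR = CER - EER, NNT = 1/ARR, RRR = ARR/CER, where CER and EER are the control and experimental event rates). A minimal worked sketch, in which the event rates are back-calculated from the published ARR and RRR rather than taken from the trial data:

```python
# Worked example of the formulas quoted in the abstract:
# ARR = CER - EER, NNT = 1 / ARR, RRR = ARR / CER.
# The event rates below are illustrative back-calculations from the reported
# ARR and RRR values; they are not taken from the trial dataset.

def nnt_summary(cer: float, eer: float) -> dict:
    """cer/eer: proportion of control/treated patients whose curve progressed."""
    arr = cer - eer                  # absolute risk reduction
    return {
        "ARR": round(arr, 2),
        "NNT": round(1.0 / arr, 1),  # patients treated to prevent one progression
        "RRR": round(arr / cer, 2),  # relative risk reduction
    }

# Largest Curve: reported ARR ~0.28 and RRR ~0.70 imply CER ~0.40, EER ~0.12.
print(nnt_summary(cer=0.40, eer=0.12))   # {'ARR': 0.28, 'NNT': 3.6, 'RRR': 0.7}
# Sum of Curves: reported ARR ~0.32 and RRR ~0.73 imply CER ~0.44, EER ~0.12.
print(nnt_summary(cer=0.44, eer=0.12))   # {'ARR': 0.32, 'NNT': 3.1, 'RRR': 0.73}
```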