
    Primary Care Management of Asthma Exacerbations or Attacks: Impact of the COVID-19 Pandemic

    The COVID-19 pandemic has brought a renewed focus on appropriate management of chronic respiratory conditions, with heightened awareness of respiratory symptoms and the need for differential diagnosis between an asthma attack and COVID-19 infection. Despite early concerns in the pandemic, most studies suggest that well-managed asthma is not a risk factor for more severe COVID-related outcomes, and that asthma may even have a protective effect. Advice on the treatment of asthma and asthma attacks has remained unchanged. This article describes some challenges faced in primary care asthma management in adults and teenagers, particularly their relevance during a pandemic, and provides practical advice on asthma attack recognition, classification, treatment and continuity of care. Acute attacks, characterised by increased symptoms and reduced lung function, are often referred to as exacerbations of asthma by doctors and nurses but are usually described by patients as asthma attacks. They carry a significant and underestimated morbidity and mortality burden. Many patients experiencing an asthma attack are assessed in primary care for treatment and continuing management; during a pandemic, this may require remote assessment by telephone and home monitoring devices, where available. Differentiation between an asthma attack and a COVID-19 infection requires a structured clinical assessment, taking account of previous medical and family history. Early separation into mild, moderate, severe or life-threatening attacks is helpful for continuing good management. Most attacks can be managed in primary care, but when an attack is severe or unresponsive to initial treatment, the patient should be appropriately managed until transfer to an acute care facility can be arranged. Good-quality care is important to prevent further attacks and must include a follow-up appointment in primary care, proactive regular dosing with daily controller therapy and an understanding of a patient's beliefs and perceptions about asthma to maximise future self-management.
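    As an illustration of the severity split the abstract describes, the sketch below encodes a simple triage rule on peak expiratory flow (PEF). It is not from the article: the thresholds are an assumption based on commonly cited BTS/SIGN guidance, and real assessment is multi-factorial (speech, pulse, oxygen saturation, consciousness), so treat this as a minimal sketch only.

```python
# Illustrative triage sketch: classify an asthma attack by peak expiratory
# flow (PEF) as a percentage of the patient's personal best.
# Thresholds are an assumption based on commonly cited BTS/SIGN guidance;
# real clinical assessment uses many more criteria than PEF alone.

def classify_attack(pef_percent_best: float) -> str:
    """Return a severity label for a given PEF as % of personal best."""
    if pef_percent_best < 33:
        return "life-threatening"   # PEF < 33% of best
    if pef_percent_best <= 50:
        return "severe"             # PEF 33-50% of best
    if pef_percent_best <= 75:
        return "moderate"           # PEF > 50-75% of best
    return "mild"                   # PEF > 75% of best

if __name__ == "__main__":
    for pef in (90, 65, 40, 25):
        print(f"PEF {pef}% of best -> {classify_attack(pef)}")
```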

    Annual direct medical cost of active systemic lupus erythematosus in five European countries.

    OBJECTIVES: To evaluate the annual direct medical cost of managing adult systemic lupus erythematosus (SLE) patients with active autoantibody-positive disease in Europe. METHODS: A 2-year, retrospective, multicentre, observational study was conducted in five countries (France, Germany, Italy, Spain and the UK). Data included patients' characteristics, disease activity and severity, flare assessments and health resource use (eg, laboratory tests, medications, specialist visits and hospitalisations). Costs were assessed from the public payers' perspective. Cost predictors were estimated by multivariate regression models. RESULTS: Thirty-one centres enrolled 427 consecutive eligible patients stratified equally by disease severity. At baseline, mean (SD) age was 44.5 (13.8) years, 90.5% were women and mean (SD) SLE duration was 10.7 (8.0) years. The SELENA-SLEDAI (11.2 vs 5.3) and SLICC/ACR index (1.0 vs 0.7) scores were higher in severe patients. Over the study period, patients experienced on average 1.02 (0.71) flares/year. The mean annual direct medical cost was higher in severe compared with non-severe patients (€4748 vs €2650, p<0.001). Medication costs were €2518 in severe versus €1251 in non-severe patients (p<0.001). Medications represented 53% and 47% of the total cost for severe and non-severe patients, respectively, primarily due to immunosuppressants and biologics. Flares, especially severe flares, were identified as the major cost predictor, with each flare increasing the annual total cost by about €1002 (p<0.001). CONCLUSIONS: The annual direct medical cost of SLE patients in Europe is related to disease severity and flares. Medical treatments were the main cost drivers. Severe flares and major organ involvement were identified as important cost predictors.
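    The cost-predictor analysis described here (multivariate regression with flares as the dominant driver) can be outlined in a few lines. The sketch below runs on synthetic data with effect sizes loosely echoing the abstract's figures; the variable set and the simple OLS specification are assumptions for illustration, not the study's actual model.

```python
# Minimal sketch of a cost-predictor regression on synthetic data.
# Severity and flare count as covariates, and the plain OLS form, are
# illustrative assumptions; the study's real specification is not shown.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 427  # same patient count as the study; the data here are synthetic

severe = rng.integers(0, 2, n)       # 1 = severe disease at baseline
flares = rng.poisson(1.02, n)        # ~1.02 flares/year, per the abstract
cost = 2650 + 2100 * severe + 1002 * flares + rng.normal(0, 500, n)

X = sm.add_constant(np.column_stack([severe, flares]))
fit = sm.OLS(cost, X).fit()
print(fit.params)  # intercept, severity effect, per-flare cost increment
```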

    The Origin and Initial Rise of Pelagic Cephalopods in the Ordovician

    Get PDF
    BACKGROUND: During the Ordovician, global diversity increased dramatically at the family, genus and species levels. The diversification is partially explained by increased nutrient and phytoplankton availability in the open water. Cephalopods are among the top predators of today's open oceans. Their Ordovician occurrences, diversity evolution and abundance patterns potentially provide information on the evolution of the pelagic food chain. METHODOLOGY/PRINCIPAL FINDINGS: We reconstructed the cephalopod departure from originally exclusively neritic habitats into the pelagic zone by compiling occurrence data in offshore paleoenvironments from the Paleobiology Database and from our own data, by evidence of functional morphology, and by the taphonomy of selected cephalopod faunas. The occurrence data show that cephalopod associations in offshore depositional settings and black shales are characterized by a specific composition, often dominated by orthocerids and lituitids. The siphuncle and conch form of these cephalopods indicate a dominant lifestyle as pelagic vertical migrants. The frequency distribution of conch sizes and the pattern of epibionts indicate an autochthonous origin of the majority of orthocerid and lituitid shells. The consistent concentration of these cephalopods in deep subtidal sediments, starting from the middle Tremadocian, indicates the occupation of the pelagic zone early in the Early Ordovician and a subsequent diversification that peaked during the Darriwilian. CONCLUSIONS/SIGNIFICANCE: The exploitation of the pelagic realm started synchronously in several independent invertebrate clades during the latest Cambrian to Middle Ordovician. The initial rise and diversification of pelagic cephalopods during the Early and Middle Ordovician indicate the establishment of a pelagic food chain sustainable enough to support a diverse fauna of large predators. The earliest pelagic cephalopods were slowly swimming vertical migrants. The appearance and early diversification of pelagic cephalopods are interpreted as a consequence of increased food availability in the open water since the latest Cambrian.
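    The occurrence compilation draws on the Paleobiology Database, which exposes a public data service. The sketch below shows one way to pull Ordovician cephalopod occurrences; the endpoint and parameters follow the public PBDB data service (v1.2) and should be checked against its documentation, and the offshore filtering step is an illustrative assumption rather than the authors' actual workflow.

```python
# Sketch: fetch Ordovician cephalopod occurrences from the Paleobiology
# Database data service (v1.2). Endpoint and parameters per the public
# PBDB docs; the environment filter below is an illustrative assumption.
import json
from urllib.request import urlopen

URL = ("https://paleobiodb.org/data1.2/occs/list.json"
       "?base_name=Cephalopoda&interval=Ordovician"
       "&show=env&vocab=pbdb&limit=500")  # limit kept small for a demo

with urlopen(URL) as resp:
    records = json.load(resp)["records"]

# Keep occurrences flagged with offshore-style environments (assumption:
# substring match on the 'environment' field approximates the paper's
# offshore depositional settings).
offshore = [r for r in records
            if "offshore" in r.get("environment", "").lower()]
print(len(records), "occurrences fetched;", len(offshore), "offshore")
```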

    Coordinated stasis or coincident relative stability?


    Salbutamol use in relation to maintenance bronchodilator efficacy in COPD: a prospective subgroup analysis of the EMAX trial

    Background: Short-acting β2-agonist (SABA) bronchodilators help alleviate symptoms in chronic obstructive pulmonary disease (COPD) and may be a useful marker of symptom severity. This analysis investigated whether SABA use impacts treatment differences between maintenance dual- and mono-bronchodilators in patients with COPD. Methods: The Early MAXimisation of bronchodilation for improving COPD stability (EMAX) trial randomised symptomatic patients with low exacerbation risk not receiving inhaled corticosteroids 1:1:1 to once-daily umeclidinium/vilanterol 62.5/25 μg, once-daily umeclidinium 62.5 μg or twice-daily salmeterol 50 μg for 24 weeks. Pre-specified subgroup analyses stratified patients by median baseline SABA use (low, <1.5 puffs/day; high, ≥1.5 puffs/day) to examine change from baseline in trough forced expiratory volume in 1 s (FEV1), change in symptoms (Transition Dyspnoea Index [TDI], Evaluating Respiratory Symptoms-COPD [E-RS]), daily SABA use and exacerbation risk. A post hoc analysis used fractional polynomial modelling with continuous transformations of baseline SABA use covariates. Results: At baseline, patients in the high SABA use subgroup (mean: 3.91 puffs/day, n = 1212) had more severe airflow limitation, were more symptomatic and had worse health status versus patients in the low SABA use subgroup (0.39 puffs/day, n = 1206). Patients treated with umeclidinium/vilanterol versus umeclidinium demonstrated statistically significant improvements in trough FEV1 at Week 24 in both SABA subgroups (59–74 mL; p < 0.001); however, only low SABA users demonstrated significant improvements in TDI (high: 0.27 [p = 0.241]; low: 0.49 [p = 0.025]) and E-RS (high: 0.48 [p = 0.138]; low: 0.60 [p = 0.034]) scores. By contrast, significant reductions in mean SABA puffs/day with umeclidinium/vilanterol versus umeclidinium were observed only in high SABA users (high: −0.56 [p < 0.001]; low: −0.10 [p = 0.132]). Similar findings were observed when comparing umeclidinium/vilanterol and salmeterol. Fractional polynomial modelling showed that baseline SABA use ≥4 puffs/day resulted in smaller incremental symptom improvements with umeclidinium/vilanterol versus umeclidinium compared with baseline SABA use <4 puffs/day. Conclusions: In high SABA users, there may be a smaller difference in treatment response between dual- and mono-bronchodilator therapy; the reasons for this require further investigation. SABA use may be a confounding factor in bronchodilator trials, and in high SABA users changes in SABA use may be considered a robust symptom outcome. Funding: GlaxoSmithKline (study number 201749 [NCT03034915]).
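    The pre-specified median split used in this analysis is straightforward to reproduce in outline. The sketch below, on synthetic data, stratifies patients by median baseline SABA use and compares the mean change in trough FEV1 between arms within each subgroup; all column names and the simple mean-difference contrast are assumptions for illustration, whereas the trial itself used mixed-model analyses.

```python
# Sketch of an EMAX-style subgroup split: stratify by median baseline
# SABA use, then compare mean FEV1 change between arms within subgroups.
# Synthetic data; column names and the contrast are assumptions.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
n = 2418  # total of the two subgroups reported in the abstract
df = pd.DataFrame({
    "arm": rng.choice(["UMEC/VI", "UMEC", "salmeterol"], n),
    "saba_puffs_day": rng.gamma(shape=1.2, scale=1.6, size=n),
    "fev1_change_ml": rng.normal(100, 150, n),
})

# Median split: low = below median, high = at or above median
median = df["saba_puffs_day"].median()
df["saba_group"] = np.where(df["saba_puffs_day"] < median, "low", "high")

# Treatment difference (UMEC/VI minus UMEC) within each SABA subgroup
means = df.groupby(["saba_group", "arm"])["fev1_change_ml"].mean()
for grp in ("low", "high"):
    diff = means.loc[(grp, "UMEC/VI")] - means.loc[(grp, "UMEC")]
    print(f"{grp} SABA use: UMEC/VI - UMEC = {diff:.0f} mL")
```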