21 research outputs found

    Measuring routine childhood vaccination coverage in 204 countries and territories, 1980-2019: a systematic analysis for the Global Burden of Disease Study 2020, Release 1

    Background Measuring routine childhood vaccination is crucial to inform global vaccine policies and programme implementation, and to track progress towards targets set by the Global Vaccine Action Plan (GVAP) and Immunization Agenda 2030. Robust estimates of routine vaccine coverage are needed to identify past successes and persistent vulnerabilities. Drawing from the Global Burden of Diseases, Injuries, and Risk Factors Study (GBD) 2020, Release 1, we did a systematic analysis of global, regional, and national vaccine coverage trends using a statistical framework, by vaccine and over time. Methods For this analysis we collated 55 326 country-specific, cohort-specific, year-specific, vaccine-specific, and dose-specific observations of routine childhood vaccination coverage between 1980 and 2019. Using spatiotemporal Gaussian process regression, we produced location-specific and year-specific estimates of 11 routine childhood vaccine coverage indicators for 204 countries and territories from 1980 to 2019, adjusting for biases in country-reported data and reflecting reported stockouts and supply disruptions. We analysed global and regional trends in coverage and numbers of zero-dose children (defined as those who never received a diphtheria-tetanus-pertussis [DTP] vaccine dose), progress towards GVAP targets, and the relationship between vaccine coverage and sociodemographic development. Findings By 2019, global coverage of third-dose DTP (DTP3; 81.6% [95% uncertainty interval 80.4-82.7]) more than doubled from levels estimated in 1980 (39.9% [37.5-42.1]), as did global coverage of the first-dose measles-containing vaccine (MCV1; from 38.5% [35.4-41.3] in 1980 to 83.6% [82.3-84.8] in 2019). Third-dose polio vaccine (Pol3) coverage also increased, from 42.6% (41.4-44.1) in 1980 to 79.8% (78.4-81.1) in 2019, and global coverage of newer vaccines increased rapidly between 2000 and 2019.
The global number of zero-dose children fell by nearly 75% between 1980 and 2019, from 56.8 million (52.6-60.9) to 14.5 million (13.4-15.9). However, over the past decade, global vaccine coverage broadly plateaued; 94 countries and territories recorded decreasing DTP3 coverage since 2010. Only 11 countries and territories were estimated to have reached the national GVAP target of at least 90% coverage for all assessed vaccines in 2019. Interpretation After achieving large gains in childhood vaccine coverage worldwide, in much of the world this progress stalled or reversed from 2010 to 2019. These findings underscore the importance of revisiting routine immunisation strategies and programmatic approaches, recentring service delivery around equity and underserved populations. Strengthening vaccine data and monitoring systems is crucial to these pursuits, now and through to 2030, to ensure that all children have access to, and can benefit from, lifesaving vaccines. Copyright (C) 2021 The Author(s). Published by Elsevier Ltd. Peer reviewed

    Abstracts from the 3rd International Genomic Medicine Conference (3rd IGMC 2015)


    Effect of angiotensin-converting enzyme inhibitor and angiotensin receptor blocker initiation on organ support-free days in patients hospitalized with COVID-19

    IMPORTANCE Overactivation of the renin-angiotensin system (RAS) may contribute to poor clinical outcomes in patients with COVID-19. OBJECTIVE To determine whether angiotensin-converting enzyme (ACE) inhibitor or angiotensin receptor blocker (ARB) initiation improves outcomes in patients hospitalized for COVID-19. DESIGN, SETTING, AND PARTICIPANTS In an ongoing, adaptive platform randomized clinical trial, 721 critically ill and 58 non–critically ill hospitalized adults were randomized to receive an RAS inhibitor or control between March 16, 2021, and February 25, 2022, at 69 sites in 7 countries (final follow-up on June 1, 2022). INTERVENTIONS Patients were randomized to receive open-label initiation of an ACE inhibitor (n = 257), ARB (n = 248), ARB in combination with DMX-200 (a chemokine receptor-2 inhibitor; n = 10), or no RAS inhibitor (control; n = 264) for up to 10 days. MAIN OUTCOMES AND MEASURES The primary outcome was organ support–free days, a composite of hospital survival and days alive without cardiovascular or respiratory organ support through 21 days. The primary analysis was a bayesian cumulative logistic model. Odds ratios (ORs) greater than 1 represent improved outcomes. RESULTS On February 25, 2022, enrollment was discontinued due to safety concerns. Among 679 critically ill patients with available primary outcome data, the median age was 56 years and 239 participants (35.2%) were women. Median (IQR) organ support–free days among critically ill patients was 10 (–1 to 16) in the ACE inhibitor group (n = 231), 8 (–1 to 17) in the ARB group (n = 217), and 12 (0 to 17) in the control group (n = 231) (median adjusted odds ratios of 0.77 [95% bayesian credible interval, 0.58-1.06] for improvement for ACE inhibitor and 0.76 [95% credible interval, 0.56-1.05] for ARB compared with control). The posterior probabilities that ACE inhibitors and ARBs worsened organ support–free days compared with control were 94.9% and 95.4%, respectively.
Hospital survival occurred in 166 of 231 critically ill participants (71.9%) in the ACE inhibitor group, 152 of 217 (70.0%) in the ARB group, and 182 of 231 (78.8%) in the control group (posterior probabilities that ACE inhibitor and ARB worsened hospital survival compared with control were 95.3% and 98.1%, respectively). CONCLUSIONS AND RELEVANCE In this trial, among critically ill adults with COVID-19, initiation of an ACE inhibitor or ARB did not improve, and likely worsened, clinical outcomes. TRIAL REGISTRATION ClinicalTrials.gov Identifier: NCT0273570

    Analysis of long term meteorological trends in the middle and lower Indus Basin of Pakistan: a non-parametric statistical approach

    The Indus basin of Pakistan is vulnerable to climate change, which would directly affect the livelihoods of poor people engaged in irrigated agriculture. The situation could be worse in the middle and lower parts of this basin, which occupy 90% of the irrigated area. The objective of this research is to analyze the long term meteorological trends in the middle and lower parts of the Indus basin of Pakistan. We used monthly data from 1971 to 2010 and applied the non-parametric seasonal Kendall test for trend detection, in combination with the seasonal Kendall slope estimator to quantify the magnitude of trends. The meteorological parameters considered were mean maximum and mean minimum air temperature, and rainfall, from 12 meteorological stations located in the study region. We examined the reliability and spatial integrity of data by mass-curve analysis and spatial correlation matrices, respectively. Analysis was performed for four seasons (spring—March to May, summer—June to August, fall—September to November and winter—December to February). The results show that maximum temperature has average increasing trends of +0.16, +0.03, 0.0 and +0.04 °C/decade in the four seasons, respectively. Minimum temperature also shows increasing trends in the four seasons, of +0.29, +0.12, +0.36 and +0.36 °C/decade, respectively. On an annual basis, the increasing trend is more persistent in the minimum temperature than in the maximum temperature. Analysis of rainfall data showed no noteworthy trend during winter, fall, or on an annual basis. However, during the spring and summer seasons, the rainfall trends vary from -1.15 to +0.93 and -3.86 to +2.46 mm/decade, respectively. It is further revealed that rainfall trends during all seasons are statistically non-significant. Overall, the study area is under a significant warming trend with no changes in rainfall.
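The trend detection described above can be illustrated with a minimal sketch of the non-seasonal Mann-Kendall S statistic and Sen's slope estimator, the building blocks of the seasonal Kendall test the study applies. This is not the authors' code; the function names and the sample temperature series are illustrative.

```python
# Illustrative sketch (not the authors' code): Mann-Kendall S statistic and
# Sen's slope estimator for a single equally spaced series.
def mann_kendall_s(x):
    """S = number of concordant pairs minus number of discordant pairs.
    A large positive S indicates an increasing trend, negative a decreasing one."""
    n = len(x)
    return sum((x[j] > x[i]) - (x[j] < x[i])
               for i in range(n - 1) for j in range(i + 1, n))

def sens_slope(x):
    """Sen's slope: median of all pairwise slopes (units per time step)."""
    slopes = sorted((x[j] - x[i]) / (j - i)
                    for i in range(len(x) - 1) for j in range(i + 1, len(x)))
    m = len(slopes)
    return slopes[m // 2] if m % 2 else 0.5 * (slopes[m // 2 - 1] + slopes[m // 2])

# Hypothetical seasonal mean maximum temperatures (°C), one value per year
series = [10.1, 10.3, 10.2, 10.6, 10.5, 10.9]
print(mann_kendall_s(series))  # positive S suggests an increasing trend
print(sens_slope(series))      # magnitude of that trend in °C per year
```

In the seasonal variant, S and the pairwise slopes are computed within each season separately and then combined, so that seasonal cycles do not masquerade as trends.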

    Optimal frequency of scans for patients on cancer therapies: A population kinetics assessment

    Abstract Background Optimal frequency of follow‐up scans for patients receiving systemic therapies is poorly defined. Progression‐free survival (PFS) generally follows first‐order kinetics. We used exponential decay nonlinear regression analysis to calculate half‐lives for 887 published PFS curves. Method We used the Excel formula x = EXP(‐tn*0.693/t1/2) to calculate the proportion of residual patients remaining progression‐free at different times, where tn is the interval in weeks between scans (eg, 6 weeks), * indicates multiplication, 0.693 is the natural logarithm of 2, and t1/2 is the PFS half‐life in weeks. Results The proportion of residual patients predicted to remain progression‐free at each subsequent scan varied with scan intervals and regimen PFS half‐life. For example, with a 4‐month half‐life (17.3 weeks) and scans every 6 weeks, 21% of patients would progress by the first scan, 21% of the remaining patients would progress by the second scan at 12 weeks, etc. With 2‐, 6‐ and 12‐month half‐lives (for example), the proportion of remaining patients progressing at each subsequent scan if repeated every 3 weeks would be 21%, 8% and 4%, respectively, while with scans every 12 weeks it would be 62%, 27% and 15%, respectively. Furthermore, optimal scan frequency can be calculated for populations composed of distinct rapidly and slowly progressing subpopulations, as well as with convex curves arising from treatment breaks, where optimal scan frequency may differ during therapy administration vs during more rapid progression after therapy interruption. Conclusions A population kinetics approach permits a regimen‐ and tumor‐specific determination of optimal scan frequency for patients on systemic therapies.
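The abstract's Excel formula can be reproduced outside a spreadsheet; this short sketch applies the same exponential-decay relation to the worked example above (the function name is ours, not the authors').

```python
import math

# Sketch of the abstract's relation x = EXP(-tn * 0.693 / t1/2): the fraction
# of currently progression-free patients expected to still be progression-free
# after one scan interval, under first-order PFS kinetics.
def remaining_fraction(interval_weeks, half_life_weeks):
    return math.exp(-interval_weeks * math.log(2) / half_life_weeks)

# Example from the abstract: 4-month half-life (17.3 weeks), scans every 6 weeks
progressing = 1 - remaining_fraction(6, 17.3)
print(f"{progressing:.0%}")  # about 21% progress between consecutive scans
```

Because the model is memoryless, the same 21% of the *remaining* patients is expected to progress in each successive 6-week interval, which is why the abstract's per-scan proportions are constant for a given half-life.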

    INTERNATIONAL JOURNAL OF AGRICULTURE & BIOLOGY Full Length Article Evaluation of a Saponin Adjuvanted Inactivated Mycoplasma bovis (A Field Isolate from Cattle Lungs in Balochistan, Pakistan) Vaccine

    Abstract An inactivated, saponin-adjuvanted vaccine was prepared from a local field isolate of Mycoplasma bovis and its efficacy was evaluated. Nine calves (n=9) were divided into three groups of three. Calves in group A were vaccinated with the saponin-adjuvanted inactivated M. bovis vaccine and challenged with a live M. bovis strain by nasal spray on day 21 post-vaccination. Calves in group B were challenged only with live M. bovis culture through the same route, and group C was kept as a control. All groups of calves were monitored for 7 weeks. The antibody profiles of all vaccinated and challenged animals were assessed by the IHA test. The saponin-adjuvanted inactivated M. bovis vaccine, at a protein concentration of 2 mg/mL, was found to be very effective: no pathological lesions, mortality, or other clinical manifestations were observed in the vaccinated group of calves. Overall, the immune status achieved with this vaccine was satisfactory, with a geometric mean titre (GMT) of 40.3 at 3 weeks post-vaccination; the titre rose to a GMT of 80.6 after four weeks and was maintained at a GMT of 64.0 on day 49, at the end of the experiment.

    A Randomized Trial Comparing 3- versus 4-Monthly Cardiac Monitoring in Patients Receiving Trastuzumab-Based Chemotherapy for Early Breast Cancer

    Purpose: The optimal frequency for cardiac monitoring of left ventricular ejection fraction (LVEF) in patients receiving trastuzumab-based therapy for early breast cancer (EBC) is unknown. We conducted a randomized controlled trial comparing 3- versus 4-monthly cardiac monitoring. Patients and Methods: Patients scheduled to receive trastuzumab-containing cancer therapy for EBC with normal (>53%) baseline LVEF were randomized to undergo LVEF assessments every 3 or 4 months. The primary outcome was the change in LVEF from baseline. Secondary outcomes included the rate of cardiac dysfunction (defined as a decrease in the LVEF of ≥10 percentage points, to a value <53%), delays in or discontinuation of trastuzumab therapy, and cardiology referral. Results: Of the 200 eligible and enrolled patients, 100 (50%) were randomized to 3-monthly and 100 (50%) to 4-monthly cardiac monitoring. Of these patients, 98 and 97 respectively underwent at least one cardiac scan. The estimated mean difference in LVEF from baseline was −0.94% (one-sided 95% lower bound: −2.14), which did not cross the pre-defined non-inferiority margin of −4%. There were also no significant differences between the two study arms for any of the secondary endpoints. The rate of detection of cardiac dysfunction was 16.3% (16/98) and 12.4% (12/97) in the 3- and 4-monthly arms, respectively (difference 4.0 percentage points; 95% CI, −5.9 to 13.8). Conclusions: Cardiac monitoring every 4 months was deemed non-inferior to monitoring every 3 months in patients with HER2-positive EBC being treated with trastuzumab-based therapy. Given its costs and inconvenience, cardiac monitoring every 4 months should be considered standard practice. Registration: NCT02696707, 18 February 2016