Evaluation of Real Time PCR Detection of the 47 kDa Gene for Reliable Diagnosis of Pediatric Scrub Typhus
Diagnosis of scrub typhus is difficult because of its non-specific clinical features and a lack of reliable serological assays. This study assessed real-time PCR detection of the 47 kDa gene alongside routine IgM ELISA for confirmatory diagnosis of scrub typhus. This observational study was conducted between March 2021 and February 2023. Real-time PCR detection of the 47 kDa gene was used as a diagnostic assay, and confirmed scrub typhus cases were defined by IgM ELISA and/or real-time PCR positivity. A total of 105 children (0–18 years of age) were enrolled, 30 (28.57%) of whom were confirmed to have scrub typhus. The clinical symptoms of scrub typhus-positive children included fever, diarrhea, abdominal pain, breathing difficulties, lymphadenopathy, rash, headache, edema, seizure, and poor intake. Real-time PCR positivity correlated well with high-grade fever of 8.2 days' duration, the presence of eschar, and thrombocytopenia. Non-specific respiratory symptoms (cough and cold) were more severe in patients with scrub typhus. The real-time PCR test showed good specificity in discriminating scrub typhus from enteric fever, a common pediatric disease. Detection of the 47 kDa gene with real-time PCR has good sensitivity and specificity. The number of days between fever onset and sampling, as well as the administration of antibiotics before sampling, influenced the performance of real-time PCR.
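The performance figures reported above (sensitivity, specificity) are standard 2x2-table quantities. The sketch below shows how they are computed; the counts are purely hypothetical, since the abstract does not report the underlying confusion matrix.

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Compute test performance from a 2x2 confusion table:
    tp/fn are diseased subjects testing positive/negative,
    fp/tn are non-diseased subjects testing positive/negative."""
    return {
        "sensitivity": tp / (tp + fn),  # fraction of true cases detected
        "specificity": tn / (tn + fp),  # fraction of non-cases correctly negative
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }

# Hypothetical counts for illustration only (not from the study):
metrics = diagnostic_metrics(tp=25, fp=3, fn=5, tn=72)
print({k: round(v, 3) for k, v in metrics.items()})
```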
High rates of adherence and treatment success in a public and public-private HIV clinic in India: potential benefits of standardized national care delivery systems
<p>Abstract</p> <p>Background</p> <p>The massive scale-up of antiretroviral treatment (ART) access worldwide has brought tremendous benefit to populations affected by HIV/AIDS. Optimising HIV care in countries with diverse medical systems is critical; however, data on best practices for HIV healthcare delivery in resource-constrained settings are limited. This study aimed to understand patient characteristics and treatment outcomes in different HIV healthcare settings in Bangalore, India.</p> <p>Methods</p> <p>Participants from public, private and public-private HIV healthcare settings were recruited between 2007 and 2009 and were administered structured interviews by trained staff. Self-reported adherence over the past month was measured using a visual analogue scale, along with any history of treatment interruptions (defined as having missed medications for more than 48 hours in the past three months). In addition, CD4 count and viral load (VL) were measured; genotyping for drug resistance-associated mutations was performed on those in virological failure (VL > 1000 copies/ml).</p> <p>Results</p> <p>A total of 471 individuals were included in the analysis (263 from the public facility, 149 from the public-private facility and 59 from the private center). Private facility patients were more likely to be male, with higher education levels and incomes. Adherence of ≥ 95% was reported by more participants in the public and public-private groups than in the private group (public 97%; private 88%; public-private 93%, p < 0.05). Treatment interruptions were lowest among public participants (1%, 10%, 5% respectively, p < 0.001). Although longer clinic waiting times were experienced by more public participants (48%, compared to private 27%, public-private 19%, p < 0.001), adherence barriers were most common among private (31%) compared with public (10%) and public-private (17%, p < 0.001) participants. 
Viral load was detectable in 13% of public, 22% of private and 9% of public-private participants (p < 0.05), suggesting fewer treatment failures in public and public-private settings. Drug resistance mutations were found more frequently among private facility patients (20%) than among those from the public (9%) or public-private facility (8%, p < 0.05).</p> <p>Conclusions</p> <p>Adherence and treatment success were significantly higher among patients from public and public-private settings than among patients from private facilities. These results suggest a possible benefit of the standardized care delivery system established in public and public-private health facilities, where counselling by a multi-disciplinary team of workers is integral to the provision of ART. Strengthening and increasing public-private partnerships can enhance the success of national ART programs.</p>
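The three outcome definitions used in this study (adherence as ≥ 95% on the visual analogue scale, treatment interruption as more than 48 hours of missed medication in the past three months, and virological failure as VL > 1000 copies/ml) reduce to simple thresholds. A minimal sketch, with argument names of my own choosing rather than the study's instruments:

```python
def assess_participant(vas_adherence_pct, missed_hours_past_3mo, viral_load_copies_ml):
    """Classify one participant by the study's stated cut-offs.
    Argument names are illustrative, not taken from the study."""
    return {
        "adherent": vas_adherence_pct >= 95,                    # VAS, past month
        "treatment_interruption": missed_hours_past_3mo > 48,   # missed > 48 h in 3 months
        "virological_failure": viral_load_copies_ml > 1000,     # triggers genotyping
    }

print(assess_participant(96, 0, 400))
```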
The Global Alliance for Infections in Surgery: defining a model for antimicrobial stewardship - results from an international cross-sectional survey
Background: Antimicrobial Stewardship Programs (ASPs) have been promoted to optimize antimicrobial usage and patient outcomes, and to reduce the emergence of antimicrobial-resistant organisms. However, the best strategies for an ASP are not definitively established and are likely to vary based on local culture, policy, routine clinical practice, and the limited resources available in middle-income countries. The aim of this study was to evaluate the structures and resources of antimicrobial stewardship teams (ASTs) in surgical departments from different regions of the world. Methods: A cross-sectional web-based survey was conducted in 2016 among 173 physicians who participated in the AGORA (Antimicrobials: A Global Alliance for Optimizing their Rational Use in Intra-Abdominal Infections) project and 658 international experts in the fields of ASPs, infection control, and infections in surgery. Results: The response rate was 19.4%. One hundred fifty-six (98.7%) participants stated their hospital had a multidisciplinary AST. The median number of physicians working in the team was five (interquartile range 4-6). An infectious disease specialist, a microbiologist and an infection control specialist were present in 80.1%, 76.3%, and 67.9% of the ASTs, respectively. A surgeon was a member in 59.0% of cases and was significantly more likely to be present in university hospitals (89.5%, p < 0.05) than in community teaching (83.3%) and community hospitals (66.7%). Protocols for pre-operative prophylaxis and for antimicrobial treatment of surgical infections were implemented in 96.2% and 82.3% of the hospitals, respectively. The majority of the surgical departments implemented both persuasive and restrictive interventions (72.8%). The most common types of interventions in surgical departments were dissemination of educational materials (62.5%), expert approval (61.0%), audit and feedback (55.1%), educational outreach (53.7%), and compulsory order forms (51.5%). 
Conclusion: The survey showed a heterogeneous organization of ASPs worldwide, demonstrating the necessity of a multidisciplinary and collaborative approach in the battle against antimicrobial resistance in surgical infections, and the importance of educational efforts towards this goal.
Antimicrobials: a global alliance for optimizing their rational use in intra-abdominal infections (AGORA)
Intra-abdominal infections (IAIs) are an important cause of morbidity and are frequently associated with poor prognosis, particularly in high-risk patients. The cornerstones of the management of complicated IAIs are timely, effective source control and appropriate antimicrobial therapy. Empiric antimicrobial therapy is important in the management of intra-abdominal infections and must be broad enough to cover all likely organisms, because inappropriate initial antimicrobial therapy is associated with poor patient outcomes and the development of bacterial resistance. The overuse of antimicrobials is widely accepted as a major driver of some emerging infections (such as C. difficile infection), of the selection of resistant pathogens in individual patients, and of the continued development of antimicrobial resistance globally. The growing emergence of multi-drug-resistant organisms and the limited development of new agents to counteract them have caused an impending crisis with alarming implications, especially with regard to Gram-negative bacteria. An international task force from 79 different countries has joined this project by sharing a document on the rational use of antimicrobials for patients with IAIs. The project has been termed AGORA (Antimicrobials: A Global Alliance for Optimizing their Rational Use in Intra-Abdominal Infections). The authors hope that AGORA, involving many of the world's leading experts, can actively raise awareness among health workers and can improve prescribing behavior in the treatment of IAIs.
Burden of disease scenarios for 204 countries and territories, 2022–2050: a forecasting analysis for the Global Burden of Disease Study 2021
Background: Future trends in disease burden and drivers of health are of great interest to policy makers and the public at large. This information can be used for policy and long-term health investment, planning, and prioritisation. We have expanded and improved upon previous forecasts produced as part of the Global Burden of Diseases, Injuries, and Risk Factors Study (GBD) and provide a reference forecast (the most likely future), and alternative scenarios assessing disease burden trajectories if selected sets of risk factors were eliminated from current levels by 2050. Methods: Using forecasts of major drivers of health such as the Socio-demographic Index (SDI; a composite measure of lag-distributed income per capita, mean years of education, and total fertility rate under 25 years of age) and the full set of risk factor exposures captured by GBD, we provide cause-specific forecasts of mortality, years of life lost (YLLs), years lived with disability (YLDs), and disability-adjusted life-years (DALYs) by age and sex from 2022 to 2050 for 204 countries and territories, 21 GBD regions, seven super-regions, and the world. All analyses were done at the cause-specific level so that only risk factors deemed causal by the GBD comparative risk assessment influenced future trajectories of mortality for each disease. Cause-specific mortality was modelled using mixed-effects models with SDI and time as the main covariates, and the combined impact of causal risk factors as an offset in the model. At the all-cause mortality level, we captured unexplained variation by modelling residuals with an autoregressive integrated moving average model with drift attenuation. These all-cause forecasts constrained the cause-specific forecasts at successively deeper levels of the GBD cause hierarchy using cascading mortality models, thus ensuring a robust estimate of cause-specific mortality. 
For non-fatal measures (eg, low back pain), incidence and prevalence were forecasted from mixed-effects models with SDI as the main covariate, and YLDs were computed from the resulting prevalence forecasts and average disability weights from GBD. Alternative future scenarios were constructed by replacing appropriate reference trajectories for risk factors with hypothetical trajectories of gradual elimination of risk factor exposure from current levels to 2050. The scenarios were constructed from various sets of risk factors: environmental risks (Safer Environment scenario), risks associated with communicable, maternal, neonatal, and nutritional diseases (CMNNs; Improved Childhood Nutrition and Vaccination scenario), risks associated with major non-communicable diseases (NCDs; Improved Behavioural and Metabolic Risks scenario), and the combined effects of these three scenarios. Using the Shared Socioeconomic Pathway (SSP) climate scenarios, with SSP2-4.5 as the reference and SSP1-1.9 as an optimistic alternative in the Safer Environment scenario, we accounted for the impact of climate change on health by using the most recent Intergovernmental Panel on Climate Change temperature forecasts and published trajectories of ambient air pollution for the same two scenarios. Life expectancy and healthy life expectancy were computed using standard methods. The forecasting framework includes computing the age-sex-specific future population for each location and separately for each scenario. 95% uncertainty intervals (UIs) for each individual future estimate were derived from the 2·5th and 97·5th percentiles of distributions generated from propagating 500 draws through the multistage computational pipeline. Findings: In the reference scenario forecast, global and super-regional life expectancy increased from 2022 to 2050, but improvement was at a slower pace than in the three decades preceding the COVID-19 pandemic (beginning in 2020). 
Gains in future life expectancy were forecasted to be greatest in super-regions with comparatively low life expectancies (such as sub-Saharan Africa) compared with super-regions with higher life expectancies (such as the high-income super-region), leading to a trend towards convergence in life expectancy across locations between now and 2050. At the super-region level, forecasted healthy life expectancy patterns were similar to those of life expectancies. Forecasts for the reference scenario found that health will improve in the coming decades, with all-cause age-standardised DALY rates decreasing in every GBD super-region. The total DALY burden measured in counts, however, will increase in every super-region, largely a function of population ageing and growth. We also forecasted that both DALY counts and age-standardised DALY rates will continue to shift from CMNNs to NCDs, with the most pronounced shifts occurring in sub-Saharan Africa (60·1% [95% UI 56·8–63·1] of DALYs were from CMNNs in 2022 compared with 35·8% [31·0–45·0] in 2050) and south Asia (31·7% [29·2–34·1] to 15·5% [13·7–17·5]). This shift is reflected in the leading global causes of DALYs, with the top four causes in 2050 being ischaemic heart disease, stroke, diabetes, and chronic obstructive pulmonary disease, compared with 2022, with ischaemic heart disease, neonatal disorders, stroke, and lower respiratory infections at the top. The global proportion of DALYs due to YLDs likewise increased from 33·8% (27·4–40·3) to 41·1% (33·9–48·1) from 2022 to 2050, demonstrating an important shift in overall disease burden towards morbidity and away from premature death. The largest shift of this kind was forecasted for sub-Saharan Africa, from 20·1% (15·6–25·3) of DALYs due to YLDs in 2022 to 35·6% (26·5–43·0) in 2050. 
In the assessment of alternative future scenarios, the combined effects of the scenarios (Safer Environment, Improved Childhood Nutrition and Vaccination, and Improved Behavioural and Metabolic Risks scenarios) demonstrated an important decrease in the global burden of DALYs in 2050 of 15·4% (13·5–17·5) compared with the reference scenario, with decreases across super-regions ranging from 10·4% (9·7–11·3) in the high-income super-region to 23·9% (20·7–27·3) in north Africa and the Middle East. The Safer Environment scenario had its largest decrease in sub-Saharan Africa (5·2% [3·5–6·8]), the Improved Behavioural and Metabolic Risks scenario in north Africa and the Middle East (23·2% [20·2–26·5]), and the Improved Childhood Nutrition and Vaccination scenario in sub-Saharan Africa (2·0% [–0·6 to 3·6]). Interpretation: Globally, life expectancy and age-standardised disease burden were forecasted to improve between 2022 and 2050, with the majority of the burden continuing to shift from CMNNs to NCDs. That said, continued progress on reducing the CMNN disease burden will be dependent on maintaining investment in and policy emphasis on CMNN disease prevention and treatment. Mostly due to growth and ageing of populations, the number of deaths and DALYs due to all causes combined will generally increase. By constructing alternative future scenarios wherein certain risk exposures are eliminated by 2050, we have shown that opportunities exist to substantially improve health outcomes in the future through concerted efforts to prevent exposure to well established risk factors and to expand access to key health interventions.
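The methods describe 95% uncertainty intervals derived from the 2.5th and 97.5th percentiles of 500 draws propagated through the pipeline. A minimal sketch of that final summarisation step, using synthetic draws in place of real pipeline output:

```python
import numpy as np

# Synthetic stand-in for 500 draws of one forecast quantity
# (e.g. life expectancy for one location-year); not real GBD output.
rng = np.random.default_rng(seed=0)
draws = rng.normal(loc=70.0, scale=1.5, size=500)

point_estimate = draws.mean()
lower, upper = np.percentile(draws, [2.5, 97.5])  # 95% UI bounds

print(f"{point_estimate:.1f} (95% UI {lower:.1f}-{upper:.1f})")
```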
WHO global research priorities for antimicrobial resistance in human health
The WHO research agenda for antimicrobial resistance (AMR) in human health has identified 40 research priorities to be addressed by the year 2030. These priorities focus on bacterial and fungal pathogens of crucial importance in addressing AMR, including drug-resistant pathogens causing tuberculosis. The research priorities encompass the entire people-centred journey, covering prevention, diagnosis, and treatment of antimicrobial-resistant infections, in addition to addressing overarching knowledge gaps in AMR epidemiology, burden and drivers, policies and regulations, and awareness and education. The research priorities were identified through a multistage process, starting with a comprehensive scoping review of knowledge gaps, with expert inputs gathered through a survey and open call. The priority setting involved a rigorous modified Child Health and Nutrition Research Initiative approach, ensuring global representation and applicability of the findings. The ultimate goal of this research agenda is to encourage research and investment in the generation of evidence to better understand AMR dynamics and facilitate policy translation for reducing the burden and consequences of AMR.
Evaluation of WHO immunologic criteria for treatment failure: implications for detection of virologic failure, evolution of drug resistance and choice of second‐line therapy in India
Introduction: Routine HIV viral load (VL) testing is not available in India. We compared the test performance characteristics of immunologic failure (IF) against the gold standard of virologic failure (VF), examined the evolution of drug resistance among those who stayed on a failing regimen because they did not meet criteria for IF, and assessed the implications for second-line therapy. Methods: Participants on first-line highly active antiretroviral therapy (HAART) in Bangalore, India, were monitored for 24 months at six-month intervals, with CD4 count, VL and genotyping if VL > 1000 copies/ml. Standard WHO criteria were used to define IF; VF was defined as two consecutive VL > 1000 copies/ml or one VL > 10,000 copies/ml. Resistance was assessed using standard International AIDS Society-USA (IAS-USA) recommendations. Results: Of 522 participants (67.6% male; mean age 37.5 years; 85.1% on nevirapine-based and 40.4% on d4T-containing regimens), 57 (10.9%) had VF, 38 (7.3%) had IF and 13 (2.5%) had both VF and IF. The sensitivity of immunologic criteria to detect VF was 22.8%, specificity was 94.6% and the positive predictive value was 34.2%. Forty-four participants with VF only continued on their failing first-line regimen; by the end of the study period, 90.9% had M184V, 63.6% had thymidine analogue mutations (TAMs), 34.1% had resistance to tenofovir, and 63.6% had resistance to etravirine. Conclusions: WHO IF criteria have low sensitivity for detecting VF, and the presence of IF poorly predicts VF. Relying on CD4 counts leads to unnecessary switches to second-line HAART and continuation of failing regimens, jeopardizing future therapeutic options. Universal access to VL monitoring would avoid costly switches to second-line HAART and preserve future treatment options.
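The study's virologic-failure rule (two consecutive VL > 1000 copies/ml, or a single VL > 10,000 copies/ml) is mechanical enough to state in code. A minimal sketch of that definition applied to a participant's series of six-monthly measurements:

```python
def virologic_failure(vl_series):
    """VF per the study definition: any single VL > 10,000 copies/ml,
    or two consecutive measurements > 1000 copies/ml."""
    if any(vl > 10_000 for vl in vl_series):
        return True
    return any(a > 1000 and b > 1000
               for a, b in zip(vl_series, vl_series[1:]))

# Illustrative six-monthly VL series over 24 months of follow-up:
print(virologic_failure([400, 1200, 1500, 300]))  # two consecutive > 1000
print(virologic_failure([400, 1200, 300, 1100]))  # isolated elevations only
```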