Imputing HIV treatment start dates from routine laboratory data in South Africa: a validation study
Background: Poor clinical record keeping hinders health systems monitoring and patient care in many low-resource settings. We develop and validate a novel method to impute dates of antiretroviral treatment (ART) initiation from routine laboratory data in South Africa's public-sector HIV program. This method will enable monitoring of the national ART program using real-time laboratory data, avoiding the error potential of chart review. Methods: We developed an algorithm to impute ART start dates based on the date of a patient's "ART workup", i.e. the laboratory tests used to determine treatment readiness in national guidelines, plus the time from ART workup to initiation specified in clinical protocols (21 days). To validate the algorithm, we analyzed data from two large clinical HIV cohorts: the Hlabisa HIV Treatment and Care Programme in rural KwaZulu-Natal and the Right to Care (RTC) cohort in urban Gauteng. Both cohorts contain known ART initiation dates and laboratory results imported directly from the National Health Laboratory Service. We assessed the median time from ART workup to ART initiation and calculated the sensitivity (SE), specificity (SP), positive predictive value (PPV), and negative predictive value (NPV) of our imputed start date vs. the true start date within a 6-month window. Heterogeneity was assessed across individual clinics and over time. Results: We analyzed data from over 80,000 HIV-positive adults. Among patients who had a workup and initiated ART, median time to initiation was 16 days (IQR 7 to 31) in Hlabisa and 21 days (IQR 8 to 43) in RTC. Among patients with known ART start dates, SE of the imputed start date was 83% in Hlabisa and 88% in RTC, indicating that this method accurately predicts ART start dates for about 85% of all ART initiators. In Hlabisa, PPV was 95%, indicating that for patients with a lab workup, true start dates were predicted with high accuracy. SP (100%) and NPV (92%) were also very high.
Conclusions: Routine laboratory data can be used to infer ART initiation dates in South Africa's public sector. Where care is provided according to protocols, laboratory data can be used to monitor health system performance and to improve the accuracy and completeness of clinical records.
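The imputation rule described in the Methods is simple enough to sketch in code. The Python below is an illustrative reconstruction, not the authors' implementation; the function names and the symmetric reading of the 6-month validation window are assumptions:

```python
from datetime import date, timedelta

# Protocol-based delay from ART workup to initiation (per the abstract).
PROTOCOL_DELAY_DAYS = 21
# Roughly 6-month validation window (symmetric interpretation assumed here).
VALIDATION_WINDOW_DAYS = 183

def impute_start(workup_date: date) -> date:
    """Impute the ART initiation date as the workup date plus the protocol delay."""
    return workup_date + timedelta(days=PROTOCOL_DELAY_DAYS)

def is_match(imputed: date, true_start: date) -> bool:
    """Treat an imputed date as correct if it falls within the validation
    window around the true (chart-recorded) initiation date."""
    return abs((true_start - imputed).days) <= VALIDATION_WINDOW_DAYS
```

Sensitivity would then be the share of true initiators whose imputed date matches, and PPV the share of imputed dates that correspond to a true initiation.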
Clinical accuracy of instrument-based SARS-CoV-2 antigen diagnostic tests: a systematic review and meta-analysis
Background: During the COVID-19 pandemic, antigen diagnostic tests were frequently used for screening, triage, and diagnosis. Novel instrument-based antigen tests (iAg tests) hold the promise of outperforming their instrument-free, visually read counterparts. Here, we provide a systematic review and meta-analysis of the clinical accuracy of SARS-CoV-2 iAg tests. Methods: We systematically searched MEDLINE (via PubMed), Web of Science, medRxiv, and bioRxiv for articles published before November 7, 2022, evaluating the accuracy of iAg tests for SARS-CoV-2 detection. We performed a random-effects meta-analysis to estimate sensitivity and specificity and used the QUADAS-2 tool to assess study quality and risk of bias. Subgroup analyses were conducted based on Ct value range, conformity with the manufacturer's instructions for use (IFU), age, symptom presence and duration, and the variant of concern. Results: We screened the titles and abstracts of 20,431 articles and included 114 publications that fulfilled the inclusion criteria. Additionally, we incorporated three articles sourced from the FIND website, for a total of 117 studies encompassing 95,181 individuals and evaluating the clinical accuracy of 24 commercial COVID-19 iAg tests. The studies varied in risk of bias but showed high applicability. For the 24 iAg tests assessed in the 99 studies included in the meta-analysis, pooled sensitivity and specificity compared with molecular testing of a paired nasopharyngeal (NP) swab sample were 76.7% (95% CI 73.5 to 79.7) and 98.4% (95% CI 98.0 to 98.7), respectively. Sensitivity was higher in individuals with high viral load (99.6% [95% CI 96.8 to 100] at Ct ≤ 20) and within the first week of symptom onset (84.6% [95% CI 78.2 to 89.3]), but did not differ between tests conducted per the IFU and those conducted differently, or between point-of-care and lab-based testing. Conclusion: Overall, iAg tests have a high pooled specificity but only a moderate pooled sensitivity, according to our analysis.
Pooled sensitivity increases at lower Ct values (a proxy for higher viral load) and within the first week of symptom onset, enabling reliable identification of most COVID-19 cases and highlighting the importance of context in test selection. The study underscores the need for careful evaluation that considers the performance variations and operational features of iAg tests.
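As a rough illustration of the pooling step, the sketch below fits a univariate DerSimonian-Laird random-effects model to study-level proportions (e.g. sensitivities) on the logit scale. This is a simplification, and the function is hypothetical: diagnostic meta-analyses such as this one typically pool sensitivity and specificity jointly with a bivariate model.

```python
import math

def pooled_logit_random_effects(successes, totals):
    """DerSimonian-Laird random-effects pooling of proportions on the logit
    scale, returning the back-transformed pooled proportion."""
    # Per-study logit and within-study variance (0.5 continuity correction).
    y, v = [], []
    for s, n in zip(successes, totals):
        a, b = s + 0.5, n - s + 0.5
        y.append(math.log(a / b))
        v.append(1.0 / a + 1.0 / b)
    # Fixed-effect (inverse-variance) weights and weighted mean.
    w = [1.0 / vi for vi in v]
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    # DerSimonian-Laird between-study variance tau^2.
    q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, y))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)
    # Random-effects weights, pooled logit, and back-transform.
    w_re = [1.0 / (vi + tau2) for vi in v]
    pooled_logit = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
    return 1.0 / (1.0 + math.exp(-pooled_logit))
```

For example, pooling three studies that each detected 80 of 100 PCR-positive cases returns a pooled sensitivity of roughly 0.80.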
Clinical Outcomes of Computed Tomography-Based Volumetric Brachytherapy Planning for Cervical Cancer
Effect of Hydrocortisone on Mortality and Organ Support in Patients With Severe COVID-19: The REMAP-CAP COVID-19 Corticosteroid Domain Randomized Clinical Trial.
Importance: Evidence regarding corticosteroid use for severe coronavirus disease 2019 (COVID-19) is limited. Objective: To determine whether hydrocortisone improves outcome for patients with severe COVID-19. Design, Setting, and Participants: An ongoing adaptive platform trial testing multiple interventions within multiple therapeutic domains, for example, antiviral agents, corticosteroids, or immunoglobulin. Between March 9 and June 17, 2020, 614 adult patients with suspected or confirmed COVID-19 were enrolled and randomized within at least 1 domain following admission to an intensive care unit (ICU) for respiratory or cardiovascular organ support at 121 sites in 8 countries. Of these, 403 were randomized to open-label interventions within the corticosteroid domain. The domain was halted after results from another trial were released. Follow-up ended August 12, 2020. Interventions: The corticosteroid domain randomized participants to a fixed 7-day course of intravenous hydrocortisone (50 mg or 100 mg every 6 hours) (n = 143), a shock-dependent course (50 mg every 6 hours when shock was clinically evident) (n = 152), or no hydrocortisone (n = 108). Main Outcomes and Measures: The primary end point was organ support-free days (days alive and free of ICU-based respiratory or cardiovascular support) within 21 days, where patients who died were assigned -1 day. The primary analysis was a bayesian cumulative logistic model that included all patients enrolled with severe COVID-19, adjusting for age, sex, site, region, time, assignment to interventions within other domains, and domain and intervention eligibility. Superiority was defined as the posterior probability of an odds ratio greater than 1 (threshold for trial conclusion of superiority >99%). 
Results: After excluding 19 participants who withdrew consent, there were 384 patients (mean age, 60 years; 29% female) randomized to the fixed-dose (n = 137), shock-dependent (n = 146), and no (n = 101) hydrocortisone groups; 379 (99%) completed the study and were included in the analysis. The mean age for the 3 groups ranged between 59.5 and 60.4 years; most patients were male (range, 70.6%-71.5%); mean body mass index ranged between 29.7 and 30.9; and patients receiving mechanical ventilation ranged between 50.0% and 63.5%. For the fixed-dose, shock-dependent, and no hydrocortisone groups, respectively, the median organ support-free days were 0 (IQR, -1 to 15), 0 (IQR, -1 to 13), and 0 (IQR, -1 to 11) days (composed of 30%, 26%, and 33% mortality rates and 11.5, 9.5, and 6 median organ support-free days among survivors). The median adjusted odds ratio and bayesian probability of superiority were 1.43 (95% credible interval, 0.91-2.27) and 93% for fixed-dose hydrocortisone, respectively, and were 1.22 (95% credible interval, 0.76-1.94) and 80% for shock-dependent hydrocortisone compared with no hydrocortisone. Serious adverse events were reported in 4 (3%), 5 (3%), and 1 (1%) patients in the fixed-dose, shock-dependent, and no hydrocortisone groups, respectively. Conclusions and Relevance: Among patients with severe COVID-19, treatment with a 7-day fixed-dose course of hydrocortisone or shock-dependent dosing of hydrocortisone, compared with no hydrocortisone, resulted in 93% and 80% probabilities of superiority with regard to the odds of improvement in organ support-free days within 21 days. However, the trial was stopped early and no treatment strategy met prespecified criteria for statistical superiority, precluding definitive conclusions. Trial Registration: ClinicalTrials.gov Identifier: NCT02735707
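The primary endpoint's ordinal construction, in which death is ranked worse than any number of days on organ support, can be made concrete with a small sketch. This simplified function is an assumption about the bookkeeping described in the abstract, not the trial's analysis code:

```python
def organ_support_free_days(alive_at_day_21: bool, days_on_support: int) -> int:
    """Organ support-free days within 21 days: days alive and free of
    ICU-based respiratory or cardiovascular support, with death assigned
    -1 so it ranks below surviving on full support (0 days free)."""
    if not alive_at_day_21:
        return -1
    return 21 - days_on_support
```

A survivor on support for 5 of the 21 days scores 16, while any death scores -1; the bayesian cumulative logistic model then treats -1 as the worst ordinal category.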
Effect of angiotensin-converting enzyme inhibitor and angiotensin receptor blocker initiation on organ support-free days in patients hospitalized with COVID-19
IMPORTANCE Overactivation of the renin-angiotensin system (RAS) may contribute to poor clinical outcomes in patients with COVID-19.
OBJECTIVE To determine whether angiotensin-converting enzyme (ACE) inhibitor or angiotensin receptor blocker (ARB) initiation improves outcomes in patients hospitalized for COVID-19.
DESIGN, SETTING, AND PARTICIPANTS In an ongoing, adaptive platform randomized clinical trial, 721 critically ill and 58 non–critically ill hospitalized adults were randomized to receive an RAS inhibitor or control between March 16, 2021, and February 25, 2022, at 69 sites in 7 countries (final follow-up on June 1, 2022).
INTERVENTIONS Patients were randomized to receive open-label initiation of an ACE inhibitor (n = 257), ARB (n = 248), ARB in combination with DMX-200 (a chemokine receptor-2 inhibitor; n = 10), or no RAS inhibitor (control; n = 264) for up to 10 days.
MAIN OUTCOMES AND MEASURES The primary outcome was organ support–free days, a composite of hospital survival and days alive without cardiovascular or respiratory organ support through 21 days. The primary analysis was a bayesian cumulative logistic model. Odds ratios (ORs) greater than 1 represent improved outcomes.
RESULTS On February 25, 2022, enrollment was discontinued due to safety concerns. Among 679 critically ill patients with available primary outcome data, the median age was 56 years and 239 participants (35.2%) were women. Median (IQR) organ support–free days among critically ill patients was 10 (–1 to 16) in the ACE inhibitor group (n = 231), 8 (–1 to 17) in the ARB group (n = 217), and 12 (0 to 17) in the control group (n = 231) (median adjusted odds ratios of 0.77 [95% bayesian credible interval, 0.58-1.06] for improvement for ACE inhibitor and 0.76 [95% credible interval, 0.56-1.05] for ARB compared with control). The posterior probabilities that ACE inhibitors and ARBs worsened organ support–free days compared with control were 94.9% and 95.4%, respectively. Hospital survival occurred in 166 of 231 critically ill participants (71.9%) in the ACE inhibitor group, 152 of 217 (70.0%) in the ARB group, and 182 of 231 (78.8%) in the control group (posterior probabilities that ACE inhibitor and ARB worsened hospital survival compared with control were 95.3% and 98.1%, respectively).
CONCLUSIONS AND RELEVANCE In this trial, among critically ill adults with COVID-19, initiation of an ACE inhibitor or ARB did not improve, and likely worsened, clinical outcomes.
TRIAL REGISTRATION ClinicalTrials.gov Identifier: NCT02735707
Impact of indigenous land uses on tree structure, diversity, and species composition in Bosawás Biosphere Reserve, Nicaragua
Indigenous societies inhabiting tropical forests have long been credited with conserving biodiversity and natural resources through traditional patterns of subsistence. However, few studies have quantified the relative impact of these land-use patterns on the structure and diversity of forest species. This study compared the size, density, diversity, and species composition of trees across the three principal land-use zones (agriculture, hunting, and conservation) in an indigenous territory of Bosawás Biosphere Reserve. Trees with a minimum diameter at breast height (DBH) of 10 cm were identified and measured in 13 1-km transects distributed among the three zones. The first hypothesis was that tree size, density, and diversity are lower in the forest mosaic of the agricultural zone than in the forests of the hunting and conservation zones. This hypothesis was accepted only in the case of total DBH per transect, while mean height, mean DBH, individual tree density, stem density (some trees had multiple stems), and estimated diversity were not significantly different among the three zones. The second hypothesis, that tree species composition in the agricultural zone is distinct from that of the other two zones, was also accepted. However, composition also differed between the hunting and conservation zones and was not significantly related to distance from the nearest indigenous community.
Although all indigenous farming and nearly all forest product extraction occur in the agricultural zone, the mosaic of secondary and mature forests that constitutes the majority of land cover in this zone exhibited tree structure and diversity similar to those of the conservation zone, where human impact is minimal. In contrast, indigenous land uses did influence species composition. This last result emphasizes the importance of maintaining remnants of mature forest in the agricultural zone to serve as species refuges and of strengthening traditional norms that ensure forest protection in the hunting and conservation zones.
Cause-specific effects of radiotherapy and lymphadenectomy in stage I-II endometrial cancer: a population-based study.
BACKGROUND: Radiotherapy and lymphadenectomy have been associated with improved survival in population-based studies of endometrial cancer, which is in contrast with findings from randomized trials and meta-analyses. The primary study aim was to estimate the cause-specific effects of adjuvant radiotherapy and lymphadenectomy on competing causes of mortality. METHODS: We analyzed Surveillance, Epidemiology, and End Results (SEER) data from 1988 to 2006. The sample comprised 58,172 patients with stage I and II endometrial adenocarcinoma. Patients were risk stratified by stage, grade, and age. Cumulative incidences and cause-specific hazards of competing causes of mortality were estimated according to treatment. All statistical tests were two-sided. RESULTS: Pelvic radiotherapy was associated with statistically significantly increased endometrial cancer mortality (hazard ratio [HR] = 1.66; 95% confidence interval [CI] = 1.52 to 1.82) in all stage I and II patients and decreased noncancer mortality in intermediate- and high-risk stage I and II patients (HR = 0.82; 95% CI = 0.77 to 0.89). Lymphadenectomy was associated with increased endometrial cancer mortality in stage I patients (HR = 1.27; 95% CI = 1.16 to 1.39), decreased endometrial cancer mortality in stage II patients (HR = 0.61; 95% CI = 0.52 to 0.72), and decreased noncancer mortality in both stage I and II patients (HR = 0.84; 95% CI = 0.80 to 0.88). Effects of radiotherapy and lymphadenectomy on second cancer mortality varied according to risk strata. CONCLUSIONS: Radiotherapy and lymphadenectomy are associated with statistically significantly reduced noncancer mortality in stage I and II endometrial cancer. The improved overall survival associated with these treatments reported from SEER studies is largely attributable to their selective application in healthier patients rather than their effects on endometrial cancer.
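The competing-risks bookkeeping behind these estimates can be sketched in discrete time: the cumulative incidence of one cause accumulates its cause-specific hazard weighted by overall event-free survival, so it is smaller than the naive sum of hazards. The function below is illustrative only; the study's analysis uses the continuous-time analogue of this construction.

```python
def cumulative_incidence(h_cause, h_other):
    """Discrete-time cumulative incidence for a cause of interest in the
    presence of a competing cause: CIF(t) = sum over s <= t of
    h_cause(s) * S(s-1), where S is overall event-free survival."""
    surv = 1.0   # probability of being event-free entering each interval
    total = 0.0  # accumulated incidence of the cause of interest
    cif = []
    for hk, ho in zip(h_cause, h_other):
        total += hk * surv
        cif.append(total)
        surv *= (1.0 - hk - ho)  # both causes deplete the risk set
    return cif
```

With a hazard of 0.10 per interval for the cause of interest and no competing events, the cumulative incidence after two intervals is 0.19 rather than 0.20, because only patients still event-free remain at risk; a competing hazard depletes the risk set further and lowers the cause-specific cumulative incidence.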