Potential for rabies control through dog vaccination in wildlife-abundant communities of Tanzania
Canine vaccination has been successful in controlling rabies in diverse settings worldwide. However, concerns remain that coverage levels which have previously been sufficient might be insufficient in systems where transmission occurs both between and within populations of domestic dogs and other carnivores. To evaluate the effectiveness of vaccination targeted at domestic dogs when wildlife also contributes to transmission, we applied a next-generation matrix model based on contact tracing data from the Ngorongoro and Serengeti Districts in northwest Tanzania. We calculated corresponding values of R0 and determined, for policy purposes, the probabilities that various annual vaccination targets would control the disease, taking into account the empirical uncertainty in our field data. Transmission rate estimates and the corresponding probabilities of vaccination-based control indicate that rabies transmission in this region is driven by transmission within domestic dogs. Patterns of rabies transmission differ between the two districts, with wildlife playing a more important part in Ngorongoro and leading to higher recommended coverage levels in that district. Nonetheless, our findings indicate that an annual dog vaccination campaign achieving the WHO-recommended target of 70% coverage would control rabies in both districts with a high level of certainty. Our results support the feasibility of controlling rabies in Tanzania through dog vaccination.
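The next-generation matrix approach described above can be sketched in a few lines: for a two-host system, R0 is the spectral radius (dominant eigenvalue) of the matrix whose (i, j) entry is the expected number of secondary cases in host i caused by one case in host j. The matrix entries below are illustrative assumptions, not the Tanzanian field estimates, and the critical-coverage formula shown is the classical homogeneous-mixing threshold rather than the paper's district-specific calculation.

```python
import math

# Hypothetical next-generation matrix K for a two-host system.
# K[i][j] = expected secondary cases in host i caused by one case in host j.
# Indices: 0 = domestic dogs, 1 = wild carnivores. Values are illustrative.
K = [[1.10, 0.15],
     [0.20, 0.30]]

def dominant_eigenvalue_2x2(m):
    """R0 is the spectral radius of the next-generation matrix.
    For a 2x2 matrix this is (tr + sqrt(tr^2 - 4 det)) / 2."""
    tr = m[0][0] + m[1][1]
    det = m[0][0] * m[1][1] - m[0][1] * m[1][0]
    return (tr + math.sqrt(tr * tr - 4.0 * det)) / 2.0

r0 = dominant_eigenvalue_2x2(K)
# Classical critical vaccination threshold for a homogeneous campaign.
p_crit = 1.0 - 1.0 / r0
print(f"R0 = {r0:.3f}, critical coverage = {p_crit:.1%}")
```

Vaccinating only dogs scales down the dog row of K, so in practice the required coverage is found by asking what scaling drives the dominant eigenvalue below 1, which is why wildlife contribution raises the recommended target.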
Treatment utilization and outcomes in elderly patients with locally advanced esophageal carcinoma: A review of the National Cancer Database
For elderly patients with locally advanced esophageal cancer, therapeutic approaches and outcomes in a modern cohort are not well characterized. Patients ≥70 years old with clinical stage II and III esophageal cancer diagnosed between 1998 and 2012 were identified from the National Cancer Database and stratified based on treatment type. Variables associated with treatment utilization were evaluated using logistic regression and survival evaluated using Cox proportional hazards analysis. Propensity matching (1:1) was performed to help account for selection bias. A total of 21,593 patients were identified. Median and maximum ages were 77 and 90, respectively. Treatment included palliative therapy (24.3%), chemoradiation (37.1%), trimodality therapy (10.0%), esophagectomy alone (5.6%), or no therapy (12.9%). Age ≥80 (OR 0.73), female gender (OR 0.81), Charlson-Deyo comorbidity score ≥2 (OR 0.82), and high-volume centers (OR 0.83) were associated with a decreased likelihood of palliative therapy versus no treatment. Age ≥80 (OR 0.79) and Clinical Stage III (OR 0.33) were associated with a decreased likelihood, while adenocarcinoma histology (OR 1.33) and nonacademic cancer centers (OR 3.9), an increased likelihood of esophagectomy alone compared to definitive chemoradiation. Age ≥80 (OR 0.15), female gender (OR 0.80), and non-Caucasian race (OR 0.63) were associated with a decreased likelihood, while adenocarcinoma histology (OR 2.10) and high-volume centers (OR 2.34), an increased likelihood of trimodality therapy compared to definitive chemoradiation. Each treatment type demonstrated improved survival compared to no therapy: palliative treatment (HR 0.49) to trimodality therapy (HR 0.25) with significance between all groups. Any therapy, including palliative care, was associated with improved survival; however, subsets of elderly patients with locally advanced esophageal cancer are less likely to receive aggressive therapy. 
Care should be taken not to unnecessarily deprive these individuals of treatment that may improve survival.
Duration of adjuvant chemotherapy for stage III colon cancer
BACKGROUND
Since 2004, a regimen of 6 months of treatment with oxaliplatin plus a fluoropyrimidine has been standard adjuvant therapy in patients with stage III colon cancer. However, since oxaliplatin is associated with cumulative neurotoxicity, a shorter duration of therapy could spare toxic effects and health expenditures.
METHODS
We performed a prospective, preplanned, pooled analysis of six randomized, phase 3 trials that were conducted concurrently to evaluate the noninferiority of adjuvant therapy with either FOLFOX (fluorouracil, leucovorin, and oxaliplatin) or CAPOX (capecitabine and oxaliplatin) administered for 3 months, as compared with 6 months. The primary end point was the rate of disease-free survival at 3 years. Noninferiority of 3 months versus 6 months of therapy could be claimed if the upper limit of the two-sided 95% confidence interval of the hazard ratio did not exceed 1.12.
RESULTS
After 3263 events of disease recurrence or death had been reported in 12,834 patients, the noninferiority of 3 months of treatment versus 6 months was not confirmed in the overall study population (hazard ratio, 1.07; 95% confidence interval [CI], 1.00 to 1.15). Noninferiority of the shorter regimen was seen for CAPOX (hazard ratio, 0.95; 95% CI, 0.85 to 1.06) but not for FOLFOX (hazard ratio, 1.16; 95% CI, 1.06 to 1.26). In an exploratory analysis of the combined regimens, among the patients with T1, T2, or T3 and N1 cancers, 3 months of therapy was noninferior to 6 months, with a 3-year rate of disease-free survival of 83.1% and 83.3%, respectively (hazard ratio, 1.01; 95% CI, 0.90 to 1.12). Among patients with cancers that were classified as T4, N2, or both, the disease-free survival rate for a 6-month duration of therapy was superior to that for a 3-month duration (64.4% vs. 62.7%) for the combined treatments (hazard ratio, 1.12; 95% CI, 1.03 to 1.23; P=0.01 for superiority).
CONCLUSIONS
Among patients with stage III colon cancer receiving adjuvant therapy with FOLFOX or CAPOX, noninferiority of 3 months of therapy, as compared with 6 months, was not confirmed in the overall population. However, in patients treated with CAPOX, 3 months of therapy was as effective as 6 months, particularly in the lower-risk subgroup. (Funded by the National Cancer Institute and others.)
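The trial's decision rule is mechanical once the confidence intervals are in hand: 3 months is declared noninferior only if the upper limit of the two-sided 95% CI of the hazard ratio stays at or below the prespecified margin of 1.12. The sketch below applies that rule to the CI bounds reported in the abstract.

```python
# Noninferiority margin as stated in the methods: the upper limit of the
# two-sided 95% CI of the hazard ratio must not exceed 1.12.
NONINFERIORITY_MARGIN = 1.12

def is_noninferior(hr_upper_95ci, margin=NONINFERIORITY_MARGIN):
    """3 months is noninferior to 6 months if the upper confidence
    bound does not exceed the margin."""
    return hr_upper_95ci <= margin

# Upper 95% CI bounds taken from the reported results.
results = {
    "overall": 1.15,   # HR 1.07 (95% CI, 1.00 to 1.15)
    "CAPOX":   1.06,   # HR 0.95 (95% CI, 0.85 to 1.06)
    "FOLFOX":  1.26,   # HR 1.16 (95% CI, 1.06 to 1.26)
}

for group, upper in results.items():
    verdict = "noninferior" if is_noninferior(upper) else "not confirmed"
    print(f"{group}: upper 95% CI {upper} -> {verdict}")
```

This reproduces the abstract's conclusions: noninferiority holds for CAPOX but is not confirmed overall or for FOLFOX.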
Harnessing case isolation and ring vaccination to control Ebola.
As a devastating Ebola outbreak in West Africa continues, non-pharmaceutical control measures including contact tracing, quarantine, and case isolation are being implemented. In addition, public health agencies are scaling up efforts to test and deploy candidate vaccines. Given the experimental nature and limited initial supplies of vaccines, a mass vaccination campaign might not be feasible. However, ring vaccination of likely case contacts could provide an effective alternative for distributing the vaccine. To evaluate ring vaccination as a strategy for eliminating Ebola, we developed a pair approximation model of Ebola transmission, parameterized by confirmed incidence data from June 2014 to January 2015 in Liberia and Sierra Leone. Our results suggest that if a combined intervention of case isolation and ring vaccination had been initiated in the early fall of 2014, up to an additional 126 cases in Liberia and 560 cases in Sierra Leone could have been averted beyond case isolation alone. The marginal benefit of ring vaccination is predicted to be greatest in settings with more contacts per individual, greater clustering among individuals, low contact-tracing efficacy, or vaccines that confer post-exposure protection. In such settings, ring vaccination can avert up to an additional 8% of Ebola cases. Accordingly, ring vaccination is predicted to offer a moderately beneficial supplement to ongoing non-pharmaceutical Ebola control efforts.
Projecting hospital utilization during the COVID-19 outbreaks in the United States
Data deposition: The computational system is available on GitHub (https://github.com/affans/ncov2019odemodel).
In the wake of community coronavirus disease 2019 (COVID-19) transmission in the United States, there is a growing public health concern regarding the adequacy of resources to treat infected cases. Hospital beds, intensive care units (ICUs), and ventilators are vital for the treatment of patients with severe illness. To project the timing of the outbreak peak and the number of ICU beds required at peak, we simulated a COVID-19 outbreak parameterized with US population demographics. In scenario analyses, we varied the delay from symptom onset to self-isolation, the proportion of symptomatic individuals practicing self-isolation, and the basic reproduction number R0. Without self-isolation, when R0 = 2.5, treatment of critically ill individuals at the outbreak peak would require 3.8 times more ICU beds than exist in the United States. Self-isolation by 20% of cases 24 h after symptom onset would delay and flatten the outbreak trajectory, reducing the number of ICU beds needed at the peak by 48.4% (interquartile range 46.4-50.3%), although still exceeding existing capacity. When R0 = 2, twice as many ICU beds would be required at the peak of the outbreak in the absence of self-isolation. In this scenario, the proportional impact of self-isolation within 24 h on reducing the peak number of ICU beds is substantially higher, at 73.5% (interquartile range 71.4-75.3%). Our estimates underscore the inadequacy of critical care capacity to handle the burgeoning outbreak. Policies that encourage self-isolation, such as paid sick leave, may delay the epidemic peak, giving a window of time that could facilitate emergency mobilization to expand hospital capacity.
S.M.M. acknowledges support from the Canadian Institutes of Health Research (grant OV4-170643; Canadian 2019 Novel Coronavirus Rapid Research) and the Natural Sciences and Engineering Research Council of Canada. A.P.G. gratefully acknowledges funding from the NIH (grant UO1-GM087719), the Burnett and Stender families' endowment, the Notsew Orm Sands Foundation, NIH grant 1R01AI151176-01, and National Science Foundation grant RAPID-2027755. M.C.F. was supported by NIH grant K01 AI141576.
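The scenario analysis described above can be sketched with a minimal SEIR model in which self-isolation removes a fraction of infectiousness, and peak critical-care demand is approximated as a fixed fraction of infectious prevalence. All parameters below are illustrative assumptions, not the paper's calibrated values, and isolation is modeled as an instantaneous reduction in transmission rather than a 24-h delayed withdrawal.

```python
# Minimal SEIR sketch: vary R0 and the self-isolating fraction, then
# compare peak demand for critical care. Parameters are illustrative.

def peak_critical_fraction(r0, isolating=0.0, crit_frac=0.05,
                           incubation=5.2, infectious_period=4.6,
                           days=400, dt=0.1):
    """Forward-Euler SEIR on population fractions. Self-isolation is
    approximated as removing a fraction of all infectiousness. Returns
    the peak fraction of the population simultaneously critically ill
    (infectious prevalence * crit_frac)."""
    sigma = 1.0 / incubation          # E -> I rate
    gamma = 1.0 / infectious_period   # I -> R rate
    beta = r0 * gamma * (1.0 - isolating)
    s, e, i = 1.0 - 1e-6, 0.0, 1e-6
    peak = 0.0
    for _ in range(int(days / dt)):
        new_inf = beta * s * i
        ds = -new_inf
        de = new_inf - sigma * e
        di = sigma * e - gamma * i
        s += ds * dt
        e += de * dt
        i += di * dt
        peak = max(peak, i * crit_frac)
    return peak

base = peak_critical_fraction(2.5)
mitigated = peak_critical_fraction(2.5, isolating=0.2)
print(f"Peak critical demand reduced by {1 - mitigated / base:.1%} "
      f"with 20% self-isolation")
```

Even this toy version reproduces the qualitative finding: partial self-isolation flattens the trajectory and lowers peak ICU demand, and the relative effect of isolation grows as R0 falls.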
Expression Profiling of Non-Aflatoxigenic Aspergillus parasiticus Mutants Obtained by 5-Azacytosine Treatment or Serial Mycelial Transfer
Aflatoxins are carcinogenic secondary metabolites produced by the fungi Aspergillus flavus and Aspergillus parasiticus. Previous studies found that repeated serial mycelial transfer or treatment of A. parasiticus with 5-azacytidine produced colonies with a fluffy phenotype and inability to produce aflatoxins. To understand how these treatments affect expression of genes involved in aflatoxin production and development, we carried out expressed sequence tag (EST)-based microarray assays to identify genes in treated clones that are differentially expressed compared to the wild-type. Expression of 183 genes was significantly dysregulated. Of these, 38 had at least two-fold or lower expression compared to the untreated control and only two had two-fold or higher expression. The most frequent change was downregulation of genes predicted to encode membrane-bound proteins. Based on this result we hypothesize that the treatments cause changes in the structure of cellular and organelle membranes that prevent normal development and aflatoxin biosynthesis
CMB-S4 Science Book, First Edition
This book lays out the scientific goals to be addressed by the next-generation ground-based cosmic microwave background experiment, CMB-S4, envisioned to consist of dedicated telescopes at the South Pole, the high Chilean Atacama plateau, and possibly a northern hemisphere site, all equipped with new superconducting cameras. CMB-S4 will dramatically advance cosmological studies by crossing critical thresholds in the search for the B-mode polarization signature of primordial gravitational waves, in the determination of the number and masses of the neutrinos, in the search for evidence of new light relics, in constraining the nature of dark energy, and in testing general relativity on large scales.
Clinical performance of a multiparametric MRI-based post concussive syndrome index
Introduction: Diffusion tensor imaging (DTI) has revealed measurable changes in the brains of patients with persistent post-concussive syndrome (PCS). Because of inconsistent results in univariate DTI metrics among patients with mild traumatic brain injury (mTBI), there is currently no single objective and reliable MRI index for clinical decision-making in patients with PCS. Purpose: This study aimed to evaluate the performance of a newly developed PCS Index (PCSI), derived from machine learning of multiparametric magnetic resonance imaging (MRI) data, to classify and differentiate subjects with a history of mTBI and PCS from those without a history of mTBI. Materials and methods: Data were retrospectively extracted from 139 patients aged between 18 and 60 years with PCS who underwent MRI examinations at 2 weeks to 1 year post-mTBI, as well as from 336 subjects without a history of head trauma. The performance of the PCSI was assessed by comparing 69 patients with a clinical diagnosis of PCS with 264 control subjects. The PCSI values for patients with PCS were compared based on the mechanism of injury, time interval from injury to MRI examination, sex, history of prior concussion, loss of consciousness, and reported symptoms. Results: Injured patients had a mean PCSI value of 0.57, compared to the control group, which had a mean PCSI value of 0.12 (p = 8.42e-23), with an accuracy of 88%, sensitivity of 64%, and specificity of 95%. No statistically significant differences were found in the PCSI values when comparing the mechanism of injury, sex, or loss of consciousness. Conclusion: The PCSI for individuals aged between 18 and 60 years was able to accurately identify patients with post-concussive injuries from 2 weeks to 1 year post-mTBI and differentiate them from the controls. The results of this study suggest that multiparametric MRI-based PCSI has great potential as an objective clinical tool to support the diagnosis, treatment, and follow-up care of patients with post-concussive syndrome. Further research is required to investigate the replicability of this method using other types of clinical MRI scanners.
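The reported accuracy, sensitivity, and specificity all follow from thresholding a continuous index like the PCSI and tallying a confusion matrix. The threshold and scores below are made-up illustrations, not the study's data or its actual decision rule.

```python
# Classification metrics from a threshold on a continuous index.
# Scores, labels, and the 0.5 threshold are hypothetical.

def classification_metrics(scores, labels, threshold=0.5):
    """labels: 1 = PCS patient, 0 = control. A subject is flagged
    positive when its index exceeds the threshold."""
    tp = sum(1 for s, y in zip(scores, labels) if s > threshold and y == 1)
    tn = sum(1 for s, y in zip(scores, labels) if s <= threshold and y == 0)
    fp = sum(1 for s, y in zip(scores, labels) if s > threshold and y == 0)
    fn = sum(1 for s, y in zip(scores, labels) if s <= threshold and y == 1)
    return {
        "accuracy": (tp + tn) / len(labels),
        "sensitivity": tp / (tp + fn),   # true-positive rate among patients
        "specificity": tn / (tn + fp),   # true-negative rate among controls
    }

scores = [0.81, 0.62, 0.34, 0.57, 0.12, 0.08, 0.41, 0.22]
labels = [1,    1,    1,    1,    0,    0,    0,    0]
print(classification_metrics(scores, labels))
```

The study's combination of modest sensitivity (64%) with high specificity (95%) is typical of a threshold placed to keep false positives rare among controls, at the cost of missing some true patients.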
Detrimental Effects of Environmental Tobacco Smoke in Relation to Asthma Severity
Background: Environmental tobacco smoke (ETS) has adverse effects on the health of asthmatics; however, the harmful consequences of ETS in relation to asthma severity are unknown. Methods: In a multicenter study of severe asthma, we assessed the impact of ETS exposure on morbidity, health care utilization, and lung function, and on the activity of systemic superoxide dismutase (SOD), a potential oxidative target of ETS that is negatively associated with asthma severity. Findings: From 2002-2006, 654 asthmatics (366 non-severe, 288 severe) were enrolled, among whom 109 non-severe and 67 severe asthmatics were routinely exposed to ETS as ascertained by history and validated by urine cotinine levels. ETS exposure was associated with lower quality of life scores; greater rescue inhaler use; lower lung function; greater bronchodilator responsiveness; and greater risk for emergency room visits, hospitalization, and intensive care unit admission. ETS exposure was also associated with lower levels of serum SOD activity, particularly in asthmatic women of African heritage. Interpretation: ETS exposure of asthmatic individuals is associated with worse lung function, higher acuity of exacerbations, more health care utilization, and greater bronchial hyperreactivity. The association of diminished systemic SOD activity with ETS exposure provides for the first time a specific oxidant mechanism by which ETS may adversely affect patients with asthma. © 2011 Comhair et al.