52 research outputs found
From theory to practice: improving the impact of health services research
BACKGROUND: While significant strides have been made in health research, the incorporation of research evidence into healthcare decision-making has been marginal. The purpose of this paper is to provide an overview of how the utility of health services research can be improved through the use of theory. Integrating theory into health services research can improve research methodology and encourage stronger collaboration with decision-makers. DISCUSSION: Recognizing the importance of theory calls for new expectations in the practice of health services research. These include: the formation of interdisciplinary research teams; broadening the training of those who will practice health services research; and supportive organizational conditions that promote collaboration between researchers and decision-makers. Further, funding bodies can play a significant role in guiding and supporting the use of theory in the practice of health services research. SUMMARY: Institutions and researchers should incorporate the use of theory if health services research is to fulfill its potential for improving the delivery of health care.
Surgical perspectives from a prospective, nonrandomized, multicenter study of breast conserving surgery and adjuvant electronic brachytherapy for the treatment of breast cancer
Background: Accelerated partial breast irradiation (APBI) may be used to deliver radiation to the tumor bed post-lumpectomy in eligible patients with breast cancer. Patient and tumor characteristics as well as the lumpectomy technique can influence patient eligibility for APBI. This report describes a lumpectomy procedure and examines patient, tumor, and surgical characteristics from a prospective, multicenter study of electronic brachytherapy. Methods: The study enrolled 65 patients aged 45-84 years with ductal carcinoma or ductal carcinoma in situ, and 44 patients who met the inclusion and exclusion criteria were treated with APBI using the Axxent® electronic brachytherapy system following lumpectomy. The prescription dose was 34 Gy in 10 fractions over 5 days. Results: The lumpectomy technique as described herein varied by site and patient characteristics. The balloon applicator was implanted by the surgeon (91%) or a radiation oncologist (9%) during or up to 61 days post-lumpectomy (mean 22 days). A lateral approach was most commonly used (59%) for insertion of the applicator, followed by an incision site approach in 27% of cases, a medial approach in 5%, and an inferior approach in 7%. A trocar was used during applicator insertion in 27% of cases. Local anesthetic, sedation, both, or neither were administered in 45%, 2%, 41%, and 11% of cases, respectively, during applicator placement. The prescription dose was delivered in 42 of 44 treated patients. Conclusions: Early stage breast cancer can be treated with breast conserving surgery and APBI using electronic brachytherapy. Treatment was well tolerated, and these early outcomes were similar to the early outcomes with iridium-based balloon brachytherapy.
Effect of aliskiren on post-discharge outcomes among diabetic and non-diabetic patients hospitalized for heart failure: insights from the ASTRONAUT trial
Aims The objective of the Aliskiren Trial on Acute Heart Failure Outcomes (ASTRONAUT) was to determine whether aliskiren, a direct renin inhibitor, would improve post-discharge outcomes in patients with hospitalization for heart failure (HHF) with reduced ejection fraction. Pre-specified subgroup analyses suggested potential heterogeneity in post-discharge outcomes with aliskiren in patients with and without baseline diabetes mellitus (DM). Methods and results ASTRONAUT included 953 patients without DM (aliskiren 489; placebo 464) and 662 patients with DM (aliskiren 319; placebo 343) (as reported by study investigators). Study endpoints included the first occurrence of cardiovascular death or HHF within 6 and 12 months, all-cause death within 6 and 12 months, and change from baseline in N-terminal pro-B-type natriuretic peptide (NT-proBNP) at 1, 6, and 12 months. Data regarding risk of hyperkalaemia, renal impairment, and hypotension, and changes in additional serum biomarkers were collected. The effect of aliskiren on cardiovascular death or HHF within 6 months (primary endpoint) did not significantly differ by baseline DM status (P = 0.08 for interaction), but reached statistical significance at 12 months (non-DM: HR: 0.80, 95% CI: 0.64-0.99; DM: HR: 1.16, 95% CI: 0.91-1.47; P = 0.03 for interaction). Risk of 12-month all-cause death with aliskiren significantly differed by the presence of baseline DM (non-DM: HR: 0.69, 95% CI: 0.50-0.94; DM: HR: 1.64, 95% CI: 1.15-2.33; P < 0.01 for interaction). Among non-diabetics, aliskiren significantly reduced NT-proBNP through 6 months and plasma troponin I and aldosterone through 12 months, as compared to placebo. Among diabetic patients, aliskiren reduced plasma troponin I and aldosterone relative to placebo through 1 month only. 
There was a trend towards a differing risk of post-baseline potassium ≥6 mmol/L with aliskiren by underlying DM status (non-DM: HR: 1.17, 95% CI: 0.71-1.93; DM: HR: 2.39, 95% CI: 1.30-4.42; P = 0.07 for interaction). Conclusion This pre-specified subgroup analysis from the ASTRONAUT trial generates the hypothesis that the addition of aliskiren to standard HHF therapy in non-diabetic patients is generally well tolerated and improves post-discharge outcomes and biomarker profiles. In contrast, diabetic patients receiving aliskiren appear to have worse post-discharge outcomes. Future prospective investigations are needed to confirm potential benefits of renin inhibition in a large cohort of HHF patients without DM.
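The subgroup interaction P-values quoted above can be illustrated with the standard z-test for the difference between two independent log hazard ratios (recovering each standard error from its 95% CI). This is a minimal sketch of that method, not the trial's actual analysis code; the numbers plugged in are the 12-month all-cause death estimates reported above:

```python
import math

def se_from_ci(lo, hi, z=1.959964):
    """Standard error of a log hazard ratio recovered from its 95% CI."""
    return (math.log(hi) - math.log(lo)) / (2 * z)

def interaction_test(hr1, ci1, hr2, ci2):
    """Two-sided z-test for the difference between two independent log HRs."""
    diff = math.log(hr2) - math.log(hr1)
    se = math.hypot(se_from_ci(*ci1), se_from_ci(*ci2))
    z = diff / se
    p = math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value from |z|
    return z, p

# 12-month all-cause death: non-DM HR 0.69 (0.50-0.94) vs DM HR 1.64 (1.15-2.33)
z, p = interaction_test(0.69, (0.50, 0.94), 1.64, (1.15, 2.33))
```

Running this gives p well below 0.01, consistent with the reported "P < 0.01 for interaction"; the published value may differ slightly because the trial's model adjusts for covariates.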
Design and baseline characteristics of the finerenone in reducing cardiovascular mortality and morbidity in diabetic kidney disease trial
Background: Among people with diabetes, those with kidney disease have exceptionally high rates of cardiovascular (CV) morbidity and mortality and progression of their underlying kidney disease. Finerenone is a novel, nonsteroidal, selective mineralocorticoid receptor antagonist that has been shown to reduce albuminuria in patients with type 2 diabetes (T2D) and chronic kidney disease (CKD), with only a low risk of hyperkalemia. However, the effect of finerenone on CV and renal outcomes has not yet been investigated in long-term trials.
Patients and Methods: The Finerenone in Reducing CV Mortality and Morbidity in Diabetic Kidney Disease (FIGARO-DKD) trial aims to assess the efficacy and safety of finerenone compared with placebo in reducing clinically important CV and renal outcomes in T2D patients with CKD. FIGARO-DKD is a randomized, double-blind, placebo-controlled, parallel-group, event-driven trial running in 47 countries with an expected duration of approximately 6 years. FIGARO-DKD randomized 7,437 patients with an estimated glomerular filtration rate ≥25 mL/min/1.73 m² and albuminuria (urinary albumin-to-creatinine ratio ≥30 to ≤5,000 mg/g). The study has at least 90% power to detect a 20% reduction in the risk of the primary outcome (overall two-sided significance level alpha = 0.05), the composite of time to first occurrence of CV death, nonfatal myocardial infarction, nonfatal stroke, or hospitalization for heart failure.
Conclusions: FIGARO-DKD will determine whether an optimally treated cohort of T2D patients with CKD at high risk of CV and renal events will experience cardiorenal benefits with the addition of finerenone to their treatment regimen.
Trial Registration: EudraCT number: 2015-000950-39; ClinicalTrials.gov identifier: NCT02545049
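A power statement like the one above (90% power for a 20% risk reduction at two-sided alpha = 0.05) corresponds, under the standard Schoenfeld approximation for a log-rank test, to a fixed number of primary-outcome events. A hedged sketch of that calculation follows; the trial's actual sample-size derivation may incorporate additional assumptions (accrual, dropout, event rates) not shown here:

```python
import math
from statistics import NormalDist

def required_events(hr, alpha=0.05, power=0.90, alloc=0.5):
    """Schoenfeld approximation: events needed for a two-sided log-rank test.

    hr    : hazard ratio under the alternative (0.80 = 20% risk reduction)
    alloc : proportion randomized to the active arm (0.5 for 1:1 allocation)
    """
    z_a = NormalDist().inv_cdf(1 - alpha / 2)  # critical value for alpha
    z_b = NormalDist().inv_cdf(power)          # critical value for power
    return (z_a + z_b) ** 2 / (alloc * (1 - alloc) * math.log(hr) ** 2)

events = required_events(0.80)  # roughly 840-850 primary-outcome events
```

The event count, not the number randomized, drives power in an event-driven design, which is why such trials run until a prespecified number of primary-outcome events has accrued.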
Impact of opioid-free analgesia on pain severity and patient satisfaction after discharge from surgery: multispecialty, prospective cohort study in 25 countries
Background: Balancing opioid stewardship and the need for adequate analgesia following discharge after surgery is challenging. This study aimed to compare the outcomes for patients discharged with opioid versus opioid-free analgesia after common surgical procedures. Methods: This international, multicentre, prospective cohort study collected data from patients undergoing common acute and elective general surgical, urological, gynaecological, and orthopaedic procedures. The primary outcomes were patient-reported time in severe pain measured on a numerical analogue scale from 0 to 100% and patient-reported satisfaction with pain relief during the first week following discharge. Data were collected by in-hospital chart review and patient telephone interview 1 week after discharge. Results: The study recruited 4273 patients from 144 centres in 25 countries; 1311 patients (30.7%) were prescribed opioid analgesia at discharge. Patients reported being in severe pain for 10 (i.q.r. 1-30)% of the first week after discharge and rated satisfaction with analgesia as 90 (i.q.r. 80-100) of 100. After adjustment for confounders, opioid analgesia on discharge was independently associated with increased pain severity (risk ratio 1.52, 95% c.i. 1.31 to 1.76; P < 0.001) and re-presentation to healthcare providers owing to side-effects of medication (OR 2.38, 95% c.i. 1.36 to 4.17; P = 0.004), but not with satisfaction with analgesia (beta coefficient 0.92, 95% c.i. -1.52 to 3.36; P = 0.468), compared with opioid-free analgesia. Although opioid prescribing varied greatly between high-income and low- and middle-income countries, patient-reported outcomes did not. Conclusion: Opioid analgesia prescription on surgical discharge is associated with a higher risk of re-presentation owing to side-effects of medication and increased patient-reported pain, but not with changes in patient-reported satisfaction. Opioid-free discharge analgesia should be adopted routinely.
The evolving SARS-CoV-2 epidemic in Africa: Insights from rapidly expanding genomic surveillance
INTRODUCTION
Investment in severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) sequencing in Africa over the past year has led to a massive increase in the number of sequences generated to track the pandemic on the continent, which to date exceeds 100,000. These sequences have profoundly affected how public health officials in Africa have navigated the COVID-19 pandemic.
RATIONALE
We demonstrate how the first 100,000 SARS-CoV-2 sequences from Africa have helped monitor the epidemic on the continent, how genomic surveillance expanded over the course of the pandemic, and how we adapted our sequencing methods to deal with an evolving virus. Finally, we also examine how viral lineages have spread across the continent in a phylogeographic framework to gain insights into the underlying temporal and spatial transmission dynamics for several variants of concern (VOCs).
RESULTS
Our results indicate that the number of countries in Africa that can sequence the virus within their own borders is growing and that this is coupled with a shorter turnaround time from sampling to sequence submission. Ongoing evolution necessitated the continual updating of primer sets, and, as a result, eight primer sets were designed in tandem with viral evolution and used to ensure effective sequencing of the virus. The pandemic unfolded through multiple waves of infection that were each driven by distinct genetic lineages, with B.1-like ancestral strains associated with the first pandemic wave of infections in 2020. Successive waves on the continent were fueled by different VOCs, with Alpha and Beta cocirculating in distinct spatial patterns during the second wave and Delta and Omicron affecting the whole continent during the third and fourth waves, respectively. Phylogeographic reconstruction points toward distinct differences in viral importation and exportation patterns associated with the Alpha, Beta, Delta, and Omicron variants and subvariants, when considering both Africa versus the rest of the world and viral dissemination within the continent. Our epidemiological and phylogenetic inferences therefore underscore the heterogeneous nature of the pandemic on the continent and highlight key insights and challenges, for instance the limitations imposed by low testing proportions. We also highlight the early warning capacity that genomic surveillance in Africa has provided for the rest of the world through the detection of new lineages and variants, the most recent being the characterization of various Omicron subvariants.
CONCLUSION
Sustained investment in diagnostics and genomic surveillance in Africa is needed as the virus continues to evolve. This is important not only to help combat SARS-CoV-2 on the continent but also because it can serve as a platform to help address the many emerging and re-emerging infectious disease threats in Africa. In particular, capacity building for local sequencing within countries or within the continent should be prioritized, because it is generally associated with shorter turnaround times, providing the most benefit to local public health authorities tasked with pandemic response and mitigation and allowing for the fastest reaction to localized outbreaks. These investments are crucial for pandemic preparedness and response and will serve the health of the continent well into the 21st century.