Defining Quality Indicators for Breast Device Surgery: Using Registries for Global Benchmarking
Background: Breast device registries monitor devices, including breast implants, tissue expanders and dermal matrices, as well as the quality of care and patient outcomes for breast device surgery. Defining a standard set of quality indicators and risk-adjustment factors will enable consistent, case-mix-adjusted benchmarking of quality of care across breast implant registries. This study aimed to develop a set of quality indicators, applicable globally, to enable assessment and reporting of quality of care for breast device surgery. Methods: A scoping literature review was undertaken and potential quality indicators were identified. Consensus on the final list of quality indicators was obtained using a modified Delphi approach, involving a series of online surveys and teleconferences over 6 months. The Delphi panel included participants from various countries and representation from surgical specialty groups, including breast and general surgeons, plastic and reconstructive surgeons, and cosmetic surgeons, as well as a breast-care nurse, a consumer, a devices regulator (Therapeutic Goods Administration), and a biostatistician. A total of 12 candidate indicators were proposed: intraoperative antibiotic wash, intraoperative antiseptic wash, preoperative antibiotics, nipple shields, surgical plane, volume of implant, funnels, immediate versus delayed reconstruction, time to revision, reoperation due to complications, patient satisfaction, and volume of activity. Results: Three of the 12 proposed indicators were endorsed by the panel: preoperative intravenous antibiotics, reoperation due to complication, and patient-reported outcome measures. Conclusion: The 3 endorsed quality indicators will enable breast device registries to standardize benchmarking of care internationally for patients undergoing breast device surgery.
Finishing the euchromatic sequence of the human genome
The sequence of the human genome encodes the genetic instructions for human physiology, as well as rich information about human evolution. In 2001, the International Human Genome Sequencing Consortium reported a draft sequence of the euchromatic portion of the human genome. Since then, the international collaboration has worked to convert this draft into a genome sequence with high accuracy and nearly complete coverage. Here, we report the result of this finishing process. The current genome sequence (Build 35) contains 2.85 billion nucleotides interrupted by only 341 gaps. It covers ∼99% of the euchromatic genome and is accurate to an error rate of ∼1 event per 100,000 bases. Many of the remaining euchromatic gaps are associated with segmental duplications and will require focused work with new methods. The near-complete sequence, the first for a vertebrate, greatly improves the precision of biological analyses of the human genome, including studies of gene number, birth and death. Notably, the human genome seems to encode only 20,000-25,000 protein-coding genes. The genome sequence reported here should serve as a firm foundation for biomedical research in the decades ahead.
Plasma Angiotensin Converting Enzyme 2 (ACE2) Activity in Healthy Controls and Patients with Cardiovascular Risk Factors and/or Disease
Angiotensin converting enzyme 2 (ACE2) is an endogenous negative regulator of the renin-angiotensin system, a key factor in the development of cardiovascular disease (CVD). ACE2 is also used by SARS-CoV-2 for host cell entry. Given that COVID-19 is associated with hypercoagulability, it is timely to explore the potential relationship between plasma ACE2 activity and the coagulation profile. In this cross-sectional study, ACE2 activity and global coagulation assays (GCA), including thromboelastography and thrombin and fibrin generation, were measured in adult healthy controls (n = 123; mean age 41 ± 17 years; 35% male) and in patients with cardiovascular risk factors and/or disease (n = 258; mean age 65 ± 14 years; 55% male). ACE2 activity was significantly lower in controls compared to patients with cardiovascular risk factors and/or disease (median 0.10 (0.02, 3.33) vs. 5.99 (1.95, 10.37) pmol/mL/min, p < 0.001). Of the healthy controls, 48% had undetectable ACE2 activity. Controls with detectable ACE2 had lower maximum amplitude (p < 0.001). In patients with cardiovascular risk factors and/or disease, those in the 3rd tertile were older and more frequently male (p = 0.002), with a higher Framingham grade and an increased number of cardiovascular risk factors (p < 0.001). In conclusion, plasma ACE2 activity is undetectable to very low in young healthy controls, with minimal clinically relevant associations with GCA. Patients with cardiovascular risk factors and/or disease have increased plasma ACE2 activity, suggesting that it may be an important biomarker of endothelial dysfunction and atherosclerosis.
Characterising risk of in-hospital mortality following cardiac arrest using machine learning: A retrospective international registry study
Background: Resuscitated cardiac arrest is associated with high mortality; however, the ability to estimate the risk of adverse outcomes using existing illness severity scores is limited. Using in-hospital data available within the first 24 hours of admission, we aimed to develop more accurate models of risk prediction using both logistic regression (LR) and machine learning (ML) techniques, with a combination of demographic, physiologic, and biochemical information. Methods and findings: Patient-level data were extracted from the Australian and New Zealand Intensive Care Society (ANZICS) Adult Patient Database for patients who had experienced a cardiac arrest within 24 hours prior to admission to an intensive care unit (ICU) during the period January 2006 to December 2016. The primary outcome was in-hospital mortality. The models were trained and tested on a dataset (split 90:10) including age, lowest and highest physiologic variables during the first 24 hours, and key past medical history. LR and 5 ML approaches (gradient boosting machine [GBM], support vector classifier [SVC], random forest [RF], artificial neural network [ANN], and an ensemble) were compared to the APACHE III and Australian and New Zealand Risk of Death (ANZROD) predictions. In all, 39,566 patients from 186 ICUs were analysed. Mean (±SD) age was 61 ± 17 years; 65% were male. Overall in-hospital mortality was 45.5%. Models were evaluated in the test set. The APACHE III and ANZROD scores demonstrated good discrimination (area under the receiver operating characteristic curve [AUROC] = 0.80 [95% CI 0.79–0.82] and 0.81 [95% CI 0.80–0.82], respectively) and modest calibration (Brier score 0.19 for both), which was slightly improved by LR (AUROC = 0.82 [95% CI 0.81–0.83], DeLong test, p < 0.001). Discrimination was significantly improved using ML models (ensemble and GBM AUROCs = 0.87 [95% CI 0.86–0.88], DeLong test, p < 0.001), with an improvement in performance (Brier score reduction of 22%). Explainability models were created to assist in identifying the physiologic features that most contributed to an individual patient's survival. Key limitations include the absence of pre-hospital data and the absence of external validation. Conclusions: ML approaches significantly enhance predictive discrimination for mortality following cardiac arrest compared to existing illness severity scores and LR, without the use of pre-hospital data. The discriminative ability of these ML models requires validation in external cohorts to establish generalisability.
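The modelling workflow described in this abstract (a 90:10 train/test split, logistic regression alongside tree-based ML classifiers, and evaluation by AUROC and Brier score) can be illustrated with a brief sketch. This is not the authors' code: it uses synthetic data in place of the ANZICS registry and generic scikit-learn estimators as stand-ins for the reported LR, GBM and RF models.

```python
# Minimal sketch of the modelling comparison described above, not the study's pipeline.
# Synthetic data stands in for the ANZICS registry; features would correspond to age,
# lowest/highest physiologic variables in the first 24 hours, and past medical history.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.metrics import roc_auc_score, brier_score_loss

# Synthetic stand-in for patient-level features and in-hospital mortality labels.
X, y = make_classification(n_samples=5000, n_features=30, n_informative=12,
                           weights=[0.55, 0.45], random_state=0)

# 90:10 train/test split, mirroring the split reported in the abstract.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.10, stratify=y, random_state=0)

models = {
    "logistic_regression": LogisticRegression(max_iter=1000),
    "gradient_boosting": GradientBoostingClassifier(random_state=0),
    "random_forest": RandomForestClassifier(n_estimators=300, random_state=0),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    prob = model.predict_proba(X_test)[:, 1]
    # Discrimination (AUROC) and calibration (Brier score), as reported in the study.
    print(f"{name}: AUROC={roc_auc_score(y_test, prob):.3f}, "
          f"Brier={brier_score_loss(y_test, prob):.3f}")
```

The study additionally layered explainability models on top of the classifiers to attribute individual predictions to physiologic features; the sketch above covers only the discrimination and calibration comparison.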
Distinct Effects of a High Fat Diet on Bone in Skeletally Mature and Developing Male C57BL/6J Mice
Increased risks of skeletal fractures are common in patients with impaired glucose handling and type 2 diabetes mellitus (T2DM). The pathogenesis of skeletal fragility in these patients remains ill-defined, as patients present with normal to high bone mineral density. With increasing cases of glucose intolerance and T2DM, it is imperative that we develop an accurate rodent model for further investigation. We hypothesized that a high fat diet (60%) administered to developing male C57BL/6J mice that had not reached skeletal maturity would overrepresent bone microarchitectural changes, and that skeletally mature mice would better represent adult-onset glucose intolerance and the pre-diabetes phenotype. Two groups of developing (8 week) and mature (12 week) male C57BL/6J mice were placed onto either a normal chow (NC) or high fat diet (HFD) for 10 weeks. Oral glucose tolerance tests were performed throughout the study period. Long bones were excised and analysed by ex vivo biomechanical testing, micro-computed tomography, 2D histomorphometry and gene/protein expression analyses. The HFD increased fasting blood glucose and significantly reduced glucose tolerance in both age groups by week 7 of the diets. The HFD reduced biomechanical strength and both cortical and trabecular indices in the developing mice, but only affected cortical outcomes in the mature mice. Similar results were reflected in the 2D histomorphometry. Tibial gene expression revealed decreased bone formation in the HFD mice of both age groups, i.e., decreased osteocalcin expression and increased sclerostin RNA expression. In the mature mice only, while the HFD led to a non-significant reduction in runt-related transcription factor 2 (Runx2) RNA expression, this decrease became significant at the protein level in the femora. Our mature HFD mouse model more accurately represents late-onset impaired glucose tolerance/pre-T2DM cases in humans and can be used to uncover potential insights into reduced bone formation as a mechanism of skeletal fragility in these patients.
Primary tumor resection and overall survival in patients with metastatic colorectal cancer treated with palliative intent
The survival impact of primary tumor resection in patients with metastatic colorectal cancer (mCRC) treated with palliative intent remains uncertain. In the absence of randomized data, the objectives of the present study were to examine the effect of primary tumor resection (PTR) and major prognostic variables on overall survival (OS) of patients with de novo mCRC. Patients and Methods: Consecutive patients from the Australian 'Treatment of Recurrent and Advanced Colorectal Cancer' (TRACC) registry were examined from June 2009 to March 2015. Univariate and multivariate Cox proportional hazards regression analyses were used to identify associations between multiple patient or clinical variables and OS. Patients with metachronous mCRC were excluded from the analyses. Results: A total of 690 de novo and 373 metachronous mCRC patients treated with palliative intent were identified. The median follow-up period was 30 months. The median age of de novo patients was 66 years; 57% were male; 77% had an Eastern Cooperative Oncology Group performance status of 0 to 1; and 76% had a colon primary. A total of 216 de novo mCRC patients treated with palliative intent underwent PTR at diagnosis and were more likely to have a colon primary (odds ratio [OR], 15.4), a lower carcinoembryonic antigen level (OR, 2.08), and peritoneal involvement (OR, 2.58; P < .001). On multivariate analysis, PTR at diagnosis in de novo patients was not associated with significantly improved OS (hazard ratio [HR], 0.82; 99% confidence interval [CI], 0.62-1.09; P = .068). PTR at diagnosis did not correlate with outcome in de novo patients with a colon primary (HR, 0.74; 99% CI, 0.54-1.01; P = .014) or a rectal primary (HR, 0.81; 99% CI, 0.27-2.44; P = .621). Conclusion: For de novo mCRC patients treated with palliative intent, PTR at diagnosis does not significantly improve OS when adjusting for known major prognostic factors. The outcomes of randomized trials examining the survival impact of PTR are awaited.
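As a rough illustration of the multivariate Cox proportional hazards analysis described above, the sketch below fits such a model with the lifelines library. The data and column names (os_months, death, ptr_at_diagnosis, etc.) are invented for demonstration and do not come from the registry; note also that the study reported 99% confidence intervals, whereas lifelines defaults to 95%.

```python
# Illustrative sketch of a multivariate Cox proportional hazards model for overall
# survival, not the registry analysis itself; all data and column names are synthetic.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "os_months": rng.exponential(scale=24, size=n),   # follow-up time in months
    "death": rng.integers(0, 2, size=n),              # 1 = died, 0 = censored
    "ptr_at_diagnosis": rng.integers(0, 2, size=n),   # primary tumor resection
    "age": rng.normal(66, 10, size=n),
    "ecog_0_1": rng.integers(0, 2, size=n),
    "colon_primary": rng.integers(0, 2, size=n),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="os_months", event_col="death")
# Hazard ratios with confidence intervals for each covariate, including PTR.
print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%", "p"]])
```

In practice the covariates would be the known major prognostic factors from the registry, and subgroup models (colon versus rectal primary) would be fitted on the corresponding patient subsets.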
The impact of bevacizumab in metastatic colorectal cancer with an intact primary tumor: Results from a large prospective cohort study
Debate continues regarding the benefits versus risks of initial resection of the primary tumor in metastatic colorectal cancer (mCRC) patients with an asymptomatic primary tumor. Although the benefit of the anti-vascular endothelial growth factor agent bevacizumab alongside first-line chemotherapy in mCRC is established, the impact of bevacizumab on the intact primary tumor (IPT) is less well understood. Methods: Data from an Australian mCRC registry were used to assess the impact of bevacizumab-based regimens in the presence of an IPT, to see if this differs from effects in resected primary tumor (RPT) patients, and to understand the safety profile of bevacizumab in patients with an IPT. Progression-free survival (PFS), overall survival (OS) and safety endpoints were analyzed. Results: Of 1204 mCRC patients, 826 (69%) were eligible for inclusion. Bevacizumab use was similar in both arms (IPT (64%) versus RPT (70%)); compared with chemotherapy alone, bevacizumab use was associated with significantly longer PFS (IPT: 8.5 months vs 4.7 months, P = 0.017; RPT: 10.8 months vs 5.8 months, P < 0.001) and OS (IPT: 20 months vs 14.8 months, P = 0.005; RPT: 24.4 months vs 17.3 months, P = 0.004). Bevacizumab use in an IPT was associated with more GI perforations (4.5% vs 1.8%, P = 0.210) but less frequent bleeding (1.5% vs 5.3%, P = 0.050) and thrombosis (1.5% vs 2.7%, P = 0.470), versus chemotherapy alone. Median survival was equivalent between patients who did or did not experience bevacizumab-related adverse events (20.0 months versus 19.9 months, hazard ratio = 0.98, P = 0.623). Conclusions: The addition of bevacizumab significantly improved survival outcomes in mCRC with an IPT. The occurrence of bevacizumab-related adverse events did not significantly impact survival outcomes. Roche Products Pty Limited has provided financial assistance for the development, installation and maintenance of the TRACC registry.
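The PFS and OS comparisons reported above (bevacizumab-based therapy versus chemotherapy alone, with a p-value for each arm) are of the kind typically produced by Kaplan-Meier estimation and a log-rank test. The sketch below shows that pattern on simulated data using lifelines; the group sizes, event rates and medians are invented for illustration and are not registry values.

```python
# Illustrative Kaplan-Meier / log-rank comparison of PFS between treatment arms,
# on simulated data only; not the registry analysis or its actual numbers.
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(1)
n = 400
bev = rng.integers(0, 2, n)                 # 1 = bevacizumab + chemo, 0 = chemo alone
pfs = np.where(bev == 1,
               rng.exponential(8.5, n),     # longer simulated PFS in bevacizumab arm
               rng.exponential(4.7, n))
event = rng.integers(0, 2, n)               # 1 = progression observed, 0 = censored

km = KaplanMeierFitter()
for label, mask in [("bevacizumab + chemo", bev == 1), ("chemo alone", bev == 0)]:
    km.fit(pfs[mask], event_observed=event[mask], label=label)
    print(label, "median PFS:", km.median_survival_time_)

# Log-rank test for the difference in PFS between the two arms.
result = logrank_test(pfs[bev == 1], pfs[bev == 0],
                      event_observed_A=event[bev == 1],
                      event_observed_B=event[bev == 0])
print("log-rank p-value:", result.p_value)
```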