
    Cardiovascular outcomes and trends of Transcatheter vs. Surgical aortic valve replacement among octogenarians with heart failure: A Propensity Matched national cohort analysis.

    Background: Heart failure (HF) is a complex clinical syndrome with symptoms and signs that result from any structural or functional impairment of ventricular filling or ejection of blood. Limited data are available on the in-hospital outcomes of TAVR compared with SAVR in the octogenarian population with HF. Methods: The National Inpatient Sample (NIS) database was used to compare TAVR versus SAVR among octogenarians with HF. The primary outcome was in-hospital mortality. Secondary outcomes included acute kidney injury (AKI), cerebrovascular accident (CVA), post-procedural stroke, major bleeding, blood transfusions, sudden cardiac arrest (SCA), cardiogenic shock (CS), and mechanical circulatory support (MCS). Results: A total of 74,995 octogenarian patients with HF (TAVR-HF n = 64,890 (86.5%); SAVR-HF n = 10,105 (13.5%)) were included. The median age of patients in TAVR-HF and SAVR-HF was 86 (83-89) and 82 (81-84) years, respectively. TAVR-HF had lower in-hospital mortality (1.8% vs. 6.9%; p < 0.001), CVA (2.5% vs. 3.6%; p = 0.009), SCA (9.9% vs. 20.2%; p < 0.001), AKI (17.4% vs. 40.8%; p < 0.001), major transfusion (26.4% vs. 67.3%; p < 0.001), CS (1.8% vs. 9.8%; p < 0.001), and MCS (0.8% vs. 7.3%; p < 0.001) than SAVR-HF. Post-procedural stroke and major bleeding showed no significant difference. The median unmatched total charges for TAVR-HF and SAVR-HF were $194,561 and $246,100, respectively. Conclusion: In this nationwide observational analysis, TAVR is associated with an improved safety profile for octogenarians with heart failure (both preserved and reduced ejection fraction) compared with SAVR.
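
    The matching procedure itself is not detailed in the abstract; a minimal sketch of 1:1 nearest-neighbour propensity score matching on synthetic data (the covariates, names and figures below are illustrative assumptions, not values from the NIS analysis) might look like:

```python
# Illustrative 1:1 nearest-neighbour propensity score matching (synthetic data).
# Covariate names and values are hypothetical, not from the NIS study.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)
n = 1000
X = rng.normal(size=(n, 3))                             # e.g. age, comorbidity score, sex
treated = rng.binomial(1, 1 / (1 + np.exp(-X[:, 0])))   # TAVR vs. SAVR indicator

# 1. Estimate propensity scores: P(treatment | covariates).
ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

# 2. Match each treated unit to the control with the nearest propensity score.
treated_idx = np.where(treated == 1)[0]
control_idx = np.where(treated == 0)[0]
nn = NearestNeighbors(n_neighbors=1).fit(ps[control_idx].reshape(-1, 1))
_, match = nn.kneighbors(ps[treated_idx].reshape(-1, 1))
matched_controls = control_idx[match.ravel()]

# 3. Compare outcomes (here a synthetic mortality flag) within the matched cohort.
mortality = rng.binomial(1, 0.05, size=n)
print("treated:", mortality[treated_idx].mean(),
      "matched controls:", mortality[matched_controls].mean())
```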

    Is silicon a panacea for alleviating drought and salt stress in crops?

    Salinity affects around 20% of all arable land, while an even larger area suffers from recurrent drought. Together these stresses suppress global crop production by as much as 50%, and their impacts are predicted to be exacerbated by climate change. Infrastructure and management practices can mitigate these detrimental impacts, but are costly. Crop breeding for improved tolerance has had some success but is progressing slowly and is not keeping pace with climate change. In contrast, silicon (Si) is known to improve plant tolerance to a range of stresses and could provide a sustainable, rapid and cost-effective mitigation method. The exact mechanisms are still under debate, but it appears Si can relieve salt stress via accumulation in the root apoplast, where it reduces "bypass flow" of ions to the shoot. Si-dependent drought relief has been linked to lowered root hydraulic conductance and reduction of water loss through transpiration. However, many alternative mechanisms may play a role, such as altered gene expression and increased accumulation of compatible solutes. Oxidative damage that occurs under stress conditions can be reduced by Si through increased antioxidative enzyme activity, and Si-improved photosynthesis has also been reported. Si fertilizer can be produced relatively cheaply; to assess its economic viability for improving crop stress tolerance, we present a cost-benefit analysis. It suggests that Si fertilization may be beneficial in many agronomic settings but may be beyond the means of smallholder farmers in developing countries. Si application may also have disadvantages, such as increased soil pH, less efficient conversion of crops into biofuel and reduced digestibility of animal fodder. These issues may hamper uptake of Si fertilization as a routine agronomic practice. Here, we critically evaluate recent literature, quantifying the most significant physiological changes associated with Si in plants under drought and salinity stress. Analyses show that metrics associated with photosynthesis, water balance and oxidative stress all improve when Si is present during plant exposure to salinity and drought. We further conclude that most of these changes can be explained by apoplastic roles of Si, while there is as yet little evidence to support biochemical roles of this element.
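
    To illustrate the shape of the cost-benefit comparison described above, here is a minimal sketch in which every figure is an assumed placeholder rather than a number from the paper:

```python
# Hypothetical cost-benefit sketch for Si fertilization; every figure below
# is an assumed placeholder, not a value reported in the paper.
si_cost_per_ha = 120.0          # assumed cost of Si fertilizer + application (USD/ha)
baseline_yield = 4.0            # assumed stressed yield without Si (t/ha)
yield_gain_fraction = 0.15      # assumed relative yield improvement with Si
crop_price_per_t = 250.0        # assumed farm-gate price (USD/t)

extra_revenue = baseline_yield * yield_gain_fraction * crop_price_per_t
net_benefit = extra_revenue - si_cost_per_ha
print(f"extra revenue: {extra_revenue:.0f} USD/ha, net benefit: {net_benefit:.0f} USD/ha")
# A positive net benefit favours application; smallholders facing higher input
# costs or lower prices may see this margin disappear, as the abstract notes.
```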

    Formation of emotional culture as a component of students' innovative culture

    Homozygosity has long been associated with rare, often devastating, Mendelian disorders [1] and Darwin was one of the first to recognise that inbreeding reduces evolutionary fitness [2]. However, the effect of the more distant parental relatedness common in modern human populations is less well understood. Genomic data now allow us to investigate the effects of homozygosity on traits of public health importance by observing contiguous homozygous segments (runs of homozygosity, ROH), which are inferred to be homozygous along their complete length. Given the low levels of genome-wide homozygosity prevalent in most human populations, information is required on very large numbers of people to provide sufficient power [3,4]. Here we use ROH to study 16 health-related quantitative traits in 354,224 individuals from 102 cohorts and find statistically significant associations between summed runs of homozygosity (SROH) and four complex traits: height, forced expiratory lung volume in 1 second (FEV1), general cognitive ability (g) and educational attainment (nominal p < 1 × 10⁻³⁰⁰, 2.1 × 10⁻⁶, 2.5 × 10⁻¹⁰ and 1.8 × 10⁻¹⁰, respectively). In each case increased homozygosity was associated with a decreased trait value, equivalent to the offspring of first cousins being 1.2 cm shorter and having 10 months less education. Similar effect sizes were found across four continental groups and populations with different degrees of genome-wide homozygosity, providing convincing evidence for the first time that homozygosity, rather than confounding, directly contributes to phenotypic variance. Contrary to earlier reports in substantially smaller samples [5,6], no evidence was seen of an influence of genome-wide homozygosity on blood pressure and low-density lipoprotein (LDL) cholesterol, or ten other cardio-metabolic traits. Since directional dominance is predicted for traits under directional evolutionary selection [7], this study provides evidence that increased stature and cognitive function have been positively selected in human evolution, whereas many important risk factors for late-onset complex diseases may not have been.
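
    As a hedged illustration of the SROH metric and the trait regression it feeds, assuming synthetic ROH calls and trait values (nothing below comes from the study data):

```python
# Minimal sketch: sum runs of homozygosity (SROH) per individual and regress a
# trait on SROH. Segments, thresholds and traits are synthetic assumptions.
import numpy as np

def sroh(segments, min_length_mb=1.5):
    """Total length (Mb) of homozygous segments at or above a minimum length."""
    return sum(end - start for start, end in segments if end - start >= min_length_mb)

# Hypothetical ROH calls: list of (start_mb, end_mb) segments per individual.
individuals = [
    [(10.0, 14.0), (50.0, 50.5)],   # the 4.0 Mb segment qualifies, 0.5 Mb does not
    [(3.0, 12.0)],                  # 9.0 Mb
    [],                             # no ROH
]
x = np.array([sroh(segs) for segs in individuals])

# Synthetic trait values (e.g. height in cm); fit trait = a + b * SROH.
y = np.array([169.0, 166.5, 171.0])
b, a = np.polyfit(x, y, 1)
print(f"SROH: {x}, slope per Mb of homozygosity: {b:.3f}")  # negative slope expected
```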

    Prognostic model to predict postoperative acute kidney injury in patients undergoing major gastrointestinal surgery based on a national prospective observational cohort study.

    Background: Acute illness, existing co-morbidities and the surgical stress response can all contribute to postoperative acute kidney injury (AKI) in patients undergoing major gastrointestinal surgery. The aim of this study was to prospectively develop a pragmatic prognostic model to stratify patients according to their risk of developing AKI after major gastrointestinal surgery. Methods: This prospective multicentre cohort study included consecutive adults undergoing elective or emergency gastrointestinal resection, liver resection or stoma reversal in 2-week blocks over a continuous 3-month period. The primary outcome was the rate of AKI within 7 days of surgery. Bootstrap stability was used to select clinically plausible risk factors for the model, and the model was validated internally by bootstrapping. Results: A total of 4544 patients were included across 173 centres in the UK and Ireland. The overall rate of AKI was 14·2 per cent (646 of 4544) and the 30-day mortality rate was 1·8 per cent (84 of 4544). Stage 1 AKI was significantly associated with 30-day mortality (unadjusted odds ratio 7·61, 95 per cent c.i. 4·49 to 12·90; P < 0·001), with increasing odds of death with each AKI stage. Six variables were selected for inclusion in the prognostic model: age, sex, ASA grade, preoperative estimated glomerular filtration rate, planned open surgery and preoperative use of either an angiotensin-converting enzyme inhibitor or an angiotensin receptor blocker. Internal validation demonstrated good model discrimination (c-statistic 0·65). Discussion: Following major gastrointestinal surgery, AKI occurred in one in seven patients. This preoperative prognostic model identified patients at high risk of postoperative AKI. Validation in an independent data set is required to ensure generalizability.
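
    The abstract reports internal bootstrap validation and a c-statistic of 0·65; a simplified sketch of optimism-corrected bootstrap validation for a logistic model, on synthetic data with assumed covariates, could look like:

```python
# Sketch of optimism-corrected bootstrap validation of a logistic model's
# c-statistic (AUC). Data are synthetic; this simplifies the full procedure.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n = 2000
X = rng.normal(size=(n, 4))          # e.g. age, eGFR, ASA grade, open surgery
y = rng.binomial(1, 1 / (1 + np.exp(-(X[:, 0] - 1.8))))  # synthetic AKI flag

model = LogisticRegression().fit(X, y)
apparent = roc_auc_score(y, model.predict_proba(X)[:, 1])

optimism = []
for _ in range(200):
    idx = rng.integers(0, n, size=n)             # bootstrap resample
    m = LogisticRegression().fit(X[idx], y[idx])
    auc_boot = roc_auc_score(y[idx], m.predict_proba(X[idx])[:, 1])
    auc_orig = roc_auc_score(y, m.predict_proba(X)[:, 1])
    optimism.append(auc_boot - auc_orig)         # how much the resample flatters the fit

print(f"apparent c-statistic {apparent:.3f}, "
      f"optimism-corrected {apparent - np.mean(optimism):.3f}")
```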

    Antiinflammatory Therapy with Canakinumab for Atherosclerotic Disease

    Background: Experimental and clinical data suggest that reducing inflammation without affecting lipid levels may reduce the risk of cardiovascular disease. Yet, the inflammatory hypothesis of atherothrombosis has remained unproved. Methods: We conducted a randomized, double-blind trial of canakinumab, a therapeutic monoclonal antibody targeting interleukin-1β, involving 10,061 patients with previous myocardial infarction and a high-sensitivity C-reactive protein level of 2 mg or more per liter. The trial compared three doses of canakinumab (50 mg, 150 mg, and 300 mg, administered subcutaneously every 3 months) with placebo. The primary efficacy end point was nonfatal myocardial infarction, nonfatal stroke, or cardiovascular death. Results: At 48 months, the median reduction from baseline in the high-sensitivity C-reactive protein level was 26 percentage points greater in the group that received the 50-mg dose of canakinumab, 37 percentage points greater in the 150-mg group, and 41 percentage points greater in the 300-mg group than in the placebo group. Canakinumab did not reduce lipid levels from baseline. At a median follow-up of 3.7 years, the incidence rate for the primary end point was 4.50 events per 100 person-years in the placebo group, 4.11 events per 100 person-years in the 50-mg group, 3.86 events per 100 person-years in the 150-mg group, and 3.90 events per 100 person-years in the 300-mg group. The hazard ratios as compared with placebo were as follows: in the 50-mg group, 0.93 (95% confidence interval [CI], 0.80 to 1.07; P = 0.30); in the 150-mg group, 0.85 (95% CI, 0.74 to 0.98; P = 0.021); and in the 300-mg group, 0.86 (95% CI, 0.75 to 0.99; P = 0.031). The 150-mg dose, but not the other doses, met the prespecified multiplicity-adjusted threshold for statistical significance for the primary end point and the secondary end point that additionally included hospitalization for unstable angina that led to urgent revascularization (hazard ratio vs. placebo, 0.83; 95% CI, 0.73 to 0.95; P = 0.005). Canakinumab was associated with a higher incidence of fatal infection than was placebo. There was no significant difference in all-cause mortality (hazard ratio for all canakinumab doses vs. placebo, 0.94; 95% CI, 0.83 to 1.06; P = 0.31). Conclusions: Antiinflammatory therapy targeting the interleukin-1β innate immunity pathway with canakinumab at a dose of 150 mg every 3 months led to a significantly lower rate of recurrent cardiovascular events than placebo, independent of lipid-level lowering. (Funded by Novartis; CANTOS ClinicalTrials.gov number, NCT01327846.)
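
    As a small worked check of the person-time arithmetic, the crude rate ratios implied by the reported event rates can be recomputed and compared against the published hazard ratios:

```python
# Worked check of the rate arithmetic in the abstract. The event rates are the
# reported values; the crude rate ratio only approximates a Cox hazard ratio.
placebo_rate = 4.50   # events per 100 person-years (reported)
dose_rates = {"50 mg": 4.11, "150 mg": 3.86, "300 mg": 3.90}

for dose, rate in dose_rates.items():
    # Crude incidence-rate ratio vs. placebo; ignores censoring and covariates.
    print(f"{dose}: rate ratio {rate / placebo_rate:.2f}")
# Prints 0.91, 0.86, 0.87 - close to the reported HRs of 0.93, 0.85, 0.86.
```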

    Cohort Profile: Post-Hospitalisation COVID-19 (PHOSP-COVID) study


    The impact of surgical delay on resectability of colorectal cancer: An international prospective cohort study

    AIM: The SARS-CoV-2 pandemic has provided a unique opportunity to explore the impact of surgical delays on cancer resectability. This study aimed to compare resectability for colorectal cancer patients undergoing delayed versus non-delayed surgery. METHODS: This was an international prospective cohort study of consecutive colorectal cancer patients with a decision for curative surgery (January-April 2020). Surgical delay was defined as an operation taking place more than 4 weeks after the treatment decision, in a patient who did not receive neoadjuvant therapy. A subgroup analysis explored the effects of delay in elective patients only. The impact of longer delays was explored in a sensitivity analysis. The primary outcome was complete resection, defined as curative resection with an R0 margin. RESULTS: Overall, 5453 patients from 304 hospitals in 47 countries were included, of whom 6.6% (358/5453) did not receive their planned operation. Of the 4304 operated patients without neoadjuvant therapy, 40.5% (1744/4304) were delayed beyond 4 weeks. Delayed patients were more likely to be older, male and comorbid, to have a higher body mass index, and to have rectal cancer and early-stage disease. Delayed patients had higher unadjusted rates of complete resection (93.7% vs. 91.9%, P = 0.032) and lower rates of emergency surgery (4.5% vs. 22.5%, P < 0.001). After adjustment, delay was not associated with a lower rate of complete resection (OR 1.18, 95% CI 0.90-1.55, P = 0.224), a finding that was consistent in elective patients only (OR 0.94, 95% CI 0.69-1.27, P = 0.672). Longer delays were not associated with poorer outcomes. CONCLUSION: One in 15 colorectal cancer patients did not receive their planned operation during the first wave of COVID-19. Surgical delay did not appear to compromise resectability, raising the hypothesis that any reduction in long-term survival attributable to delays is likely to be due to micro-metastatic disease.
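
    For illustration, the unadjusted comparison of complete resection rates reduces to a 2×2 odds ratio; reconstructing approximate counts from the reported group sizes and percentages (rounding applies) gives:

```python
# Unadjusted odds ratio for complete resection, delayed vs. non-delayed surgery,
# reconstructed from the reported percentages and group sizes (approximate).
delayed_n, delayed_rate = 1744, 0.937          # reported
nondelayed_n, nondelayed_rate = 4304 - 1744, 0.919

a = delayed_rate * delayed_n                   # delayed, complete resection
b = delayed_n - a                              # delayed, incomplete
c = nondelayed_rate * nondelayed_n             # non-delayed, complete
d = nondelayed_n - c                           # non-delayed, incomplete

crude_or = (a / b) / (c / d)
print(f"crude OR {crude_or:.2f}")              # ~1.31 before adjustment
# The reported OR of 1.18 (95% CI 0.90-1.55) is adjusted for case mix, which is
# why it differs from this crude figure.
```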

    Intraperitoneal drain placement and outcomes after elective colorectal surgery: international matched, prospective, cohort study

    Despite current guidelines, intraperitoneal drain placement after elective colorectal surgery remains widespread. Drains were not associated with earlier detection of intraperitoneal collections, but were associated with prolonged hospital stay and an increased risk of surgical-site infections. Background: Many surgeons routinely place intraperitoneal drains after elective colorectal surgery. However, enhanced recovery after surgery guidelines recommend against their routine use owing to a lack of clear clinical benefit. This study aimed to describe international variation in intraperitoneal drain placement and the safety of this practice. Methods: COMPASS (COMPlicAted intra-abdominal collectionS after colorectal Surgery) was a prospective, international cohort study that enrolled consecutive adults undergoing elective colorectal surgery (February to March 2020). The primary outcome was the rate of intraperitoneal drain placement. Secondary outcomes included: rate and time to diagnosis of postoperative intraperitoneal collections; rate of surgical-site infections (SSIs); time to discharge; and 30-day major postoperative complications (Clavien-Dindo grade III or higher). After propensity score matching, multivariable logistic regression and Cox proportional hazards regression were used to estimate the independent association of the secondary outcomes with drain placement. Results: Overall, 1805 patients from 22 countries were included (798 women, 44.2 per cent; median age 67.0 years). The drain insertion rate was 51.9 per cent (937 patients). After matching, drains were not associated with reduced rates (odds ratio (OR) 1.33, 95 per cent c.i. 0.79 to 2.23; P = 0.287) or earlier detection (hazard ratio (HR) 0.87, 0.33 to 2.31; P = 0.780) of collections. Although not associated with worse major postoperative complications (OR 1.09, 0.68 to 1.75; P = 0.709), drains were associated with delayed hospital discharge (HR 0.58, 0.52 to 0.66; P < 0.001) and an increased risk of SSIs (OR 2.47, 1.50 to 4.05; P < 0.001). Conclusion: Intraperitoneal drain placement after elective colorectal surgery is not associated with earlier detection of postoperative collections, but prolongs hospital stay and increases SSI risk.
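
    The time-to-discharge analysis uses Cox proportional hazards regression; a minimal sketch with the lifelines library on synthetic data (column names and effect sizes are assumptions, not the COMPASS dataset) might be:

```python
# Sketch: Cox proportional hazards model of time to discharge with drain
# placement as the exposure. Synthetic data; column names are assumptions.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 500
drain = rng.binomial(1, 0.5, size=n)
# Synthetic length of stay: drains lengthen stay, so their hazard of
# discharge is below 1 (HR < 1 means slower discharge, as in the study).
los_days = rng.exponential(scale=6 + 4 * drain)
df = pd.DataFrame({
    "los_days": los_days,
    "discharged": 1,            # assume every patient was eventually discharged
    "drain": drain,
})

cph = CoxPHFitter()
cph.fit(df, duration_col="los_days", event_col="discharged")
print(cph.hazard_ratios_)       # expect drain HR < 1, i.e. delayed discharge
```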