172 research outputs found
A touchdown nucleic acid amplification protocol as an alternative to culture backup for immunofluorescence in the routine diagnosis of acute viral respiratory tract infections
BACKGROUND: Immunofluorescence (IF) and virus culture are the main methods used to diagnose acute respiratory virus infections. Diagnosing these infections by nucleic acid amplification presents technical challenges, one of which is accommodating the different optimal annealing temperatures needed for each virus. To overcome this problem we developed a diagnostic molecular strip which combined a generic nested touchdown protocol with in-house primer master-mixes that could recognise 12 common respiratory viruses. RESULTS: Over an 18-month period a total of 222 specimens were tested by both IF and the molecular strip. The specimens came from 103 males (median age 3.5 y), 80 females (median age 9 y) and 5 quality assurance scheme specimens. Viruses were recovered from a number of specimen types, including broncho-alveolar lavage, nasopharyngeal secretions, sputa, post-mortem lung tissue and combined throat and nasal swabs. Viral detection by IF was poor in sputa and respiratory swabs. A total of 99 viruses were detected in the study from 79 patients and 4 quality control specimens: 31 by IF and 99 using the molecular strip. The strip consistently outperformed IF with no loss of diagnostic specificity. CONCLUSIONS: The touchdown protocol with pre-dispensed primer master-mixes was suitable for replacing virus culture in the diagnosis of respiratory viruses that were negative by IF. Results by IF were available after an average of 4–12 hours, while molecular strip results were available within 24 hours, considerably faster than viral culture. The combined strip and touchdown protocol proved to be a convenient and reliable method of testing for multiple viruses in a routine setting
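The touchdown principle this kind of protocol relies on can be sketched as an annealing-temperature schedule that starts above the primers' melting temperatures and steps down each cycle, so one generic program accommodates primer sets with different optima. The sketch below is illustrative only; the temperatures, step size and cycle counts are assumed values, not the authors' published cycling conditions.

```python
# Hypothetical touchdown PCR annealing schedule; all parameters are
# illustrative assumptions, not the study's actual cycling conditions.
def touchdown_schedule(start_temp, floor_temp, step, touchdown_cycles, plateau_cycles):
    """Return the per-cycle annealing temperature (degrees C).

    The temperature drops by `step` each cycle until it reaches
    `floor_temp`, then remains there for `plateau_cycles` cycles.
    """
    temps = []
    t = start_temp
    for _ in range(touchdown_cycles):
        temps.append(round(t, 1))
        t = max(t - step, floor_temp)
    temps.extend([floor_temp] * plateau_cycles)
    return temps

# e.g. ramp from 65 C down in 0.5 C steps, then 25 cycles at 55 C
schedule = touchdown_schedule(65.0, 55.0, 0.5, 20, 25)
```

Because early cycles anneal stringently and later cycles permissively, a single program of this shape can amplify targets whose optimal annealing temperatures differ by several degrees.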
Reducing cognitive arousal and sleep effort alleviates insomnia and depression in pregnant women with DSM-5 insomnia disorder treated with a mindfulness sleep program
OBJECTIVES: Combining mindfulness with behavioral sleep strategies has been found to alleviate symptoms of insomnia and depression during pregnancy, but mechanisms for this treatment approach remain unclear. The present study examined nocturnal cognitive arousal and sleep effort as potential treatment mechanisms for alleviating insomnia and depression via a mindfulness sleep program for pregnant women.
METHODS: Secondary analysis of a proof-of-concept trial of 12 pregnant women with DSM-5 insomnia disorder who were treated with Perinatal Understanding of Mindful Awareness for Sleep (PUMAS), which places behavioral sleep strategies within a mindfulness framework. Data were collected across eight weekly assessments: pretreatment, six sessions, and posttreatment. Measures included the Insomnia Severity Index (ISI), Edinburgh Postnatal Depression Scale (EPDS), the Pre-Sleep Arousal Scale's cognitive factor (PSASC), and the Glasgow Sleep Effort Scale (GSES). We used linear mixed modeling to test cognitive arousal and sleep effort as concurrent and prospective predictors of insomnia and depression.
RESULTS: Most patients reported high cognitive arousal before PUMAS (75.0%), which decreased to 8.3% after treatment. All insomnia remitters reported low cognitive arousal after treatment, whereas half of nonremitters continued reporting high cognitive arousal. Both nocturnal cognitive arousal and sleep effort were associated with same-week changes in insomnia throughout treatment, and sleep effort yielded a prospective effect on insomnia. Lower levels of nocturnal cognitive arousal and sleep effort prospectively predicted reductions in depression.
CONCLUSIONS: The present study offers preliminary evidence that reducing sleep effort and nocturnal cognitive arousal may serve as key mechanisms for alleviating insomnia and depression via mindfulness-based insomnia therapy. ClinicalTrials.gov ID: NCT04443959
Non-detection of Chlamydia species in carotid atheroma using generic primers by nested PCR in a population with a high prevalence of Chlamydia pneumoniae antibody
BACKGROUND: The association of Chlamydia pneumoniae with atherosclerosis is controversial. We investigated the presence of C. pneumoniae and other Chlamydia spp. in atheromatous carotid artery tissue. METHODS: Forty elective carotid endarterectomy patients were recruited (27 males, mean age 65 years; 13 females, mean age 68 years); 4 had bilateral carotid endarterectomies (n = 44 endarterectomy specimens). Control specimens were taken from macroscopically normal carotid artery adjacent to the atheromatous lesions (internal controls), except in 8 cases where normal carotid arteries from post mortem (external controls) were used. Three case-control pairs were excluded when the HLA DRB gene failed to amplify from the DNA. Genus-specific primers to the major outer membrane protein (MOMP) gene were used in a nested polymerase chain reaction (nPCR) in 41 atheromatous carotid specimens and paired controls. PCR inhibition was monitored by spiking with target C. trachomatis. Atheroma severity was graded histologically. Plasma samples were tested by microimmunofluorescence (MIF) for antibodies to C. pneumoniae, C. trachomatis and C. psittaci, and the corresponding white cells were tested for Chlamydia spp. by nPCR. RESULTS: C. pneumoniae was not detected in any carotid specimen. Twenty-five of 38 (66%) plasma specimens were positive for C. pneumoniae IgG, 2/38 (5%) for C. trachomatis IgG and 1/38 (3%) for C. psittaci IgG. CONCLUSIONS: We were unable to show an association between the presence of Chlamydia spp. and atheroma in carotid arteries in the presence of a high seroprevalence of C. pneumoniae antibodies in Northern Ireland
Remodeling the Proteostasis Network to Rescue Glucocerebrosidase Variants by Inhibiting ER-Associated Degradation and Enhancing ER Folding
Gaucher’s disease (GD) is characterized by loss of lysosomal glucocerebrosidase (GC) activity. Mutations in the gene encoding GC destabilize the protein’s native fold, leading to ER-associated degradation (ERAD) of the misfolded enzyme. Enhancing the cellular folding capacity by remodeling the proteostasis network promotes native folding and lysosomal activity of mutated GC variants. However, proteostasis modulators reported so far, including ERAD inhibitors, trigger cellular stress and lead to induction of apoptosis. We show herein that lacidipine, an L-type Ca2+ channel blocker that also inhibits ryanodine receptors on the ER membrane, enhances folding, trafficking and lysosomal activity of the most severely destabilized GC variant via ERAD inhibition in fibroblasts derived from patients with GD. Interestingly, reprogramming the proteostasis network by combining modulation of Ca2+ homeostasis with ERAD inhibition remodels the unfolded protein response and dramatically lowers the induction of apoptosis typically associated with ERAD inhibition
A network meta-analysis of 12,116 individuals from randomized controlled trials in the treatment of depression after acute coronary syndrome
Background:
Post-acute coronary syndrome (ACS) depression is a common but poorly understood complication in ACS patients, and research on the effectiveness of the available therapies remains limited. We therefore conducted a network meta-analysis to assess the efficacy of different interventions for post-ACS depression in improving patient outcomes.
Methods and findings:
Three electronic databases were searched for randomised controlled trials describing different depression treatment modalities in post-ACS patients. Each article was screened against the inclusion criteria and relevant data were extracted. A bivariate analysis and a network meta-analysis were performed using risk ratios (RR) for binary outcomes and standardized mean differences (SMD) for continuous outcomes.
A total of 30 articles were included in our analysis. Compared to standard care, psychosocial therapy was associated with the greatest reduction in depression scores (SMD: -1.21, 95% CI: -1.81 to -0.61, p<0.001), followed by cognitive behavioural therapy (CBT) (SMD: -0.75, 95% CI: -0.99 to -0.52, p<0.001), antidepressants (SMD: -0.73, 95% CI: -1.14 to -0.31, p<0.001), and lastly, combination therapy (SMD: -0.15, 95% CI: -0.28 to -0.03, p = 0.016). No treatment modality was found to be more effective than any other when compared head to head. Additional analysis showed that these treatment modalities had no significant impact on overall mortality, cardiac mortality or recurrent myocardial infarction.
Conclusion:
This network meta-analysis found that the treatment effects of the various psychological modalities on depression severity were similar. Future trials on psychological interventions assessing clinical outcomes and improvement in adherence to ACS-specific interventions are needed
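The standardized mean difference used above to pool continuous depression outcomes divides the between-group difference in mean scores by the pooled standard deviation (Cohen's d). A minimal sketch with made-up arm summaries, not trial data:

```python
import math

def standardized_mean_difference(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
    """Cohen's d: difference in means over the pooled standard deviation.

    Negative values favour the treatment arm when lower scores mean
    less severe depression.
    """
    pooled_sd = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2)
                          / (n_t + n_c - 2))
    return (mean_t - mean_c) / pooled_sd

# Illustrative (invented) arm summaries: treatment mean 10, control
# mean 13, both SD 4, 50 patients per arm.
d = standardized_mean_difference(10.0, 4.0, 50, 13.0, 4.0, 50)
# d == -0.75: the treatment arm scores 0.75 pooled SDs lower
```

Expressing each trial's effect on this unit-free scale is what lets a network meta-analysis combine studies that used different depression rating instruments.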
Rapid Effects of Hearing Song on Catecholaminergic Activity in the Songbird Auditory Pathway
Catecholaminergic (CA) neurons innervate sensory areas and affect the processing of sensory signals. For example, in birds, CA fibers innervate the auditory pathway at each level, including the midbrain, thalamus, and forebrain. We have shown previously that in female European starlings, CA activity in the auditory forebrain can be enhanced by exposure to attractive male song for one week. It is not known, however, whether hearing song can initiate that activity more rapidly. Here, we exposed estrogen-primed, female white-throated sparrows to conspecific male song and looked for evidence of rapid synthesis of catecholamines in auditory areas. In one hemisphere of the brain, we used immunohistochemistry to detect the phosphorylation of tyrosine hydroxylase (TH), a rate-limiting enzyme in the CA synthetic pathway. We found that immunoreactivity for TH phosphorylated at serine 40 increased dramatically in the auditory forebrain, but not the auditory thalamus and midbrain, after 15 min of song exposure. In the other hemisphere, we used high pressure liquid chromatography to measure catecholamines and their metabolites. We found that two dopamine metabolites, dihydroxyphenylacetic acid and homovanillic acid, increased in the auditory forebrain but not the auditory midbrain after 30 min of exposure to conspecific song. Our results are consistent with the hypothesis that exposure to a behaviorally relevant auditory stimulus rapidly induces CA activity, which may play a role in auditory responses
Impact of primary kidney disease on the effects of empagliflozin in patients with chronic kidney disease: secondary analyses of the EMPA-KIDNEY trial
Background: The EMPA-KIDNEY trial showed that empagliflozin reduced the risk of the primary composite outcome of kidney disease progression or cardiovascular death in patients with chronic kidney disease, mainly through slowing progression. We aimed to assess how the effects of empagliflozin might differ by primary kidney disease across its broad population. Methods: EMPA-KIDNEY, a randomised, controlled, phase 3 trial, was conducted at 241 centres in eight countries (Canada, China, Germany, Italy, Japan, Malaysia, the UK, and the USA). Patients were eligible if their estimated glomerular filtration rate (eGFR) was 20 to less than 45 mL/min per 1·73 m2, or 45 to less than 90 mL/min per 1·73 m2 with a urinary albumin-to-creatinine ratio (uACR) of 200 mg/g or higher at screening. They were randomly assigned (1:1) to 10 mg oral empagliflozin once daily or matching placebo. Effects on kidney disease progression (defined as a sustained ≥40% eGFR decline from randomisation, end-stage kidney disease, a sustained eGFR below 10 mL/min per 1·73 m2, or death from kidney failure) were assessed using prespecified Cox models, and eGFR slope analyses used shared parameter models. Subgroup comparisons were performed by including relevant interaction terms in models. EMPA-KIDNEY is registered with ClinicalTrials.gov, NCT03594110. Findings: Between May 15, 2019, and April 16, 2021, 6609 participants were randomly assigned and followed up for a median of 2·0 years (IQR 1·5–2·4). Prespecified subgroupings by primary kidney disease included 2057 (31·1%) participants with diabetic kidney disease, 1669 (25·3%) with glomerular disease, 1445 (21·9%) with hypertensive or renovascular disease, and 1438 (21·8%) with other or unknown causes.
Kidney disease progression occurred in 384 (11·6%) of 3304 patients in the empagliflozin group and 504 (15·2%) of 3305 patients in the placebo group (hazard ratio 0·71 [95% CI 0·62–0·81]), with no evidence that the relative effect size varied significantly by primary kidney disease (p for heterogeneity=0·62). The between-group difference in chronic eGFR slopes (ie, from 2 months to final follow-up) was 1·37 mL/min per 1·73 m2 per year (95% CI 1·16–1·59), representing a 50% (42–58) reduction in the rate of chronic eGFR decline. This relative effect of empagliflozin on chronic eGFR slope was similar in analyses by different primary kidney diseases, including in explorations by type of glomerular disease and diabetes (p values for heterogeneity all >0·1). Interpretation: In a broad range of patients with chronic kidney disease at risk of progression, including a wide range of non-diabetic causes of chronic kidney disease, empagliflozin reduced the risk of kidney disease progression. Relative effect sizes were broadly similar irrespective of the cause of primary kidney disease, suggesting that SGLT2 inhibitors should be part of standard care to minimise the risk of kidney failure in chronic kidney disease. Funding: Boehringer Ingelheim, Eli Lilly, and UK Medical Research Council
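The reported slope figures are internally consistent: a between-group difference of 1·37 mL/min per 1·73 m² per year that amounts to a 50% relative reduction implies a placebo-arm chronic decline of about 2·74 per year. A quick arithmetic check, using only the numbers reported in the abstract:

```python
# Reported between-group difference in chronic eGFR slope and the
# relative reduction it represents (taken from the abstract above).
slope_difference = 1.37   # mL/min per 1.73 m^2 per year
relative_reduction = 0.50

# Implied annual chronic eGFR decline in the placebo arm:
placebo_decline = slope_difference / relative_reduction        # 2.74
# Implied decline on empagliflozin (placebo decline minus benefit):
empagliflozin_decline = placebo_decline - slope_difference     # 1.37
```

In other words, empagliflozin roughly halved the annual rate of chronic eGFR loss in this population.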
Intraperitoneal drain placement and outcomes after elective colorectal surgery: international matched, prospective, cohort study
Despite current guidelines, intraperitoneal drain placement after elective colorectal surgery remains widespread. Drains were not associated with earlier detection of intraperitoneal collections, but were associated with prolonged hospital stay and an increased risk of surgical-site infections. Background: Many surgeons routinely place intraperitoneal drains after elective colorectal surgery. However, enhanced recovery after surgery guidelines recommend against their routine use owing to a lack of clear clinical benefit. This study aimed to describe international variation in intraperitoneal drain placement and the safety of this practice. Methods: COMPASS (COMPlicAted intra-abdominal collectionS after colorectal Surgery) was a prospective, international, cohort study which enrolled consecutive adults undergoing elective colorectal surgery (February to March 2020). The primary outcome was the rate of intraperitoneal drain placement. Secondary outcomes included: rate and time to diagnosis of postoperative intraperitoneal collections; rate of surgical-site infections (SSIs); time to discharge; and 30-day major postoperative complications (Clavien-Dindo grade at least III). After propensity score matching, multivariable logistic regression and Cox proportional hazards regression were used to estimate the independent association of the secondary outcomes with drain placement. Results: Overall, 1805 patients from 22 countries were included (798 women, 44.2 per cent; median age 67.0 years). The drain insertion rate was 51.9 per cent (937 patients). After matching, drains were not associated with reduced rates (odds ratio (OR) 1.33, 95 per cent c.i. 0.79 to 2.23; P = 0.287) or earlier detection (hazard ratio (HR) 0.87, 0.33 to 2.31; P = 0.780) of collections.
Although not associated with worse major postoperative complications (OR 1.09, 0.68 to 1.75; P = 0.709), drains were associated with delayed hospital discharge (HR 0.58, 0.52 to 0.66; P < 0.001) and an increased risk of SSIs (OR 2.47, 1.50 to 4.05; P < 0.001). Conclusion: Intraperitoneal drain placement after elective colorectal surgery is not associated with earlier detection of postoperative collections, but prolongs hospital stay and increases SSI risk
Prognostic model to predict postoperative acute kidney injury in patients undergoing major gastrointestinal surgery based on a national prospective observational cohort study.
Background: Acute illness, existing co-morbidities and the surgical stress response can all contribute to postoperative acute kidney injury (AKI) in patients undergoing major gastrointestinal surgery. The aim of this study was to prospectively develop a pragmatic prognostic model to stratify patients according to their risk of developing AKI after major gastrointestinal surgery. Methods: This prospective multicentre cohort study included consecutive adults undergoing elective or emergency gastrointestinal resection, liver resection or stoma reversal in 2-week blocks over a continuous 3-month period. The primary outcome was the rate of AKI within 7 days of surgery. Bootstrap stability was used to select clinically plausible risk factors for inclusion in the model. Internal validation was carried out by bootstrapping. Results: A total of 4544 patients were included across 173 centres in the UK and Ireland. The overall rate of AKI was 14·2 per cent (646 of 4544) and the 30-day mortality rate was 1·8 per cent (84 of 4544). Stage 1 AKI was significantly associated with 30-day mortality (unadjusted odds ratio 7·61, 95 per cent c.i. 4·49 to 12·90; P < 0·001), with increasing odds of death with each AKI stage. Six variables were selected for inclusion in the prognostic model: age, sex, ASA grade, preoperative estimated glomerular filtration rate, planned open surgery and preoperative use of either an angiotensin-converting enzyme inhibitor or an angiotensin receptor blocker. Internal validation demonstrated good model discrimination (c-statistic 0·65). Discussion: Following major gastrointestinal surgery, AKI occurred in one in seven patients. This preoperative prognostic model identified patients at high risk of postoperative AKI. Validation in an independent data set is required to ensure generalizability
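A prognostic model of this kind typically combines the selected variables in a logistic regression and converts the linear predictor into a probability. The sketch below is purely illustrative: the coefficients, variable codings and cut-offs are invented for demonstration and are not the study's fitted values.

```python
import math

# Hypothetical six-variable logistic prognostic model of the kind
# described in the study; every coefficient below is invented.
COEFFICIENTS = {
    "intercept": -3.0,
    "age_per_decade": 0.20,     # per decade of age
    "male_sex": 0.30,           # 1 if male, 0 if female
    "asa_grade_ge3": 0.50,      # 1 if ASA grade >= III (assumed coding)
    "egfr_lt60": 0.60,          # 1 if preoperative eGFR < 60 (assumed coding)
    "open_surgery": 0.40,       # 1 if planned open surgery
    "acei_or_arb": 0.35,        # 1 if on an ACE inhibitor or ARB
}

def aki_risk(age_decades, male, asa_ge3, egfr_lt60, open_surg, acei_arb):
    """Predicted probability of postoperative AKI from the logistic model."""
    lp = (COEFFICIENTS["intercept"]
          + COEFFICIENTS["age_per_decade"] * age_decades
          + COEFFICIENTS["male_sex"] * male
          + COEFFICIENTS["asa_grade_ge3"] * asa_ge3
          + COEFFICIENTS["egfr_lt60"] * egfr_lt60
          + COEFFICIENTS["open_surgery"] * open_surg
          + COEFFICIENTS["acei_or_arb"] * acei_arb)
    return 1.0 / (1.0 + math.exp(-lp))
```

Each additional risk factor raises the linear predictor and hence the predicted probability, which is how such a model stratifies patients into risk groups before surgery.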
Global patient outcomes after elective surgery: prospective cohort study in 27 low-, middle- and high-income countries.
BACKGROUND: As global initiatives increase patient access to surgical treatments, there remains a need to understand the adverse effects of surgery and define appropriate levels of perioperative care. METHODS: We designed a prospective international 7-day cohort study of outcomes following elective adult inpatient surgery in 27 countries. The primary outcome was in-hospital complications. Secondary outcomes were death following a complication (failure to rescue) and death in hospital. Process measures were admission to critical care immediately after surgery or to treat a complication, and duration of hospital stay. A single definition of critical care was used for all countries. RESULTS: A total of 474 hospitals in 19 high-income countries, 7 middle-income countries and 1 low-income country were included in the primary analysis. Data included 44 814 patients with a median hospital stay of 4 (range 2-7) days. A total of 7508 patients (16.8%) developed one or more postoperative complications and 207 died (0.5%). The overall mortality among patients who developed complications was 2.8%. Mortality following complications ranged from 2.4% for pulmonary embolism to 43.9% for cardiac arrest. A total of 4360 (9.7%) patients were admitted to a critical care unit as routine immediately after surgery, of whom 2198 (50.4%) developed a complication, with 105 (2.4%) deaths. A total of 1233 patients (16.4%) were admitted to a critical care unit to treat complications, with 119 (9.7%) deaths. Despite lower baseline risk, outcomes were similar in low- and middle-income countries compared with high-income countries. CONCLUSIONS: Poor patient outcomes are common after inpatient surgery. Global initiatives to increase access to surgical treatments should also address the need for safe perioperative care. STUDY REGISTRATION: ISRCTN5181700