58 research outputs found
Replicability and Generalizability of Posttraumatic Stress Disorder (PTSD) Networks: A Cross-Cultural Multisite Study of PTSD Symptoms in Four Trauma Patient Samples
The growing literature conceptualizing mental disorders like posttraumatic stress disorder (PTSD) as networks of interacting symptoms faces three key challenges. Prior studies predominantly used (a) small samples with low power for precise estimation, (b) nonclinical samples, and (c) single samples. This renders network structures in clinical data, and the extent to which networks replicate across data sets, unknown. To overcome these limitations, the present cross-cultural multisite study estimated regularized partial correlation networks of 16 PTSD symptoms across four data sets of traumatized patients receiving treatment for PTSD (total N = 2,782). Despite differences in culture, trauma type, and severity across the samples, considerable similarities emerged, with moderate to high correlations between symptom profiles (0.43-0.82), network structures (0.62-0.74), and centrality estimates (0.63-0.75). We discuss the importance of future replicability efforts to improve clinical psychological science and provide code, model output, and correlation matrices to make the results of this article fully reproducible.
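The authors state that their own code accompanies the article; the sketch below is not that code, only a minimal illustration of how a regularized partial correlation network can be estimated with the graphical lasso. The data are synthetic and all dimensions and names are placeholders.

```python
import numpy as np
from sklearn.covariance import GraphicalLassoCV

rng = np.random.default_rng(0)
# Placeholder data: 500 patients x 16 symptom severity scores.
# The study used four clinical samples; this synthetic matrix only
# illustrates the estimation step, not the study's pipeline.
X = rng.normal(size=(500, 16))

model = GraphicalLassoCV().fit(X)   # L1-penalized inverse covariance
precision = model.precision_

# Convert the precision matrix to partial correlations (network edges):
# rho_ij = -p_ij / sqrt(p_ii * p_jj).
d = np.sqrt(np.diag(precision))
partial_corr = -precision / np.outer(d, d)
np.fill_diagonal(partial_corr, 0.0)

# Strength centrality: sum of absolute edge weights per symptom.
strength = np.abs(partial_corr).sum(axis=0)
print(strength.round(2))
```

Replicability across data sets can then be checked, for example, by correlating the edge weights of networks estimated on different samples, which is one way to arrive at similarity coefficients like those reported above.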
Zebrafish Kidney Phagocytes Utilize Macropinocytosis and Ca2+-Dependent Endocytic Mechanisms
Background: The innate immune response constitutes the first line of defense against invading pathogens and consists of a variety of immune defense mechanisms, including active endocytosis by macrophages and granulocytes. Endocytosis can be used as a reliable measure of selective and non-selective mechanisms of antigen uptake in the early phase of an immune response. Numerous assays have been developed to measure this response in a variety of mammalian and fish species. The small size of the zebrafish, however, has prevented the large-scale collection of monocytes/macrophages and granulocytes for these endocytic assays. Methodology/Principal Findings: Pooled zebrafish kidney hematopoietic tissues were used as a source of phagocytic cells for flow cytometry-based endocytic assays. FITC-Dextran, Lucifer Yellow, and FITC-Edwardsiella ictaluri were used to evaluate selective and non-selective mechanisms of uptake in zebrafish phagocytes. Conclusions/Significance: Zebrafish kidney phagocytes characterized as monocytes/macrophages, neutrophils, and lymphocytes utilize macropinocytosis and Ca2+-dependent endocytic mechanisms of antigen uptake. These cells do not appear to utilize a mannose receptor. Heat-killed Edwardsiella ictaluri induces cytoskeletal interactions for internalization in zebrafish kidney monocytes/macrophages and granulocytes. The proposed method is easy to implement and should prove especially useful in immunological, toxicological, and epidemiological research.
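As a loose illustration of how such a flow-cytometry uptake assay might be quantified (not the authors' protocol), the sketch below gates synthetic FITC intensities against an unstained control and subtracts a low-temperature background, since endocytosis is largely arrested in the cold. The temperatures, gate, and all values are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Placeholder flow-cytometry readouts: log-normal FITC intensities for
# an unstained control, cells incubated with FITC-dextran at 4 C
# (endocytosis arrested), and cells at an assumed assay temperature.
control = rng.lognormal(mean=2.0, sigma=0.5, size=10_000)
cold_4c = rng.lognormal(mean=2.2, sigma=0.5, size=10_000)
warm = rng.lognormal(mean=3.5, sigma=0.6, size=10_000)

# Gate: 99th percentile of the unstained control defines FITC-positive.
threshold = np.percentile(control, 99)

def pct_positive(x: np.ndarray) -> float:
    return 100.0 * float(np.mean(x > threshold))

# Active (energy-dependent) uptake = warm signal minus cold background.
print(f"FITC+ at 4 C:  {pct_positive(cold_4c):.1f}%")
print(f"FITC+ warm:    {pct_positive(warm):.1f}%")
print(f"active uptake: {pct_positive(warm) - pct_positive(cold_4c):.1f} points")
```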
Association between loop diuretic dose changes and outcomes in chronic heart failure: observations from the ESC-EORP Heart Failure Long-Term Registry
Aims: Guidelines recommend down-titration of loop diuretics (LD) once euvolaemia is achieved. In outpatients with heart failure (HF), we investigated LD dose changes in daily cardiology practice, agreement with guideline recommendations, predictors of successful LD down-titration, and the association between dose changes and outcomes. Methods and results: We included 8130 HF patients from the ESC-EORP Heart Failure Long-Term Registry. Among patients who had the dose decreased, a successful decrease was defined as a decrease not followed by death, HF hospitalization, New York Heart Association class deterioration, or a subsequent increase in LD dose. Mean age was 66±13 years; 71% were men; 62% had HF with reduced ejection fraction, 19% HF with mid-range ejection fraction, and 19% HF with preserved ejection fraction. Median [interquartile range (IQR)] LD dose was 40 (25–80) mg. LD dose was increased in 16%, decreased in 8.3%, and unchanged in 76%. Median (IQR) follow-up was 372 (363–419) days. Diuretic dose increase (vs. no change) was associated with HF death [hazard ratio (HR) 1.53, 95% confidence interval (CI) 1.12–2.08; P = 0.008] and nominally with cardiovascular death (HR 1.25, 95% CI 0.96–1.63; P = 0.103). Decrease of diuretic dose (vs. no change) was associated with nominally lower HF mortality (HR 0.59, 95% CI 0.33–1.07; P = 0.083) and cardiovascular mortality (HR 0.62, 95% CI 0.38–1.00; P = 0.052). Among patients who had the LD dose decreased, higher systolic blood pressure [odds ratio (OR) 1.11 per 10 mmHg increase, 95% CI 1.01–1.22; P = 0.032] and absence of (i) sleep apnoea (OR 0.24, 95% CI 0.09–0.69; P = 0.008), (ii) peripheral congestion (OR 0.48, 95% CI 0.29–0.80; P = 0.005), and (iii) moderate/severe mitral regurgitation (OR 0.57, 95% CI 0.37–0.87; P = 0.008) were independently associated with successful decrease. Conclusion: Diuretic dose was unchanged in 76% and decreased in 8.3% of outpatients with chronic HF. LD dose increase was associated with worse outcomes, while the LD dose decrease group showed a trend towards better outcomes compared with the no-change group. Higher systolic blood pressure and absence of (i) sleep apnoea, (ii) peripheral congestion, and (iii) moderate/severe mitral regurgitation were independently associated with successful dose decrease.
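The registry analysis itself is not reproduced here, but hazard ratios of the kind reported above are typically obtained from a Cox proportional hazards model. Below is a minimal sketch with invented data and column names, coding dose change against a "no change" reference group.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 1000

# Invented example data: follow-up time (days), death indicator, and
# LD dose change indicators relative to the "no change" group.
df = pd.DataFrame({
    "time_days": rng.exponential(scale=400, size=n).clip(1, 420),
    "died": rng.integers(0, 2, size=n),
    "dose_increased": rng.integers(0, 2, size=n),
    "dose_decreased": rng.integers(0, 2, size=n),
})
# Keep the two indicators mutually exclusive, as in a 3-level exposure.
df.loc[df.dose_increased == 1, "dose_decreased"] = 0

cph = CoxPHFitter()
cph.fit(df, duration_col="time_days", event_col="died")
# exp(coef) gives hazard ratios with 95% confidence intervals.
print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])
```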
Sex- and age-related differences in the management and outcomes of chronic heart failure: an analysis of patients from the ESC HFA EORP Heart Failure Long-Term Registry
Aims: This study aimed to assess age- and sex-related differences in management and 1-year risk for all-cause mortality and hospitalization in chronic heart failure (HF) patients. Methods and results: Of 16 354 patients included in the European Society of Cardiology Heart Failure Long-Term Registry, 9428 chronic HF patients were analysed [median age: 66 years; 28.5% women; mean left ventricular ejection fraction (LVEF) 37%]. Rates of use of guideline-directed medical therapy (GDMT) were high (angiotensin-converting enzyme inhibitors/angiotensin receptor blockers, beta-blockers and mineralocorticoid receptor antagonists: 85.7%, 88.7% and 58.8%, respectively). Crude GDMT utilization rates were lower in women than in men (all differences: P ≤ 0.001), and GDMT use became lower with ageing in both sexes, at baseline and at 1-year follow-up. Sex was not an independent predictor of GDMT prescription; however, age >75 years was a significant predictor of GDMT underutilization. Rates of all-cause mortality were lower in women than in men (7.1% vs. 8.7%; P = 0.015), as were rates of all-cause hospitalization (21.9% vs. 27.3%; P < 0.001). […] >75 years. Conclusions: There was a decline in GDMT use with advanced age in both sexes. Sex was not an independent predictor of GDMT or adverse outcomes. However, age >75 years independently predicted lower GDMT use and higher all-cause mortality in patients with LVEF ≤45%.
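Predictor analyses like the one summarized above (e.g. age >75 years predicting GDMT underutilization) are usually multivariable logistic regressions. A minimal sketch with invented data and variable names, not the registry's actual model:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 2000

# Invented example data: whether guideline-directed medical therapy
# (GDMT) was prescribed, with age and sex as candidate predictors.
df = pd.DataFrame({
    "gdmt": rng.integers(0, 2, size=n),
    "age_over_75": rng.integers(0, 2, size=n),
    "female": rng.integers(0, 2, size=n),
})

# Multivariable logistic regression; exponentiated coefficients are
# odds ratios analogous to those reported in such registry analyses.
fit = smf.logit("gdmt ~ age_over_75 + female", data=df).fit(disp=0)
print(np.exp(fit.params))        # odds ratios
print(np.exp(fit.conf_int()))    # 95% confidence intervals
```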
Spring Wheat Leaf Appearance and Temperature: Extending the Paradigm?
Extensive research shows temperature to be the primary environmental factor controlling the phyllochron, or rate of leaf appearance, of wheat (Triticum aestivum L.). Experimental results suggest that soil temperature at crown depth, rather than air temperature above the canopy, would better predict wheat leaf appearance rates. To test this hypothesis, leaf appearance in spring wheat ('Nordic') was measured in a 2-year field experiment (Nunn clay loam soil; fine, smectitic, mesic Aridic Argiustoll) with three planting dates and two soil temperature treatments. One temperature treatment (denoted +3C) consisted of heating the soil at crown depth to 3 °C above the ambient soil temperature (denoted +0C). Main stem cumulative leaf number was measured at least weekly until flag leaf emergence. Leaf appearance was essentially linear with both air and soil growing degree-days (GDD), although there was a stronger linear relationship with soil GDD in the +0C plants than in the +3C plants. A weak positive relationship between planting date and the phyllochron was observed. Unexpectedly, we found that heating the soil did not increase the rate of leaf appearance, as the paradigm would predict. To explain these results, we propose extending the paradigm in two ways. First, three processes are involved in leaf appearance: (1) cell division at the shoot apex forms the primordium; (2) cell division in the intercalary meristem forms the cells that then (3) expand to produce the leaf. Cell division is predominantly controlled by temperature, but cell expansion is considerably more affected by factors other than temperature, explaining the influence of other factors on the phyllochron. Second, the two meristems and the region of cell expansion are distributed vertically over a significant distance, across which temperature varies considerably, so temperature at a single point (e.g. crown depth) does not account for the entire temperature regime under which leaves are developing.
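A minimal sketch of the GDD bookkeeping behind such an analysis, assuming a base temperature of 0 °C, synthetic daily temperatures, and an arbitrary ~95 GDD-per-leaf relationship (none of these are the study's actual values):

```python
import numpy as np

def daily_gdd(t_max: float, t_min: float, t_base: float = 0.0) -> float:
    """Growing degree-days for one day: mean temperature above a base."""
    return max((t_max + t_min) / 2.0 - t_base, 0.0)

rng = np.random.default_rng(4)
days = 60
t_min = rng.normal(5, 2, size=days)
t_max = t_min + rng.normal(12, 2, size=days)

cum_gdd = np.cumsum([daily_gdd(hi, lo) for hi, lo in zip(t_max, t_min)])

# If leaf appearance is linear in GDD, the slope of leaf number against
# cumulative GDD is the appearance rate; its reciprocal is the phyllochron.
leaves = cum_gdd / 95.0 + rng.normal(0, 0.1, size=days)  # ~95 GDD per leaf
rate, intercept = np.polyfit(cum_gdd, leaves, 1)
print(f"phyllochron ~ {1.0 / rate:.0f} GDD per leaf")
```

The same fit can be run twice, once against air GDD and once against soil GDD at crown depth, which is how the relative strength of the two relationships described above would be compared.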
Characterization of the highly divergent U2 RNA homolog in the microsporidian Vairimorpha necatrix.
An RNA homologous to U2 RNA and a single-copy gene encoding the RNA homolog have been characterized in the microsporidian Vairimorpha necatrix. The RNA, which is 165 nucleotides in length, possesses significant similarity to U2 RNA, particularly in the 5' half of the molecule. The U2 homolog contains the highly conserved GUAGUA branch point binding sequence seen in all U2 RNAs except those of the trypanosomes. A U2 RNA sequence element implicated in U2:U6 RNA intermolecular pairing is also present in the U2 homolog. The V. necatrix U2 RNA homolog differs at positions previously found to be invariant in U2 RNAs and appears to lack an Sm binding site sequence. The RNA can be folded into a secondary structure possessing three of the four principal stem-loops proposed for the consensus U2 RNA structure. A cis-diol-containing cap structure is present at the 5' end of the U2 homolog. Unlike the cap structures seen in U snRNAs and mRNAs, it is neither 2,2,7-trimethylguanosine, gamma-monomethyl phosphate, nor 7-methylguanosine.
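Locating a conserved element such as the GUAGUA branch point binding sequence in a candidate snRNA is a simple motif scan. A minimal sketch; the example sequence below is an invented U2-like fragment, not the V. necatrix RNA:

```python
def find_motif(rna: str, motif: str = "GUAGUA") -> list[int]:
    """Return 0-based start positions of every occurrence of the motif."""
    hits, start = [], rna.find(motif)
    while start != -1:
        hits.append(start)
        start = rna.find(motif, start + 1)
    return hits

# Invented U2-like fragment for illustration only.
example_u2 = "AUCGCUUCUCGGCCUUUUGGCUAAGAUCAAGUGUAGUAUCUGUUCUUAUCAGUUUAAU"
print(find_motif(example_u2))  # positions of GUAGUA, if present
```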