2 research outputs found

    Advances in understanding calcite varve formation: new insights from a dual lake monitoring approach in the southern Baltic lowlands

    We revise the conceptual model of calcite varves and present, for the first time, a dual lake monitoring study of two alkaline lakes that provides new insights into the seasonal sedimentation processes forming these varves. The study lakes, Tiefer See in NE Germany and Czechowskie in N Poland, differ in morphology and bathymetry and are therefore ideal for deciphering local effects on seasonal deposition. The monitoring setup in both lakes is largely identical and includes (i) instrumental observation of meteorological parameters, (ii) chemical profiling of the lake water column including water sampling, and (iii) sediment trapping at both bi-weekly and monthly intervals. We then compare our monitoring data with varve micro-facies in the sediment record. One main finding is that calcite varves form complex laminae triplets rather than the simple couplets commonly assumed. Sedimentation of varve sub-layers in both lakes depends largely on lake mixing dynamics and follows the same seasonality, commencing with diatom blooms in spring, turning into a pulse of calcite precipitation in summer, and terminating with a re-suspension layer in autumn and winter composed of calcite patches, plant fragments and benthic diatoms. Despite the common seasonal cycle, the share of each of these depositional phases in the total annual sediment yield differs between the lakes. In Lake Tiefer See calcite sedimentation has the highest yields, whereas in Lake Czechowskie the so far underestimated re-suspension sub-layer dominates sediment accumulation. Even in undisturbed varved sediments, re-suspended material becomes integrated into the sediment fabric and makes up an important share of calcite varves. Thus, while the biogeochemical lake cycle defines the varves' autochthonous components and micro-facies, the physical setting plays an important role in determining the proportions of the varve sub-layers.
    Funding: Leibniz-Gemeinschaft (http://dx.doi.org/10.13039/501100001664); Narodowe Centrum Nauki (http://dx.doi.org/10.13039/501100004281); Deutsche Forschungsgemeinschaft (http://dx.doi.org/10.13039/501100001659); ICLE
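
    The abstract quantifies each depositional phase as a share of the total annual sediment yield. The sketch below is not the authors' code; it only illustrates that arithmetic in Python, using purely hypothetical trap fluxes and phase names taken from the abstract.

```python
# Minimal sketch (assumed workflow, not the study's code): estimating the share
# of each seasonal depositional phase in the total annual sediment yield from
# sediment-trap data. All flux values are illustrative only.

# Hypothetical dry-mass fluxes (g m^-2) per trapping interval, grouped by the
# depositional phase they are assigned to.
trap_fluxes = {
    "spring_diatom_bloom": [12.0, 18.5, 9.3],
    "summer_calcite_pulse": [35.2, 41.8, 28.7, 22.1],
    "autumn_winter_resuspension": [15.4, 11.9, 8.6, 13.0],
}

# Total annual yield is simply the sum over all intervals and phases.
annual_yield = sum(sum(fluxes) for fluxes in trap_fluxes.values())

# Share of each phase in the annual sediment yield.
for phase, fluxes in trap_fluxes.items():
    share = sum(fluxes) / annual_yield
    print(f"{phase}: {share:.1%} of annual sediment yield")
```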

    The spinal cord injury-induced immune deficiency syndrome: results of the SCIentinel study

    Infections are prevalent after spinal cord injury (SCI), constitute the main cause of death, and are a rehabilitation confounder associated with impaired recovery. We hypothesize that SCI causes an acquired, lesion-dependent (neurogenic) immune suppression as an underlying mechanism facilitating infections. The international prospective multicentre cohort study (SCIentinel; protocol registration DRKS00000122; n = 111 patients) was designed to distinguish neurogenic from general trauma-related effects on the immune system. Therefore, SCI patient groups differing by neurological level, i.e. high SCI [thoracic (Th)4 or higher] versus low SCI (Th5 or lower), and by severity (complete versus incomplete SCI), were compared with a reference group of vertebral fracture (VF) patients without SCI. The primary outcome was quantitative monocytic Human Leukocyte Antigen-DR expression (mHLA-DR, synonym MHC II), a validated marker of immune suppression in critically ill patients associated with infection susceptibility. mHLA-DR was assessed from Day 1 to 10 weeks after injury using standardized flow cytometry procedures. Secondary outcomes were leucocyte subpopulation counts, serum immunoglobulin levels and clinically defined infections. Linear mixed models with multiple imputation were applied to evaluate group differences of logarithmically transformed parameters. Mean quantitative mHLA-DR levels [ln (antibodies/cell)] at the primary end point 84 h after injury indicated an immune suppressive state below the normative value of 9.62 in all groups, which further differed in magnitude by neurological level: high SCI [8.95 (98.3% confidence interval, CI: 8.63; 9.26), n = 41], low SCI [9.05 (98.3% CI: 8.73; 9.36), n = 29], and VF without SCI [9.25 (98.3% CI: 8.97; 9.53), n = 41, P = 0.003]. Post hoc analysis accounting for SCI severity revealed the strongest mHLA-DR decrease [8.79 (95% CI: 8.50; 9.08)] in the complete, high SCI group, which further demonstrated delayed mHLA-DR recovery [9.08 (95% CI: 8.82; 9.38)] and a difference from the VF controls of −0.43 (95% CI: −0.66; −0.20) at 14 days. Complete, high SCI patients also showed consistently lower serum immunoglobulin G [−0.27 (95% CI: −0.45; −0.10)] and immunoglobulin A [−0.25 (95% CI: −0.49; −0.01)] levels [ln (g/l × 1000)] up to 10 weeks after injury. Low mHLA-DR levels in the range of borderline immunoparalysis (below 9.21) were positively associated with the occurrence and earlier onset of infections, consistent with results from studies on stroke or major surgery. Patients with spinal cord injury can acquire a secondary, neurogenic immune deficiency syndrome characterized by reduced mHLA-DR expression and relative hypogammaglobulinaemia (a combined cellular and humoral immune deficiency). mHLA-DR expression provides a basis to stratify infection risk in patients with SCI.
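
    The analysis described above, linear mixed models on log-transformed mHLA-DR with repeated measures per patient, can be sketched as follows. This is an assumed, simplified illustration using statsmodels on simulated data with hypothetical column and group names; the study's multiple imputation step is omitted for brevity, so it should not be read as the SCIentinel analysis code.

```python
# Minimal sketch (assumption, not the study's code): linear mixed model on
# log-transformed mHLA-DR with a random intercept per patient, comparing
# neurological-level groups over repeated assessments.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format data: one row per patient per assessment time point.
rng = np.random.default_rng(0)
n_patients, n_visits = 30, 4
df = pd.DataFrame({
    "patient_id": np.repeat(np.arange(n_patients), n_visits),
    "group": np.repeat(
        rng.choice(["high_SCI", "low_SCI", "VF_control"], n_patients), n_visits
    ),
    "visit": np.tile(np.arange(n_visits), n_patients),
    # Simulated mHLA-DR in antibodies/cell, centred near ln(value) ~ 9.
    "mhla_dr": rng.lognormal(mean=9.0, sigma=0.3, size=n_patients * n_visits),
})
df["ln_mhla_dr"] = np.log(df["mhla_dr"])

# Fixed effects for group, visit and their interaction; random intercept per patient.
model = smf.mixedlm("ln_mhla_dr ~ C(group) * visit", df, groups=df["patient_id"])
result = model.fit()
print(result.summary())
```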