Direct monitoring of active geohazards: emerging geophysical tools for deep-water assessments
Seafloor networks of cables, pipelines, and other infrastructure underpin our daily lives, providing communication links, information, and energy supplies. Despite their global importance, these networks are vulnerable to damage from a number of natural seafloor hazards, including landslides, turbidity currents, fluid flow, and scour. Conventional geophysical techniques, such as high-resolution reflection seismic and side-scan sonar, are commonly employed in geohazard assessments. These conventional tools provide essential information for route planning and design; however, such surveys provide only indirect evidence of past processes and do not observe or measure the geohazard itself. As such, many numerical impact models lack field-scale calibration, and much uncertainty exists about the triggers, nature, and frequency of deep-water geohazards. Recent advances in technology now enable a step change in their understanding through direct monitoring. We outline some emerging monitoring tools and how they can quantify key parameters for deep-water geohazard assessment. Repeat seafloor surveys in dynamic areas show that relying solely on evidence from past deposits can lead to an under-representation of geohazard events. Acoustic Doppler current profiling provides new insights into the structure of turbidity currents, whereas instrumented mobile sensors record, for the first time, the nature of movement at the base of those flows. Existing and bespoke cabled networks enable high-bandwidth, low-power, and distributed measurements of parameters such as strain across large areas of seafloor. These techniques provide valuable new measurements that will improve geohazard assessments and should be deployed in a complementary manner alongside conventional geophysical tools.
Assessing the feasibility and value of employing an ecosystem services approach in chemical environmental risk assessment under the Water Framework Directive
The feasibility and added value of an ecosystem services approach in retrospective environmental risk assessment were evaluated using a site-specific case study in a lowland UK river. The studied water body temporarily failed to achieve good ecological status in 2018, due in part to exceedance of the environmental quality standard (annual average EQS) for zinc. Potential ecosystem service delivery was quantified for locally prioritised ecosystem services: regulation of chemical condition; maintaining nursery populations and habitats; recreational fishing; and nature watching. Quantification was based on observed and expected taxa or functional groups within WFD biological quality elements, including macrophytes, benthic macroinvertebrates, and fish, and on published functional trait data for constituent taxa. Benthic macroinvertebrate taxa were identified and enumerated before, during, and after the zinc EQS exceedance, enabling a generic retrospective risk assessment for this biological quality element, which was found to have good ecosystem service potential. An additional targeted risk assessment for zinc was based on laboratory-derived species sensitivity distributions normalised using biotic-ligand modelling to account for site-specific, bioavailability-corrected zinc exposure. Risk to ecosystem services for diatoms (microalgae) was found to be high, while risks for benthic macroinvertebrates and fish were found to be low. The status of potential ecosystem service delivery (ESD) by fish was equivalent to high ecological status as defined under the WFD, while ESD was higher for benthic macroinvertebrates than defined by WFD methods. The illustrated ecosystem services approach uses readily available data and adds significantly to the taxonomic approach currently used under the WFD by using functional traits to evaluate services that are prioritised as important in water bodies. The main shortcomings of the illustrated approach were the lack of representation of bacteria and fungi, of WFD-predicted species lists for diatoms and macrophytes, and of the site-specific functional trait data required for defining actual (rather than potential) ecosystem service delivery.
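As a point of reference for the targeted risk assessment described above, the following is a minimal sketch of how a species sensitivity distribution (SSD) can be fitted to bioavailability-normalised toxicity data and used to derive a hazard concentration (HC5). All values, species counts, and the exposure concentration are illustrative assumptions, not data from the study, and the log-normal SSD is a common but simplified choice.

```python
# Minimal sketch: fit a log-normal species sensitivity distribution (SSD) to
# bioavailability-normalised zinc effect concentrations and derive the HC5
# (the concentration expected to protect 95% of species). Numbers are illustrative.
import numpy as np
from scipy import stats

# Hypothetical chronic zinc EC50s (ug/L) after biotic-ligand normalisation
# to site-specific water chemistry (pH, hardness, DOC); not data from the paper.
ec50_ugL = np.array([42.0, 55.0, 78.0, 120.0, 150.0, 210.0, 340.0, 480.0])

# Fit a normal distribution to log10-transformed values (i.e. a log-normal SSD).
log_vals = np.log10(ec50_ugL)
mu, sigma = stats.norm.fit(log_vals)

# HC5: 5th percentile of the fitted SSD, back-transformed to ug/L.
hc5 = 10 ** stats.norm.ppf(0.05, loc=mu, scale=sigma)

# Risk characterisation: compare an (illustrative) site-specific bioavailable
# exposure concentration against the HC5.
exposure_ugL = 60.0
print(f"HC5 = {hc5:.1f} ug/L, risk quotient = {exposure_ugL / hc5:.2f}")
```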
Memory enhancing drugs and Alzheimer’s Disease: Enhancing the self or preventing the loss of it?
In this paper we analyse some ethical and philosophical questions related to the development of memory enhancing drugs (MEDs) and anti-dementia drugs. The world of memory enhancement is coloured by utopian thinking and by the desire for quicker, sharper, and more reliable memories. Dementia is characterized by decline, fragility, vulnerability, a loss of the most important cognitive functions, and even a loss of self. While MEDs are being developed for self-improvement, in Alzheimer’s Disease (AD) the self is being lost. Despite this, it is precisely patients with AD and other forms of dementia who provide the subjects for scientific research on memory improvement. Biomedical research in the field of MEDs and anti-dementia drugs appears to provide a strong impetus for rethinking what we mean by ‘memory’, ‘enhancement’, ‘therapy’, and ‘self’. We conclude (1) that the enhancement of memory is still in its infancy, (2) that current MEDs and anti-dementia drugs are at best partially and minimally effective under specific conditions, (3) that ‘memory’ and ‘enhancement’ are ambiguous terms, (4) that there is no clear-cut distinction between enhancement and therapy, and (5) that research into MEDs and anti-dementia drugs encourages a reductionistic view of the human mind and of the self.
The PHENIX Experiment at RHIC
The physics emphases of the PHENIX collaboration and the design and current status of the PHENIX detector are discussed. The plan of the collaboration for making the most effective use of the available luminosity in the first years of RHIC operation is also presented.
Comment: 5 pages, 1 figure. Further details of the PHENIX physics program are available at http://www.rhic.bnl.gov/phenix
Nucleosomes in gene regulation: theoretical approaches
This work reviews current theoretical approaches from biophysics and bioinformatics for describing nucleosome arrangements in chromatin and transcription factor binding to nucleosomally organized DNA. The role of nucleosomes in gene regulation is discussed from molecular-mechanistic and biological points of view. In addition to the classical problems of this field, current questions of epigenetic regulation are discussed. The authors selected for discussion what seem to be the most interesting concepts and hypotheses. Mathematical approaches are described in simplified language to draw attention to the most important directions in this field.
A genome-wide association study of total child psychiatric problems scores
Substantial genetic correlations have been reported across psychiatric disorders and numerous cross-disorder genetic variants have been detected. To identify the genetic variants underlying general psychopathology in childhood, we performed a genome-wide association study using a total psychiatric problem score. We analyzed 6,844,199 common SNPs in 38,418 school-aged children from 20 population-based cohorts participating in the EAGLE consortium. The SNP heritability of total psychiatric problems was 5.4% (SE = 0.01) and two loci reached genome-wide significance: rs10767094 and rs202005905. We also observed an association of SBF2, a gene associated with neuroticism in previous GWAS, with total psychiatric problems. The genetic effects underlying the total score were shared with common psychiatric disorders only (attention-deficit/hyperactivity disorder, anxiety, depression, insomnia) (rG > 0.49), but not with autism or the less common adult disorders (schizophrenia, bipolar disorder, or eating disorders) (rG < 0.29). The results suggest that many common genetic variants are associated with childhood psychiatric symptoms and related phenotypes in general rather than with specific symptoms. Further research is needed to establish causality and pleiotropic mechanisms between related traits.
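To make the analysis concrete, the following is a minimal sketch of the per-SNP association test that underlies a GWAS of a quantitative phenotype such as a total problem score: a linear regression of the phenotype on allele dosage plus covariates, repeated across millions of SNPs. The data, effect sizes, and covariates here are simulated and purely illustrative; they are not from the EAGLE analysis.

```python
# Minimal sketch of a single-SNP GWAS test for a quantitative phenotype:
# phenotype ~ genotype dosage + covariates. Data are simulated.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5000
dosage = rng.binomial(2, 0.3, size=n).astype(float)  # genotype coded 0/1/2
sex = rng.integers(0, 2, size=n).astype(float)       # example covariate
phenotype = 0.05 * dosage + 0.1 * sex + rng.normal(size=n)

# The dosage coefficient and its p-value are the per-SNP effect size and
# association statistic reported in a GWAS; genome-wide significance is
# conventionally declared at p < 5e-8 after testing all SNPs.
X = sm.add_constant(np.column_stack([dosage, sex]))
fit = sm.OLS(phenotype, X).fit()
beta, pval = fit.params[1], fit.pvalues[1]
print(f"beta = {beta:.3f}, p = {pval:.2e}")
```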
Para-infectious brain injury in COVID-19 persists at follow-up despite attenuated cytokine and autoantibody responses
To better understand the neurological complications of COVID-19, both acutely and during recovery, we measured markers of brain injury, inflammatory mediators, and autoantibodies in 203 hospitalised participants: 111 with acute sera (1–11 days post-admission) and 92 with convalescent sera (56 with COVID-19-associated neurological diagnoses). Here we show that, compared to 60 uninfected controls, tTau, GFAP, NfL, and UCH-L1 are increased with COVID-19 infection at acute timepoints, and NfL and GFAP are significantly higher in participants with neurological complications. Inflammatory mediators (IL-6, IL-12p40, HGF, M-CSF, CCL2, and IL-1RA) are associated with both altered consciousness and markers of brain injury. Autoantibodies are more common in COVID-19 than in controls, and some (including those against MYL7, UCH-L1, and GRIN3B) are more frequent with altered consciousness. Additionally, convalescent participants with neurological complications show elevated GFAP and NfL, unrelated to the attenuated systemic inflammatory mediator and autoantibody responses. Overall, neurological complications of COVID-19 are associated with evidence of neuroglial injury in both acute and late disease, and these correlate with dysregulated innate and adaptive immune responses acutely.
Clustering identifies endotypes of traumatic brain injury in an intensive care cohort: a CENTER-TBI study
Background
While the Glasgow coma scale (GCS) is one of the strongest outcome predictors, the current classification of traumatic brain injury (TBI) as ‘mild’, ‘moderate’, or ‘severe’ based on GCS fails to capture the enormous heterogeneity in pathophysiology and treatment response. We hypothesized that data-driven characterization of TBI could identify distinct endotypes and give mechanistic insights.
Methods
We developed an unsupervised statistical clustering model based on a mixture of probabilistic graphs for presentation (< 24 h) demographic, clinical, physiological, laboratory and imaging data to identify subgroups of TBI patients admitted to the intensive care unit in the CENTER-TBI dataset (N = 1,728). A cluster similarity index was used for robust determination of optimal cluster number. Mutual information was used to quantify feature importance and for cluster interpretation.
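As a rough illustration of the workflow described above (and not the study's exact model), the sketch below substitutes a Gaussian mixture for the mixture of probabilistic graphs, uses BIC in place of the cluster similarity index for choosing the number of clusters, and approximates feature importance by the mutual information between each discretised feature and the cluster labels. The data and feature names are simulated assumptions for illustration only.

```python
# Minimal sketch of unsupervised endotype discovery: fit mixture models over a
# range of cluster counts, pick one by an information criterion, then rank
# features by mutual information with the cluster assignment. Data simulated.
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.metrics import mutual_info_score

rng = np.random.default_rng(1)
features = ["GCS", "lactate", "glucose", "pH"]  # illustrative feature names
X = np.vstack([rng.normal(loc, 1.0, size=(300, len(features)))
               for loc in ([3, 2, 1, -1], [8, 1, 0, 0], [14, 0, 0, 0])])

# Choose the number of clusters by BIC over a small range of candidates
# (the paper uses a cluster similarity index instead).
models = {k: GaussianMixture(n_components=k, random_state=0).fit(X)
          for k in range(2, 8)}
best_k = min(models, key=lambda k: models[k].bic(X))
labels = models[best_k].predict(X)

# Rank features by mutual information with the cluster assignment.
for j, name in enumerate(features):
    binned = np.digitize(X[:, j], np.quantile(X[:, j], [0.25, 0.5, 0.75]))
    print(name, round(mutual_info_score(labels, binned), 3))
print("selected number of clusters:", best_k)
```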
Results
Six stable endotypes were identified with distinct GCS and composite systemic metabolic stress profiles, distinguished by GCS, blood lactate, oxygen saturation, serum creatinine, glucose, base excess, pH, arterial partial pressure of carbon dioxide, and body temperature. Notably, a cluster with ‘moderate’ TBI (by traditional classification) and a deranged metabolic profile had a worse outcome than a cluster with ‘severe’ GCS and a normal metabolic profile. Addition of cluster labels significantly improved the prognostic precision of the IMPACT (International Mission for Prognosis and Analysis of Clinical trials in TBI) extended model for prediction of both unfavourable outcome and mortality (both p < 0.001).
Conclusions
Six stable and clinically distinct TBI endotypes were identified by probabilistic unsupervised clustering. In addition to presenting neurology, a profile of biochemical derangement was found to be an important distinguishing feature that was both biologically plausible and associated with outcome. Our work motivates refining current TBI classifications with factors describing metabolic stress. Such data-driven clusters suggest TBI endotypes that merit investigation to identify bespoke treatment strategies to improve care.