PlanHab Study: Consequences of combined normobaric hypoxia and bed rest on adenosine kinetics
Adenosine plays a role in the energy supply of cells and exerts differential, hormone-like functions in circulating cells and various tissues. Its release is strongly regulated by oxygen tension, which makes adenosine and its kinetics interesting to investigate in humans exposed to low-oxygen conditions. Hypoxia and reduced gravity, in particular, are two foreseen living conditions for space exploration scenarios such as manned long-duration space missions or planetary habitats. The PlanHab study simulated microgravity through bed-rest inactivity, combined with normobaric hypoxia, to examine their independent and combined effects on adenosine and its kinetics. Healthy male subjects (n = 14) completed three 21-day interventions, separated by 4 months: hypoxic bed rest (HBR), hypoxic ambulatory confinement (HAMB), and normoxic bed rest (NBR). Our hypothesis of a hypoxia-triggered increase in adenosine was confirmed in HAMB, but unexpectedly also in NBR; the highest adenosine levels, however, were noted following HBR. Furthermore, the percentage of hemolysis was elevated in HBR, whereas endothelial integrity markers stayed low in all three interventions. In summary, these data suggest that neocytolysis accounts for these effects, while we found little evidence for microcirculatory changes.
Enhancing Seismic Calibration Research Through Software Automation and Scientific Information Management
The National Nuclear Security Administration (NNSA) Ground-Based Nuclear Explosion Monitoring Research and Engineering (GNEM R&E) Program at LLNL has made significant progress enhancing the process of deriving seismic calibrations and performing scientific integration, analysis, and information management with software automation tools. Several achievements in schema design, data visualization, synthesis, and analysis were completed this year. Our tool efforts address the problematic issues of very large datasets and varied formats encountered during seismic calibration research. As data volumes have increased, scientific information management issues such as data quality assessment, ontology mapping, and metadata collection that are essential for production and validation of derived calibrations have negatively impacted researchers' ability to produce data products. New information management and analysis tools have resulted in demonstrated gains in the efficiency of producing scientific data products and improved accuracy of derived seismic calibrations. Significant software engineering and development efforts have produced an object-oriented framework that provides database-centric coordination between scientific tools, users, and data. Nearly a half billion parameters, signals, measurements, and metadata entries are all stored in a relational database accessed by an extensive object-oriented multi-technology software framework that includes elements of stored procedures, real-time transactional database triggers and constraints, as well as coupled Java and C++ software libraries to handle the information interchange and validation requirements. Significant resources were applied to schema design to enable recording of processing flow and metadata.
A core capability is the ability to rapidly select and present subsets of related signals and measurements to the researchers for analysis and distillation, both visually (Java GUI client applications) and in batch mode (instantiation of multi-threaded applications on clusters of processors). Development of efficient data exploitation methods has become increasingly important throughout academic and government seismic research communities to address multi-disciplinary, large-scale initiatives. Effective frameworks must also simultaneously provide the researcher with robust measurement and analysis tools that can handle and extract groups of events effectively, and isolate the researcher from the now onerous task of database management and metadata collection necessary for validation and error analysis. Sufficient information management robustness is required to avoid loss of metadata, which would lead to incorrect calibration results in addition to increasing the data management burden. Our specific automation methodology and tools improve the researcher's ability to assemble quality-controlled research products for delivery into the NNSA Knowledge Base (KB). The software and scientific automation tasks also provide the robust foundation upon which synergistic and efficient development of GNEM R&E Program seismic calibration research may be built.
Deletion of PEA-15 in mice is associated with specific impairments of spatial learning abilities
Background: PEA-15 is a phosphoprotein that binds and regulates ERK MAP kinase and RSK2 and is highly expressed throughout the brain. PEA-15 alters c-Fos and CREB-mediated transcription as a result of these interactions. To determine whether PEA-15 contributes to the function of the nervous system, we tested mice lacking PEA-15 in a series of experiments designed to measure learning, sensory/motor function, and stress reactivity. Results: We report that PEA-15 null mice exhibited impaired learning in three distinct spatial tasks, while they exhibited normal fear conditioning, passive avoidance, egocentric navigation, and odor discrimination. PEA-15 null mice also had deficient forepaw strength and, in limited instances, heightened stress reactivity and/or anxiety. However, these non-cognitive variables did not appear to account for the observed spatial learning impairments. The null mice maintained normal weight, pain sensitivity, and coordination when compared to wild-type controls. Conclusion: We found that PEA-15 null mice have spatial learning disabilities similar to those of mice in which ERK or RSK2 function is impaired. We suggest PEA-15 may be an essential regulator of ERK-dependent spatial learning.
A Dopaminergic Gene Cluster in the Prefrontal Cortex Predicts Performance Indicative of General Intelligence in Genetically Heterogeneous Mice
Background: Genetically heterogeneous mice express a trait that is qualitatively and psychometrically analogous to general intelligence in humans, and as in humans, this trait co-varies with the processing efficacy of working memory (including its dependence on selective attention). Dopamine signaling in the prefrontal cortex (PFC) has been established to play a critical role in animals' performance in both working memory and selective attention tasks. Owing to this role of the PFC in the regulation of working memory, here we compared PFC gene expression profiles of 60 genetically diverse CD-1 mice that exhibited a wide range of general learning abilities (i.e., aggregate performance across five diverse learning tasks). Methodology/Principal Findings: Animals' general cognitive abilities were first determined based on their aggregate performance across a battery of five diverse learning tasks. With a procedure designed to minimize false-positive identifications, analysis of gene expression microarrays (comprising ~25,000 genes) identified a small number (~20) of genes that were differentially expressed across animals that exhibited fast and slow aggregate learning abilities. Of these genes, one functional cluster was identified, and this cluster (Darpp-32, Drd1a, and Rgs9) is an established modulator of dopamine signaling. Subsequent quantitative PCR found that expression of these dopaminergic genes plus one vascular gene (Nudt6) was significantly correlated with individual animals' general cognitive performance. Conclusions/Significance: These results indicate that D1-mediated dopamine signaling in the PFC, possibly through it
Regional Seismic Discrimination Optimization With and Without Nuclear Test Data: Western U.S. Examples
The western U.S. has abundant natural seismicity, historic nuclear explosion data, and widespread mine blasts, making it a good testing ground to study the performance of regional source-type discrimination techniques. We have assembled and measured a large set of these events to systematically explore how best to optimize discrimination performance. Nuclear explosions can be discriminated from a background of earthquakes using regional phase (Pn, Pg, Sn, Lg) amplitude measures such as high-frequency P/S ratios. Discrimination performance is improved if the amplitudes can be corrected for source size and path length effects. We show good results are achieved using earthquakes alone to calibrate for these effects with the MDAC technique (Walter and Taylor, 2001). We show significant further improvement is then possible by combining multiple MDAC amplitude ratios using an optimized weighting technique such as Linear Discriminant Analysis (LDA). However, this requires data or models for both earthquakes and explosions. In many areas of the world, regional-distance nuclear explosion data are lacking, but mine blast data are available. Mine explosions are often designed to fracture and/or move rock, giving them different frequency and amplitude behavior than contained chemical shots, which seismically look like nuclear tests. Here we explore discrimination performance differences between explosion types, the possible disparity in the optimization parameters that would be chosen if only chemical explosions were available, and the corresponding effect of that disparity on nuclear explosion discrimination. A variety of additional techniques in the literature also have the potential to improve regional high-frequency P/S discrimination. We explore two of these here: three-component averaging and maximum phase amplitude measures.
Typical discrimination studies use only vertical-component measures, and for some historic regional nuclear records these are all that are available. However, S-waves are often better recorded on the horizontal components, and some studies have shown that using a three-component average, a vertical-P/horizontal-S, or another three-component measure can improve discrimination over using the vertical alone (e.g. Kim et al. 1997; Bowers et al. 2001). Here we compare the performance of vertical and three-component measures on the western U.S. test set. A complication in regional discrimination is the variation in P- and S-wave propagation with region. The dominantly observed regional high-frequency S-wave can vary with path between Sn and Lg in a spatially complex way. Since the relative lack of high-frequency S-waves is the signature of an explosion, failing to account for this could lead to misidentifying an earthquake as an explosion. The regional P phases Pn and Pg vary similarly with path and also with distance, with Pg sometimes being a strong phase at near-regional distances but not far-regional. One way to handle these issues is to correct for all four regional phases but choose the phase with the maximum amplitude. A variation on this strategy is to always use Pn but choose the maximum S phase (e.g. Bottone et al. 2002). Here we compare the discrimination performance of several different (max P)/(max S) measures to vertical, three-component, and multivariate measures. Our preliminary results show that multivariate measures perform much better than single ratios, though transportability of the LDA weights between regions is an issue. Also in our preliminary results, we do not find large discrimination performance improvements with three-component averages and maximum phase amplitude measures compared to using the vertical component alone.
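The idea of combining several corrected amplitude ratios with LDA can be illustrated with a minimal sketch. This is not the authors' MDAC/LDA implementation; it is a two-class Fisher discriminant applied to synthetic log10 P/S ratios (the feature values, class means, and scatter below are invented for illustration only):

```python
import numpy as np

def fisher_lda_weights(x_eq, x_ex):
    """Fisher linear discriminant: find weights w that maximize the
    separation of the projected class means relative to the pooled
    within-class scatter."""
    mu_eq, mu_ex = x_eq.mean(axis=0), x_ex.mean(axis=0)
    # pooled within-class covariance of the two classes
    s_within = np.cov(x_eq, rowvar=False) + np.cov(x_ex, rowvar=False)
    w = np.linalg.solve(s_within, mu_ex - mu_eq)
    return w / np.linalg.norm(w)

rng = np.random.default_rng(0)
# synthetic log10 P/S amplitude ratios in two features (e.g. Pn/Lg and
# Pg/Lg in a high-frequency band); explosions tend toward higher P/S
earthquakes = rng.normal([-0.3, -0.2], 0.15, size=(200, 2))
explosions = rng.normal([0.2, 0.3], 0.15, size=(50, 2))

w = fisher_lda_weights(earthquakes, explosions)
# project each event onto the discriminant axis; threshold at the
# midpoint between the projected class means
thr = 0.5 * ((earthquakes @ w).mean() + (explosions @ w).mean())
is_explosion = lambda x: x @ w > thr
```

The projection collapses the multiple ratios into a single discriminant score, which is why a well-chosen weighting can outperform any single P/S ratio; the transportability caveat noted above corresponds to `w` and `thr` being calibrated to one region's event population.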
Pre-shot simulations of far-field ground motion for the Source Physics Experiment (SPE) Explosions at the Climax Stock, Nevada National Security Site: SPE2
The Source Physics Experiment (SPE) is planning a 1000 kg (TNT equivalent) shot (SPE2) at the Nevada National Security Site (NNSS) in a granite borehole at a depth (canister centroid) of 45 meters. This shot follows an earlier shot of 100 kg in the same borehole at a depth of 60 m. Surrounding the shotpoint is an extensive array of seismic sensors arrayed in 5 radial lines extending out 2 km to the north and east and approximately 10-15 to the south and west. Prior to SPE1, simulations using a finite difference code and a 3D numerical model based on the geologic setting were conducted, which predicted higher amplitudes to the south and east in the alluvium of Yucca Flat, along with significant energy on the transverse components caused by scattering within the 3D volume and some contribution from topographic scattering. Observations from the SPE1 shot largely confirmed these predictions, although the ratio of transverse energy relative to the vertical and radial components was in general larger than predicted. A new set of simulations has been conducted for the upcoming SPE2 shot. These include improvements to the velocity model based on SPE1 observations as well as new capabilities added to the simulation code. The most significant is the addition of a new source model within the finite difference code: predicted ground velocities from a hydrodynamic code (GEODYN) are used as driving conditions on the boundaries of a cube embedded within WPP, providing a more sophisticated source modeling capability linked directly to the source-site materials (e.g. granite) and the type and size of the source. Two sets of SPE2 simulations were conducted, one with a GEODYN source and 3D complex media (no topography, node spacing of 5 m) and one with a standard isotropic pre-defined time function (3D complex media with topography, node spacing of 5 m).
Results were provided as time series at specific points corresponding to sensor locations for both translational (x, y, z) and rotational components. Estimates of spectral scaling for SPE2 are provided using a modified version of the Mueller-Murphy model. An estimate of expected aftershock probabilities was also provided, based on the methodology of Ford and Walter [2010].
Ubiquitous molecular substrates for associative learning and activity-dependent neuronal facilitation.
Recent evidence suggests that many of the molecular cascades and substrates that contribute to learning-related forms of neuronal plasticity may be conserved across ostensibly disparate model systems. Notably, the facilitation of neuronal excitability and synaptic transmission that contributes to associative learning in Aplysia and Hermissenda, as well as associative LTP in hippocampal CA1 cells, all require (or are enhanced by) the convergence of a transient elevation in intracellular Ca2+ with transmitter binding to metabotropic cell-surface receptors. This temporal convergence of Ca2+ and G-protein-stimulated second-messenger cascades synergistically stimulates several classes of serine/threonine protein kinases, which in turn modulate receptor function or cell excitability through the phosphorylation of ion channels. We present a summary of the biophysical and molecular constituents of neuronal and synaptic facilitation in each of these three model systems. Although specific components of the underlying molecular cascades differ across these three systems, fundamental aspects of these cascades are widely conserved, leading to the conclusion that the conceptual resemblance of these superficially disparate systems is far greater than is generally acknowledged. We suggest that the elucidation of mechanistic similarities between different systems will ultimately fulfill the goal of the model systems approach, that is, the description of critical and ubiquitous features of neuronal and synaptic events that contribute to memory induction.
A prospective longitudinal cohort study on risk factors for COVID-19 vaccination failure (RisCoin): methods, procedures and characterization of the cohort