Paradoxical signaling regulates structural plasticity in dendritic spines
Transient spine enlargement (3-5 min timescale) is an important event
associated with the structural plasticity of dendritic spines. Many of the
molecular mechanisms associated with transient spine enlargement have been
identified experimentally. Here, we use a systems biology approach to construct
a mathematical model of biochemical signaling and actin-mediated transient
spine expansion in response to calcium influx due to NMDA receptor activation.
We have identified that a key feature of this signaling network is the
paradoxical signaling loop. Paradoxical components act bifunctionally in
signaling networks and their role is to control both the activation and
inhibition of a desired response function (protein activity or spine volume).
Using ordinary differential equation (ODE)-based modeling, we show that the
dynamics of different regulators of transient spine expansion including CaMKII,
RhoA, and Cdc42 and the spine volume can be described using paradoxical
signaling loops. Our model is able to capture the experimentally observed
dynamics of transient spine volume. Furthermore, we show that actin remodeling
events provide robustness to spine volume dynamics. We also generate
experimentally testable predictions about how different components and
parameters of the network affect spine dynamics.
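The paradoxical-loop idea described above can be sketched as a toy ODE system: a step stimulus (standing in for calcium influx) drives both a response and a slower inhibitor of that response, so the response rises transiently and then relaxes even though the stimulus persists. All variable names and rate constants below are illustrative assumptions, not parameters of the published model.

```python
# Minimal sketch of a paradoxical (incoherent feedforward) signaling loop.
# Forward Euler integration; constants are hypothetical.
def simulate(t_end=30.0, dt=0.01):
    u = 1.0          # step stimulus (e.g., calcium influx)
    X = 0.0          # slow inhibitor, activated by the stimulus
    R = 0.0          # response (e.g., activity driving spine expansion)
    k1, k2 = 0.2, 0.05      # inhibitor activation / decay rates
    k3, k4, K = 1.0, 0.5, 0.2  # response gain, response decay, inhibition constant
    trace = []
    t = 0.0
    while t < t_end:
        dX = k1 * u - k2 * X                # inhibitor builds up slowly
        dR = k3 * u / (X + K) - k4 * R      # response is throttled as X grows
        X += dX * dt
        R += dR * dt
        trace.append(R)
        t += dt
    return trace

trace = simulate()
# the response overshoots early and settles lower: a transient pulse
```

Because the same input activates both the response and its inhibitor, the response is shaped as a pulse, which is the qualitative signature of transient spine enlargement.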
High-Performance Modelling and Simulation for Big Data Applications
This open access book was prepared as a Final Publication of the COST Action IC1406 "High-Performance Modelling and Simulation for Big Data Applications (cHiPSet)" project. Long considered important pillars of the scientific method, Modelling and Simulation have evolved from traditional discrete numerical methods to complex data-intensive continuous analytical optimisations. Resolution, scale, and accuracy have become essential to predict and analyse natural and complex systems in science and engineering. When their level of abstraction rises to give a better discernment of the domain at hand, their representation becomes increasingly demanding of computational and data resources. On the other hand, High Performance Computing typically entails the effective use of parallel and distributed processing units coupled with efficient storage, communication and visualisation systems to underpin complex data-intensive applications in distinct scientific and technical domains. It is then arguably required to have a seamless interaction of High Performance Computing with Modelling and Simulation in order to store, compute, analyse, and visualise large data sets in science and engineering. Funded by the European Commission, cHiPSet has provided a dynamic trans-European forum for its members and distinguished guests to openly discuss novel perspectives and topics of interest for these two communities. This cHiPSet compendium presents a set of selected case studies related to healthcare, biological data, computational advertising, multimedia, finance, bioinformatics, and telecommunications.
NASA SBIR abstracts of 1990 phase 1 projects
The research objectives of the 280 projects placed under contract in the National Aeronautics and Space Administration (NASA) 1990 Small Business Innovation Research (SBIR) Phase 1 program are described. The basic document consists of edited, non-proprietary abstracts of the winning proposals submitted by small businesses in response to NASA's 1990 SBIR Phase 1 Program Solicitation. The abstracts are presented under the 15 technical topics within which Phase 1 proposals were solicited. Each project was assigned a sequential identifying number from 001 to 280, in order of its appearance in the body of the report. The document also includes appendixes that provide additional information about the SBIR program and permit cross-referencing of the 1990 Phase 1 projects by company name, location by state, principal investigator, NASA field center responsible for management of each project, and NASA contract number.
A hybrid algorithm for Bayesian network structure learning with application to multi-label learning
We present a novel hybrid algorithm for Bayesian network structure learning,
called H2PC. It first reconstructs the skeleton of a Bayesian network and then
performs a Bayesian-scoring greedy hill-climbing search to orient the edges.
The algorithm is based on divide-and-conquer constraint-based subroutines to
learn the local structure around a target variable. We conduct two series of
experimental comparisons of H2PC against Max-Min Hill-Climbing (MMHC), which is
currently the most powerful state-of-the-art algorithm for Bayesian network
structure learning. First, we use eight well-known Bayesian network benchmarks
with various data sizes to assess the quality of the learned structure returned
by the algorithms. Our extensive experiments show that H2PC outperforms MMHC in
terms of goodness of fit to new data and quality of the network structure with
respect to the true dependence structure of the data. Second, we investigate
H2PC's ability to solve the multi-label learning problem. We provide
theoretical results to characterize and identify graphically the so-called
minimal label powersets that appear as irreducible factors in the joint
distribution under the faithfulness condition. The multi-label learning problem
is then decomposed into a series of multi-class classification problems, where
each multi-class variable encodes a label powerset. H2PC is shown to compare
favorably to MMHC in terms of global classification accuracy over ten
multi-label data sets covering different application domains. Overall, our
experiments support the conclusions that local structural learning with H2PC in
the form of local neighborhood induction is a theoretically well-motivated and
empirically effective learning framework that is well suited to multi-label
learning. The source code (in R) of H2PC as well as all data sets used for the
empirical tests are publicly available.
Comment: arXiv admin note: text overlap with arXiv:1101.5184 by other authors.
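The label-powerset decomposition the abstract describes, where each distinct combination of labels becomes one class of a multi-class problem, can be illustrated with a minimal sketch. The function and variable names here are hypothetical and not taken from the H2PC R code:

```python
# Label-powerset transformation: each distinct label combination
# is mapped to one class id of a multi-class problem.
def to_label_powerset(Y):
    """Y: list of 0/1 label tuples. Returns class ids and the powerset mapping."""
    mapping = {}   # label combination -> class id
    classes = []
    for labels in Y:
        key = tuple(labels)
        if key not in mapping:
            mapping[key] = len(mapping)
        classes.append(mapping[key])
    return classes, mapping

# Four instances over three labels; two share the same label combination.
Y = [(1, 0, 1), (0, 1, 0), (1, 0, 1), (1, 1, 0)]
classes, mapping = to_label_powerset(Y)
# classes == [0, 1, 0, 2]: three distinct label powersets become three classes
```

A multi-class classifier trained on `classes` then predicts entire label combinations at once, which is the reduction H2PC's theoretical results justify under the faithfulness condition.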
The Role of Synaptic Tagging and Capture for Memory Dynamics in Spiking Neural Networks
Memory serves to process and store information about experiences such that this information can be
used in future situations. The transfer from transient storage into long-term memory, which retains
information for hours, days, and even years, is called consolidation. In brains, information is primarily
stored via alteration of synapses, so-called synaptic plasticity. While these changes are at first in a
transient early phase, they can be transferred to a late phase, meaning that they become stabilized
over the course of several hours. This stabilization has been explained by so-called synaptic tagging
and capture (STC) mechanisms. To store and recall memory representations, emergent dynamics arise
from the synaptic structure of recurrent networks of neurons. This happens through so-called cell
assemblies, which feature particularly strong synapses. It has been proposed that the stabilization
of such cell assemblies by STC corresponds to so-called synaptic consolidation, which is observed in
humans and other animals in the first hours after acquiring a new memory. The exact connection
between the physiological mechanisms of STC and memory consolidation remains, however, unclear.
It is equally unknown what influence STC mechanisms exert on further cognitive functions that guide
behavior. On timescales of minutes to hours (that is, the timescales of STC), such functions include
memory improvement, modification of memories, interference and enhancement of similar memories,
and transient priming of certain memories. Thus, diverse memory dynamics may be linked to STC,
which can be investigated by employing theoretical methods based on experimental data from the
neuronal and the behavioral level.
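As a rough illustration of the tagging-and-capture principle described above, the following toy simulation lets an early-phase weight change decay unless a synaptic tag coincides with plasticity-related proteins, in which case part of the change is captured into a persistent late phase. The rate constants are made up for illustration and are not values from the thesis.

```python
# Toy model of synaptic tagging and capture (STC); constants are hypothetical.
def simulate_stc(early0=1.0, tagged=True, proteins=True, t_end=100.0, dt=0.1):
    """Early-phase change decays; capture moves it into the stable late phase."""
    early, late = early0, 0.0
    decay = 0.05    # decay rate of the early-phase change
    capture = 0.02  # capture rate when tag and proteins coincide
    t = 0.0
    while t < t_end:
        transfer = capture * early if (tagged and proteins) else 0.0
        early += (-decay * early - transfer) * dt
        late += transfer * dt   # late phase is stable on this timescale
        t += dt
    return early + late         # total surviving weight change

consolidated = simulate_stc(proteins=True)   # capture occurs: change persists
transient = simulate_stc(proteins=False)     # no capture: change decays away
```

Comparing the two runs shows the qualitative STC effect: only when tag and proteins coincide does a substantial fraction of the early-phase change survive as late-phase (consolidated) weight.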
In this thesis, we present a theoretical model of STC-based memory consolidation in recurrent networks of spiking neurons, which are particularly suited to reproduce biologically realistic dynamics.
Furthermore, we combine the STC mechanisms with calcium dynamics, which have been found to
guide the major processes of early-phase synaptic plasticity in vivo. In three included research articles as well as additional sections, we develop this model and investigate how it can account for a
variety of behavioral effects. We find that the model enables the robust implementation of the cognitive memory functions mentioned above. The main steps toward this are: (1) demonstrating the formation, consolidation, and improvement of memories represented by cell assemblies; (2) showing that neuromodulator-dependent STC can retroactively control whether information is stored in a temporal or rate-based neural code; and (3) examining the interaction of multiple cell assemblies with transient and attractor dynamics in different organizational paradigms.
In summary, we demonstrate several ways by which STC controls the late-phase synaptic structure
of cell assemblies. Linking these structures to functional dynamics, we show that our STC-based model
implements functionality that can be related to long-term memory. Thereby, we provide a basis for the
mechanistic explanation of various neuropsychological effects.