Engineering simulations for cancer systems biology
Computer simulation can be used to inform in vivo and in vitro experimentation, enabling rapid, low-cost hypothesis generation and directing experimental design in order to test those hypotheses. In this way, in silico models become a scientific instrument for investigation, and so should be developed to high standards, be carefully calibrated, and have their findings presented in such a way that they may be reproduced. Here, we outline a framework that supports developing simulations as scientific instruments, and we select cancer systems biology as an exemplar domain, with a particular focus on cellular signalling models. We consider the challenges of lack of data, incomplete knowledge and modelling in the context of a rapidly changing knowledge base. Our framework comprises a process to clearly separate scientific and engineering concerns in model and simulation development, and an argumentation approach to documenting models as a rigorous way of recording assumptions and knowledge gaps. We propose interactive, dynamic visualisation tools to enable the biological community to interact with cellular signalling models directly for experimental design. There is a mismatch in scale between these cellular models and the tissue structures that are affected by tumours, and bridging this gap requires substantial computational resources. We present concurrent programming as a technology to link scales without losing important details through model simplification. We discuss the value of combining this technology, interactive visualisation, argumentation and model separation to support the development of multi-scale models that represent biologically plausible cells arranged in biologically plausible structures, modelling cell behaviour, interactions and response to therapeutic interventions.
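The proposed use of concurrency to couple scales can be caricatured as two communicating processes. The following is a minimal Python sketch assuming a simple producer/consumer coupling; all names and the placeholder computations are hypothetical and do not reflect the framework's actual design:

```python
# Illustrative sketch: a cell-scale model and a tissue-scale model run as
# concurrent threads and exchange messages, rather than the cell model
# being simplified into the tissue model.
import queue
import threading

cell_to_tissue = queue.Queue()

def cell_model(steps):
    """Cell-scale process: emit a signalling output each time step."""
    for t in range(steps):
        signal = t  # placeholder for a cellular signalling computation
        cell_to_tissue.put((t, signal))
    cell_to_tissue.put(None)  # sentinel: simulation finished

def tissue_model(results):
    """Tissue-scale process: consume cell outputs without simplifying them."""
    while True:
        msg = cell_to_tissue.get()
        if msg is None:
            break
        t, signal = msg
        results.append(signal * 2)  # placeholder tissue-level response

results = []
producer = threading.Thread(target=cell_model, args=(5,))
consumer = threading.Thread(target=tissue_model, args=(results,))
producer.start(); consumer.start()
producer.join(); consumer.join()
print(results)  # [0, 2, 4, 6, 8]
```

The point of the design is that each scale keeps its own state and time loop; only messages cross the boundary, so neither model has to be coarsened to fit the other.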
A Parallel Mesh-Adaptive Framework for Hyperbolic Conservation Laws
We report on the development of a computational framework for the parallel,
mesh-adaptive solution of systems of hyperbolic conservation laws like the
time-dependent Euler equations in compressible gas dynamics or
Magneto-Hydrodynamics (MHD) and similar models in plasma physics. Local mesh
refinement is realized by the recursive bisection of grid blocks along each
spatial dimension. Implemented numerical schemes include standard
finite-differences as well as shock-capturing central schemes, both in
connection with Runge-Kutta type integrators. Parallel execution is achieved
through a configurable hybrid of POSIX-multi-threading and MPI-distribution
with dynamic load balancing. One-, two-, and three-dimensional test computations
for the Euler equations have been carried out and show good parallel scaling
behavior. The Racoon framework is currently used to study the formation of
singularities in plasmas and fluids.
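The block-wise recursive bisection that this abstract describes can be sketched compactly. This is an illustrative Python sketch, not the Racoon implementation; the `Block`/`refine` names and the shock criterion are hypothetical:

```python
# Local mesh refinement by recursive bisection of grid blocks along each
# spatial dimension: a flagged d-dimensional block splits into 2**d children.
import numpy as np

class Block:
    """An axis-aligned grid block; refinement bisects it in every dimension."""
    def __init__(self, lo, hi, level=0):
        self.lo = np.asarray(lo, float)   # lower corner
        self.hi = np.asarray(hi, float)   # upper corner
        self.level = level
        self.children = []

    def bisect(self):
        """Split the block into 2**d children by halving each dimension."""
        mid = 0.5 * (self.lo + self.hi)
        d = len(self.lo)
        for corner in range(2 ** d):
            lo, hi = self.lo.copy(), self.hi.copy()
            for axis in range(d):
                if (corner >> axis) & 1:
                    lo[axis] = mid[axis]
                else:
                    hi[axis] = mid[axis]
            self.children.append(Block(lo, hi, self.level + 1))

def refine(block, needs_refinement, max_level):
    """Recursively bisect blocks flagged by a user-supplied criterion."""
    if block.level < max_level and needs_refinement(block):
        block.bisect()
        for child in block.children:
            refine(child, needs_refinement, max_level)

def leaves(block):
    if not block.children:
        return [block]
    return [leaf for c in block.children for leaf in leaves(c)]

# Example: refine near a "shock" at x = 0.5 in a 2-D unit square.
root = Block([0.0, 0.0], [1.0, 1.0])
near_shock = lambda b: b.lo[0] <= 0.5 <= b.hi[0]
refine(root, near_shock, max_level=3)
print(len(leaves(root)))  # 40 leaf blocks, clustered around x = 0.5
```

In a production AMR code each leaf block would additionally carry its own solution array and ghost cells for boundary exchange with neighbouring blocks, and the leaves would be distributed across threads and MPI ranks.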
Concurrence-Aware Long Short-Term Sub-Memories for Person-Person Action Recognition
Recently, Long Short-Term Memory (LSTM) has become a popular choice to model
individual dynamics for single-person action recognition due to its ability to
model temporal information over various ranges of dynamic context.
However, existing RNN models capture the temporal dynamics of person-person
interactions only by naively combining the activity dynamics of
individuals or modeling them as a whole. This neglects the inter-related
dynamics of how person-person interactions change over time. To this end, we
propose a novel Concurrence-Aware Long Short-Term Sub-Memories (Co-LSTSM) to
model the long-term inter-related dynamics between two interacting people on
the bounding boxes covering people. Specifically, for each frame, two
sub-memory units store individual motion information, while a concurrent LSTM
unit selectively integrates and stores inter-related motion information between
interacting people from these two sub-memory units via a new co-memory cell.
Experimental results on the BIT and UT datasets show the superiority of
Co-LSTSM compared with the state-of-the-art methods.
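The core idea, two per-person sub-memories feeding a concurrent unit that integrates them, can be sketched with plain NumPy. This is a hypothetical, untrained illustration, not the authors' Co-LSTSM; the real model's co-memory gating differs in detail, and all sizes and weights here are invented:

```python
# Two sub-memory LSTMs track individual motion; a third LSTM over their
# concatenated hidden states stands in for the co-memory cell that
# integrates inter-related motion information between the two people.
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c, W):
    """One standard LSTM step; W maps [x; h] to the four gate pre-activations."""
    z = W @ np.concatenate([x, h])
    i, f, g, o = np.split(z, 4)
    c = sigmoid(f) * c + sigmoid(i) * np.tanh(g)  # update cell memory
    h = sigmoid(o) * np.tanh(c)                   # expose gated hidden state
    return h, c

D, H = 8, 16                                   # feature / hidden sizes (illustrative)
W1 = rng.normal(0, 0.1, (4 * H, D + H))        # person-1 sub-memory weights
W2 = rng.normal(0, 0.1, (4 * H, D + H))        # person-2 sub-memory weights
Wco = rng.normal(0, 0.1, (4 * H, 2 * H + H))   # co-memory weights over both sub-memories

h1 = c1 = h2 = c2 = hco = cco = np.zeros(H)
frames = rng.normal(size=(5, 2, D))            # 5 frames, 2 interacting people

for x1, x2 in frames:
    h1, c1 = lstm_step(x1, h1, c1, W1)   # individual motion, person 1
    h2, c2 = lstm_step(x2, h2, c2, W2)   # individual motion, person 2
    # Co-memory: selectively integrate the two sub-memories each frame.
    hco, cco = lstm_step(np.concatenate([h1, h2]), hco, cco, Wco)

print(hco.shape)  # (16,) - interaction representation for classification
```

In the trained model, the final interaction representation would be fed to a classifier over person-person action categories.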
Multiple verification in computational modeling of bone pathologies
We introduce a model checking approach to diagnose the emergence of bone
pathologies. The implementation of a new model of bone remodeling in PRISM has
led to an interesting characterization of osteoporosis as a defective bone
remodeling dynamics with respect to other bone pathologies. Our approach allows
us to derive three types of model checking-based diagnostic estimators. The first
diagnostic measure focuses on the level of bone mineral density, which is
currently used in medical practice. In addition, we have introduced a novel
diagnostic estimator which uses the full patient clinical record, here
simulated using the modeling framework. This estimator detects rapid (months)
negative changes in bone mineral density. Independently of the actual bone
mineral density, when the decrease occurs rapidly it is important to alert the
patient and monitor him/her more closely to detect the onset of other bone
co-morbidities. A third estimator takes into account the variance of the bone
density, which could address the investigation of metabolic syndromes, diabetes
and cancer. Our implementation could make use of different logical combinations
of these statistical estimators and could incorporate other biomarkers for
other systemic co-morbidities (for example diabetes and thalassemia). The
combination of stochastic modeling with formal methods motivates a new
diagnostic framework for complex pathologies. In
particular, our approach takes into consideration important properties of
biosystems such as multiscale organization and self-adaptiveness. The multi-diagnosis could
be further expanded, inching towards the complexity of human diseases. Finally,
we briefly introduce self-adaptiveness in formal methods, which is a key
property in the regulative mechanisms of biological systems and is well known in
other mathematical and engineering areas. (In Proceedings CompMod 2011, arXiv:1109.104)
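The second estimator's behaviour, flagging rapid month-scale BMD decline independently of the absolute density, can be illustrated with a minimal sketch. The function name, window, and threshold below are assumptions for illustration, not the paper's PRISM formulation:

```python
# Detect a rapid (month-scale) relative drop in bone mineral density (BMD)
# anywhere in a simulated patient record of monthly measurements.
def rapid_decline(bmd_record, window=3, threshold=0.05):
    """True if BMD falls by more than `threshold` (relative) across any
    `window` consecutive monthly measurements."""
    for i in range(len(bmd_record) - window):
        start, end = bmd_record[i], bmd_record[i + window]
        if (start - end) / start > threshold:
            return True
    return False

# Simulated monthly record: stable, then a rapid ~8% drop over three months.
record = [1.00, 1.00, 0.99, 0.99, 0.95, 0.91, 0.91]
print(rapid_decline(record))  # True: flagged regardless of absolute BMD level
```

Because the test is relative and windowed, a patient whose density is still within the "normal" absolute range is flagged as soon as the drop is fast, which is exactly the property the abstract attributes to this estimator.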
Fibrous nanomat hard-carbon anode/1D bucky paper cathode hybrid electrodes for seawater batteries
Department of Energy Engineering (Battery Science and Technology). The continuous surge in demand for high energy density rechargeable batteries drives technological development in cell design as well as electrochemically active materials. From that perspective, metal-free batteries using flowing seawater as the cathode active material were introduced. However, the electrochemical performance of the seawater battery was constrained by the NASICON (Na3Zr2Si2PO12) ceramic solid electrolyte. Here, we demonstrate a new class of fibrous nanomat hard-carbon (FNHC) anode/1D (one-dimensional) bucky paper (1DBP) cathode hybrid electrode architecture in a seawater battery based on 1D building block-interweaved hetero-nanomat frameworks. Unlike conventional slurry-cast electrodes, the exquisitely designed hybrid hetero-nanomat electrodes are fabricated through concurrent dual electrospraying and electrospinning for the anode, and vacuum-assisted infiltration for the cathode. HC nanoparticles are closely embedded in the spatially reinforced polymeric nanofiber/CNT hetero-nanomat skeletons, which play a crucial role in constructing 3D-bicontinuous ion/electron transport pathways and allow the elimination of heavy metallic aluminum foil current collectors. Ultimately, the FNHC/1DBP seawater full cell, driven by the aforementioned physicochemical uniqueness, shows an exceptional improvement in electrochemical performance (energy density = 693 Wh kg-1, power density = 3341 W kg-1), overcoming the limitations of the ceramic solid electrolyte and exceeding the performance achievable with other innovative next-generation battery technologies.
Modelling and Verification of Multiple UAV Mission Using SMV
Model checking has been used to verify the correctness of digital circuits,
security protocols, and communication protocols, since they can be modelled as
finite state transition systems. However, modelling the behaviour of hybrid
systems like UAVs in a Kripke model is challenging. This work is aimed at
capturing the behaviour of a UAV performing a cooperative search mission in a
Kripke model, so as to verify it against the temporal properties expressed in
Computation Tree Logic (CTL). The SMV model checker is used for this purpose.
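The kind of property being verified, for instance the CTL reachability formula EF found, can be illustrated on a toy explicit-state Kripke structure. This Python sketch stands in for the SMV model; the mission layout, state names, and labelling are invented:

```python
# A tiny Kripke structure for a UAV searching three cells, and an
# explicit-state check of the CTL property EF found ("the target can
# eventually be found on some path") by breadth-first reachability.
from collections import deque

# States are (cell, found); transitions move between search cells, with
# possible target detection in cell c2 (illustrative mission layout).
transitions = {
    ("c0", False): [("c1", False)],
    ("c1", False): [("c0", False), ("c2", False)],
    ("c2", False): [("c1", False), ("c2", True)],  # target detected
    ("c2", True):  [("c2", True)],                 # absorbing "found" state
}
labels = lambda s: {"found"} if s[1] else set()

def ef(init, prop):
    """EF prop: some path from init reaches a state labelled with prop."""
    seen, frontier = {init}, deque([init])
    while frontier:
        s = frontier.popleft()
        if prop in labels(s):
            return True
        for t in transitions.get(s, []):
            if t not in seen:
                seen.add(t)
                frontier.append(t)
    return False

print(ef(("c0", False), "found"))  # True: the UAV can eventually find the target
```

A real SMV model would express the same structure symbolically (state variables plus a transition relation) and let the model checker evaluate CTL formulas such as `EF found` or `AG (found -> AX found)` over all behaviours at once.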
Design Environments for Complex Systems
The paper describes an approach for modeling complex systems by hiding as much formal detail as possible from the user, while still allowing verification and simulation of the model. The interface is based on UML to make the environment available to the largest audience. To carry out analysis, verification and simulation we automatically extract process algebra specifications from UML models. The results of the analysis are then reflected back in the UML model by annotating diagrams. The formal model includes stochastic information to handle quantitative parameters. We present here the stochastic π-calculus and we discuss the implementation of its probabilistic support, which allows simulation of processes. We exploit the benefits of our approach in two application domains: global computing and systems biology.
BlenX-based compositional modeling of complex reaction mechanisms
Molecular interactions are wired in a fascinating way resulting in complex
behavior of biological systems. Theoretical modeling provides a useful
framework for understanding the dynamics and the function of such networks. The
complexity of the biological networks calls for conceptual tools that manage
the combinatorial explosion of the set of possible interactions. A suitable
conceptual tool to attack complexity is compositionality, already successfully
used in the process algebra field to model computer systems. We rely on the
BlenX programming language, which originated from the Beta-binders process
calculus, to specify and simulate high-level descriptions of biological circuits. The
Gillespie's stochastic framework of BlenX requires the decomposition of
phenomenological functions into basic elementary reactions. Systematic
unpacking of complex reaction mechanisms into BlenX templates is shown in this
study. The estimation/derivation of missing parameters and the challenges
emerging from compositional model building in stochastic process algebras are
discussed. A biological example on the circadian clock is presented as a case
study of BlenX compositionality.
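The Gillespie framework underlying BlenX is what forces the decomposition into elementary reactions. A generic direct-method sketch in Python is shown below; the species, rates, and the two-step unpacking of a synthesis reaction are illustrative, and BlenX's compositional syntax is not reproduced here:

```python
# Gillespie's direct method over elementary reactions: a phenomenological
# synthesis A + B -> C is unpacked into binding (A + B -> AB) followed by
# conversion (AB -> C), each with its own rate constant.
import random

random.seed(1)

reactions = [
    (1.0, {"A": 1, "B": 1}, {"AB": 1}),   # A + B -> AB (binding)
    (0.5, {"AB": 1}, {"C": 1}),           # AB -> C    (conversion)
]
state = {"A": 100, "B": 100, "AB": 0, "C": 0}

def propensity(rate, reactants, state):
    a = rate
    for species, n in reactants.items():
        a *= state[species] ** n  # valid for the uni/bimolecular cases here
    return a

t, t_end = 0.0, 50.0
while t < t_end:
    props = [propensity(r, re, state) for r, re, _ in reactions]
    total = sum(props)
    if total == 0:
        break                                  # nothing left to fire
    t += random.expovariate(total)             # time to next reaction event
    pick = random.uniform(0, total)            # choose which reaction fires
    for (rate, reactants, products), a in zip(reactions, props):
        if a > 0 and pick <= a:
            for species, n in reactants.items():
                state[species] -= n
            for species, n in products.items():
                state[species] = state.get(species, 0) + n
            break
        pick -= a

print(state["A"] + state["AB"] + state["C"])  # 100: A-derived mass conserved
```

The decomposition matters because propensities must be mass-action expressions over integer copy numbers; a phenomenological rate law (e.g. a Hill function) has no single elementary event to fire, which is exactly the unpacking problem this abstract addresses.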