DeepCare: A Deep Dynamic Memory Model for Predictive Medicine
Personalized predictive medicine necessitates the modeling of patient illness
and care processes, which inherently have long-term temporal dependencies.
Healthcare observations, recorded in electronic medical records, are episodic
and irregular in time. We introduce DeepCare, an end-to-end deep dynamic neural
network that reads medical records, stores previous illness history, infers
current illness states and predicts future medical outcomes. At the data level,
DeepCare represents care episodes as vectors in space and models patient health
state trajectories through an explicit memory of historical records. Built on Long
Short-Term Memory (LSTM), DeepCare introduces time parameterizations to handle
irregular timed events by moderating the forgetting and consolidation of memory
cells. DeepCare also incorporates medical interventions that change the course
of illness and shape future medical risk. Moving up to the health state level,
historical and present health states are then aggregated through multiscale
temporal pooling, before passing through a neural network that estimates future
outcomes. We demonstrate the efficacy of DeepCare for disease progression
modeling, intervention recommendation, and future risk prediction. On two
important cohorts with heavy social and economic burden -- diabetes and mental
health -- the results show improved modeling and risk prediction accuracy.
Comment: Accepted at JBI under the new name: "Predicting healthcare
trajectories from medical records: A deep learning approach".
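The time parameterization described in the abstract can be illustrated with a single-unit sketch: the LSTM forget gate is attenuated as a function of the elapsed interval between records, so memory fades faster across long gaps. The weights, the decay constant, and the exponential decay form below are invented for illustration, not DeepCare's learned parameters or exact gating equations:

```python
import math

def time_aware_lstm_step(c_prev, h_prev, x, dt, w, decay=0.5):
    # Hypothetical scalar LSTM cell illustrating time-moderated forgetting:
    # the forget gate is attenuated by exp(-decay * dt), so the memory of
    # old events erodes faster when the gap dt between records is large.
    sig = lambda z: 1.0 / (1.0 + math.exp(-z))
    f = sig(w["f"] * x + w["uf"] * h_prev) * math.exp(-decay * dt)
    i = sig(w["i"] * x + w["ui"] * h_prev)
    o = sig(w["o"] * x + w["uo"] * h_prev)
    c = f * c_prev + i * math.tanh(w["c"] * x + w["uc"] * h_prev)
    return c, o * math.tanh(c)

# Made-up weights; same input, different elapsed time since the last record.
w = {k: 0.5 for k in ("f", "uf", "i", "ui", "o", "uo", "c", "uc")}
c1, _ = time_aware_lstm_step(1.0, 0.0, 1.0, dt=1.0, w=w)   # recent event
c2, _ = time_aware_lstm_step(1.0, 0.0, 1.0, dt=30.0, w=w)  # long gap
# The longer gap erodes more of the previous cell state (c2 < c1).
```
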
A Comprehensive X-ray Absorption Model for Atomic Oxygen
An analytical formula is developed to represent accurately the
photoabsorption cross section of O I for all energies of interest in X-ray
spectral modeling. In the vicinity of the K-edge, a Rydberg series expression is
used to fit R-matrix results, including important orbital relaxation effects,
that accurately predict the absorption oscillator strengths below threshold and
merge consistently and continuously to the above-threshold cross section.
Further minor adjustments are made to the threshold energies in order to
reliably align the atomic Rydberg resonances after consideration of both
experimental and observed line positions. At energies far below or above the
K-edge region, the formulation is based on both outer- and inner-shell direct
photoionization, including significant shake-up and shake-off processes that
result in photoionization-excitation and double photoionization contributions
to the total cross section. The ultimate purpose for developing a definitive
model for oxygen absorption is to resolve standing discrepancies between the
astronomically observed and laboratory measured line positions, and between the
inferred atomic and molecular oxygen abundances in the interstellar medium from
XSTAR and SPEX spectral models.
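As a rough illustration of the Rydberg-series form used for the below-threshold resonances, the n-th resonance position can be written as E_n = E_K - Ry/(n - delta)^2, converging to the K-edge as n grows. The threshold energy and quantum defect below are placeholders, not the paper's fitted values:

```python
# Illustrative Rydberg series converging to an assumed K-shell threshold.
# All numbers are placeholders, not the paper's R-matrix-fitted parameters.
RYDBERG_EV = 13.6057   # Rydberg energy in eV
E_K = 544.0            # assumed O I K-edge energy, eV
Z_EFF = 1.0            # Rydberg electron sees roughly net charge +1 (assumption)
DELTA = 1.1            # assumed quantum defect

def resonance_energy(n):
    """E_n = E_K - Ry * Z_eff**2 / (n - delta)**2; approaches E_K as n grows."""
    return E_K - RYDBERG_EV * Z_EFF**2 / (n - DELTA)**2

energies = [resonance_energy(n) for n in range(2, 8)]
# The series increases monotonically toward, but stays below, the threshold.
```
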
Formal Verification of Probabilistic SystemC Models with Statistical Model Checking
Transaction-level modeling with SystemC has been very successful in
describing the behavior of embedded systems by providing high-level executable
models, many of which have inherent probabilistic behaviors, e.g., random
data and unreliable components. It is thus crucial to have both
quantitative and qualitative analysis of the probabilities of system
properties. Such analysis can be conducted by constructing a formal model of
the system under verification and using Probabilistic Model Checking (PMC).
However, this method is infeasible for large systems, due to the state space
explosion. In this article, we demonstrate the successful use of Statistical
Model Checking (SMC) to carry out such analysis directly from large SystemC
models and allow designers to express a wide range of useful properties. The
first contribution of this work is a framework to verify properties expressed
in Bounded Linear Temporal Logic (BLTL) for SystemC models with both timed and
probabilistic characteristics. Second, the framework allows users to expose a
rich set of user-code primitives as atomic propositions in BLTL. Moreover,
users can define their own fine-grained time resolution rather than the
boundary of clock cycles in the SystemC simulation. The third contribution is
an implementation of a statistical model checker. It contains an automatic
monitor generation for producing execution traces of the
model-under-verification (MUV), the mechanism for automatically instrumenting
the MUV, and the interaction with statistical model checking algorithms.Comment: Journal of Software: Evolution and Process. Wiley, 2017. arXiv admin
note: substantial text overlap with arXiv:1507.0818
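The statistical approach sketched in the abstract replaces exhaustive state-space exploration with sampling: simulate the model many times, check a bounded property on each trace, and bound the estimation error statistically. A minimal sketch, with a toy probabilistic model standing in for a SystemC design and the Okamoto/Chernoff-Hoeffding bound fixing the sample count (the property, rates, and trace format are invented):

```python
import math
import random

def chernoff_samples(eps, delta):
    # Okamoto/Chernoff-Hoeffding bound: this many samples give an estimate
    # within +/- eps of the true probability with confidence 1 - delta.
    return math.ceil(math.log(2.0 / delta) / (2.0 * eps ** 2))

def trace(steps=10, p_fail=0.1, rng=random):
    # Toy stand-in for an executable probabilistic model: at each step an
    # unreliable component fails independently with probability p_fail.
    return ["fail" if rng.random() < p_fail else "ok" for _ in range(steps)]

def holds(tr):
    # Bounded property in the BLTL spirit: "no failure within the first
    # 10 steps", i.e. G<=10 !fail, checked on a sampled trace.
    return "fail" not in tr

rng = random.Random(0)
n = chernoff_samples(eps=0.05, delta=0.01)
estimate = sum(holds(trace(rng=rng)) for _ in range(n)) / n
# The true probability is 0.9**10 ~ 0.349; the estimate should be close.
```
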
Compositional Performance Modelling with the TIPPtool
Stochastic process algebras have been proposed as compositional specification formalisms for performance models. In this paper, we describe a tool which aims at realising all beneficial aspects of compositional performance modelling, the TIPPtool. It incorporates methods for compositional specification as well as solution, based on state-of-the-art techniques, and wrapped in a user-friendly graphical front end. Apart from highlighting the general benefits of the tool, we also discuss some lessons learned during development and application of the TIPPtool. A non-trivial model of a real-life communication system serves as a case study to illustrate benefits and limitations.
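The numerical core behind such tools maps a process-algebra term to a continuous-time Markov chain (CTMC) and solves for its steady-state distribution, from which performance measures are read off. A minimal sketch using power iteration on the uniformized chain (the generator and rates are made up, and this is not the TIPPtool's actual solver):

```python
def ctmc_steady_state(Q, tol=1e-12):
    # Solve pi * Q = 0 with sum(pi) = 1 by power iteration on the
    # uniformized DTMC P = I + Q / lam, with lam above the largest exit rate.
    n = len(Q)
    lam = max(-Q[i][i] for i in range(n)) * 1.1
    P = [[(1.0 if i == j else 0.0) + Q[i][j] / lam for j in range(n)]
         for i in range(n)]
    pi = [1.0 / n] * n
    while True:
        new = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
        if max(abs(a - b) for a, b in zip(pi, new)) < tol:
            return new
        pi = new

# Hypothetical component that fails at rate 0.2/h and is repaired at 1.0/h;
# state 0 = working, state 1 = broken.
Q = [[-0.2, 0.2],
     [1.0, -1.0]]
pi = ctmc_steady_state(Q)
availability = pi[0]  # long-run fraction of time in the working state
```

Steady-state availability here has the closed form mu / (lam + mu) = 1.0 / 1.2, which the iteration reproduces.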
Modulation of neutral interstellar He, Ne, O in the heliosphere. Survival probabilities and abundances at IBEX
Direct sampling of neutral interstellar (NIS) atoms by the Interstellar
Boundary Explorer (IBEX) can potentially provide a complementary method for
studying element abundances in the Local Interstellar Cloud and processes in
the heliosphere interface. We set the stage for abundance-aimed in-depth
analysis of measurements of NIS He, Ne, and O by IBEX and determine systematic
differences between abundances derived from various calculation methods and
their uncertainties. Using a model of ionization rates of the NIS species in
the heliosphere, based on independent measurements of the solar wind and solar
EUV radiation, we develop a time-dependent method of calculating the survival
probabilities of NIS atoms from the termination shock (TS) of the solar wind to
IBEX. With them, we calculate densities of these species along the Earth's
orbit and simulate the fluxes of NIS species as observed by IBEX. We study
pairwise ratios of survival probabilities, densities and fluxes of NIS species
at IBEX to calculate correction factors for inferring the abundances at
TS. The analytic method to calculate the survival probabilities gives
acceptable results only for He and Ne during low solar activity. For the
remaining portions of the solar cycle, and at all times for O, a fully
time-dependent model should be used. Electron impact ionization is surprisingly
important for NIS O. Interpreting the IBEX observations using the
time-dependent model yields the LIC Ne/O abundance of . The uncertainty
is mostly due to uncertainties in the ionization rates and in the NIS gas flow
vector. The Ne/He, O/He, and Ne/O ratios for survival probabilities, local
densities, and fluxes scaled to TS systematically differ and thus an analysis
based only on survival probabilities or densities is not recommended, except
for the Ne/O abundance for observations at low solar activity.
Comment: Astronomy & Astrophysics, in press. Language and editing corrections
implemented.
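The survival probability of a NIS atom is the exponential of the ionization exposure accumulated along its trajectory, p = exp(-integral of beta(t) dt) from the termination shock to the detector. A minimal numerical sketch with a made-up, solar-cycle-modulated rate (the rate, amplitude, and travel time are illustrative, not the paper's ionization model):

```python
import math

def survival_probability(beta, t_ts, t_obs, steps=1000):
    # p = exp( - integral_{t_ts}^{t_obs} beta(t) dt ), with the integral
    # evaluated by the composite trapezoidal rule; beta(t) is the total
    # ionization rate experienced by the atom at time t.
    h = (t_obs - t_ts) / steps
    ts = [t_ts + k * h for k in range(steps + 1)]
    integral = h * (sum(beta(t) for t in ts) - 0.5 * (beta(ts[0]) + beta(ts[-1])))
    return math.exp(-integral)

YEAR = 3.156e7  # seconds
# Hypothetical ionization rate (s^-1) modulated over an 11-year solar cycle.
beta_he = lambda t: 7e-8 * (1.0 + 0.3 * math.sin(2 * math.pi * t / (11 * YEAR)))

# Assumed ~2-year travel from the termination shock to 1 AU.
p = survival_probability(beta_he, 0.0, 2.0 * YEAR)
```
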
Setting Parameters for Biological Models With ANIMO
ANIMO (Analysis of Networks with Interactive MOdeling) is a software tool for
modeling biological networks, such as signaling, metabolic, or gene
networks. An ANIMO model is essentially the sum of a network topology and a
number of interaction parameters. The topology describes the interactions
between biological entities in the form of a graph, while the parameters determine
the speed of occurrence of such interactions. When a mismatch is observed
between the behavior of an ANIMO model and experimental data, we want to update
the model so that it explains the new data. In general, the topology of a model
can be expanded with new (known or hypothetical) nodes, enabling it to match
experimental data. However, the unrestrained addition of new parts to a model
causes two problems: models can become too complex too fast, to the point of
being intractable, and too many parts marked as "hypothetical" or "not known"
make a model unrealistic. Even if changing the topology is normally the easier
task, these problems push us to try a better parameter fit as a first step,
modifying the model topology only as a last resort. In this paper
we show the support added in ANIMO to ease the task of expanding the knowledge
on biological networks, concentrating in particular on the parameter settings.
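The "parameter fit first, topology change last" strategy can be illustrated with a toy fit: keep a one-interaction topology fixed and search a single rate constant so that the model's activity curve matches measurements. The data points, the model form y(t) = 1 - exp(-k*t), and the rate grid below are invented for the example and are not ANIMO's internal machinery:

```python
import math

# Invented measurements of a node's activity over time (t, activity).
data = [(0.0, 0.00), (1.0, 0.38), (2.0, 0.63), (4.0, 0.87)]

def sse(k):
    # Sum of squared errors between the one-parameter model
    # y(t) = 1 - exp(-k * t) and the measured activities.
    return sum((1.0 - math.exp(-k * t) - y) ** 2 for t, y in data)

# Coarse grid search over candidate rates k in (0, 3]; in an interactive
# tool this refitting step would be driven by the user.
best_k = min((k / 100.0 for k in range(1, 301)), key=sse)
# best_k lands near the rate that generated the data (~0.5).
```
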
Respiratory, postural and spatio-kinetic motor stabilization, internal models, top-down timed motor coordination and expanded cerebello-cerebral circuitry: a review
Human dexterity, bipedality, and song/speech vocalization in Homo are reviewed within a motor evolution perspective in regard to 

(i) brain expansion in cerebello-cerebral circuitry, 
(ii) enhanced predictive internal modeling of body kinematics, body kinetics and action organization, 
(iii) motor mastery due to prolonged practice, 
(iv) task-determined top-down, and accurately timed feedforward motor adjustment of multiple-body/artifact elements, and 
(v) reduction in automatic preflex/spinal reflex mechanisms that would otherwise restrict such top-down processes. 

Dual-task interference and developmental neuroimaging research argues that such internal-model-based motor capabilities are concomitant with the evolution of 
(vi) enhanced attentional, executive function and other high-level cognitive processes, and that 
(vii) these provide dexterity, bipedality and vocalization with effector nonspecific neural resources. 

The possibility is also raised that such neural resources could 
(viii) underlie human internal model based nonmotor cognitions. 
