Goal-oriented sensitivity analysis for lattice kinetic Monte Carlo simulations
In this paper we propose a new class of coupling methods for the sensitivity
analysis of high-dimensional stochastic systems, and in particular for lattice
kinetic Monte Carlo (KMC). Sensitivity analysis for stochastic systems is typically
based on approximating continuous derivatives with respect to model parameters
by the mean value of samples from a finite difference scheme. Instead of using
independent samples, the proposed algorithm reduces the variance of the
estimator by developing a strongly correlated, "coupled", stochastic process for
both the perturbed and unperturbed stochastic processes, defined on a common
state space. The novelty of our construction is that the new coupled process
depends on the targeted observables, e.g. coverage, Hamiltonian, spatial
correlations, surface roughness, etc.; hence we refer to the proposed method as
goal-oriented sensitivity analysis. In particular, the rates of the coupled
continuous-time Markov chain are obtained as solutions to a goal-oriented
optimization problem, depending on the observable of interest, by minimizing
the functional of the corresponding variance. We show that this
functional can be used as a diagnostic tool for the design and evaluation of
different classes of couplings. Furthermore, the resulting KMC sensitivity
algorithm admits an easy implementation based on the philosophy of the
Bortz-Kalos-Lebowitz algorithm, where events are divided into classes depending on
level sets of the observable of interest. Finally, we demonstrate in several
examples, including adsorption, desorption and diffusion kinetic Monte Carlo,
that for the same confidence interval and observable, the proposed
goal-oriented algorithm can be two orders of magnitude faster than existing
coupling algorithms for spatial KMC such as the Common Random Number approach.
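The Common Random Number baseline mentioned in the abstract can be illustrated on a toy problem. The sketch below is our own minimal illustration, not the paper's lattice KMC algorithm (all names are ours): it estimates the sensitivity d E[T]/dλ of an exponential waiting time T ~ Exp(λ) by a central finite difference, reusing the same uniform draw for both perturbed samples so that the pair is strongly correlated and the variance of their difference collapses.

```python
import math
import random

def exp_sample(rate, u):
    """Inverse-CDF sample of an Exp(rate) waiting time from a uniform u."""
    return -math.log(u) / rate

def crn_sensitivity(rate, eps, n, seed=0):
    """Central finite-difference estimate of d E[T]/d rate for T ~ Exp(rate).
    Both perturbed samples reuse the same uniform u (common random numbers),
    so their difference has far lower variance than with independent draws."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        u = rng.random()                     # shared randomness for both samples
        t_plus = exp_sample(rate + eps, u)   # process with rate + eps
        t_minus = exp_sample(rate - eps, u)  # process with rate - eps
        total += (t_plus - t_minus) / (2.0 * eps)
    return total / n
```

Since E[T] = 1/λ, the exact sensitivity is -1/λ²; for λ = 2 the estimate settles near -0.25 with far fewer samples than independent-sample differencing would need.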
Relational reasoning via probabilistic coupling
Probabilistic coupling is a powerful tool for analyzing pairs of
probabilistic processes. Roughly, coupling two processes requires finding an
appropriate witness process that models both processes in the same probability
space. Couplings are powerful tools for proving properties about the relation
between two processes, including reasoning about convergence of distributions and
stochastic dominance, a probabilistic version of a monotonicity property.
While the mathematical definition of coupling looks rather complex and
cumbersome to manipulate, we show that the relational program logic pRHL, the
logic underlying the EasyCrypt cryptographic proof assistant, already
internalizes a generalization of probabilistic coupling. With this insight,
constructing couplings is no harder than constructing logical proofs. We
demonstrate how to express and verify classic examples of couplings in pRHL,
and we mechanically verify several couplings in EasyCrypt.
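As a concrete instance of the coupling idea (our illustration, not an example taken from the paper), the classic monotone coupling of two Bernoulli distributions with p ≤ q reads both coins off the same uniform; the witness process then satisfies x ≤ y pointwise, which proves stochastic dominance.

```python
import random

def monotone_coupling(p, q, n, seed=0):
    """Couple Bernoulli(p) and Bernoulli(q), p <= q, on one probability
    space. Each pair reads the same uniform u, so x <= y holds on every
    sample, witnessing that Bernoulli(q) stochastically dominates
    Bernoulli(p) while each coordinate keeps its own marginal."""
    rng = random.Random(seed)
    pairs = []
    for _ in range(n):
        u = rng.random()
        x = 1 if u < p else 0   # has the Bernoulli(p) marginal
        y = 1 if u < q else 0   # has the Bernoulli(q) marginal
        pairs.append((x, y))
    return pairs
```

The same shared-randomness trick is what a pRHL proof internalizes: relating the two programs amounts to exhibiting this joint sampling.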
Common Carp Disrupt Ecosystem Structure and Function Through Middle-out Effects
Middle-out effects, or a combination of top-down and bottom-up processes, create many theoretical and empirical challenges in the realm of trophic ecology. We propose using specific autecology or species trait (i.e. behavioural) information to help explain and understand trophic dynamics that may involve complicated and non-unidirectional trophic interactions. The common carp (Cyprinus carpio) served as our model species for whole-lake observational and experimental studies; four trophic levels were measured to assess common carp-mediated middle-out effects across multiple lakes. We hypothesised that common carp could influence aquatic ecosystems through multiple pathways (i.e. abiotic and biotic foraging, early life feeding, nutrient). Both studies revealed that most trophic levels were affected by common carp, highlighting strong middle-out effects likely caused by common carp foraging activities and abiotic influence (i.e. sediment resuspension). The loss of water transparency, loss of submersed vegetation and a shift in zooplankton dynamics were the strongest effects. Trophic levels furthest from direct pathway effects were also affected (fish life history traits). The present study demonstrates that common carp can exert substantial effects on ecosystem structure and function. Species capable of middle-out effects can greatly modify communities through a variety of available pathways and are not confined to traditional top-down or bottom-up processes.
Epigenetic alterations in skin homing CD4+CLA+ T cells of atopic dermatitis patients
T cells expressing the cutaneous lymphocyte antigen (CLA) mediate pathogenic inflammation in atopic dermatitis (AD). The molecular alterations contributing to their dysregulation remain unclear. With the aim to elucidate putative altered pathways in AD we profiled DNA methylation levels and miRNA expression in sorted T cell populations (CD4+, CD4+CD45RA+ naïve, CD4+CLA+, and CD8+) from adult AD patients and healthy controls (HC). Skin homing CD4+CLA+ T cells from AD patients showed significant differences in DNA methylation in 40 genes compared to HC (p < 0.05). Reduced DNA methylation levels in the upstream region of the interleukin-13 gene (IL13) in CD4+CLA+ T cells from AD patients correlated with increased IL13 mRNA expression in these cells. Sixteen miRNAs showed differential expression in CD4+CLA+ T cells from AD patients targeting genes in 202 biological processes (p < 0.05). An integrated network analysis of miRNAs and CpG sites identified two communities of strongly interconnected regulatory elements with strong antagonistic behaviours that recapitulated the differences between AD patients and HC. Functional analysis of the genes linked to these communities revealed their association with key cytokine signaling pathways, MAP kinase signaling and protein ubiquitination. Our findings support that epigenetic mechanisms play a role in the pathogenesis of AD by affecting inflammatory signaling molecules in skin homing CD4+CLA+ T cells and uncover putative molecules participating in AD pathways.
Clonal human fetal ventral mesencephalic dopaminergic neuron precursors for cell therapy research
A major challenge for further development of drug screening procedures, cell replacement therapies and developmental studies is the identification of expandable human stem cells able to generate the cell types needed. We have previously reported the generation of an immortalized polyclonal neural stem cell (NSC) line derived from the human fetal ventral mesencephalon (hVM1). This line has been biochemically, genetically, immunocytochemically and electrophysiologically characterized to document its usefulness as a model system for the generation of A9 dopaminergic neurons (DAn). Long-term in vivo transplantation studies in parkinsonian rats showed that the grafts do not mature evenly. We reasoned that diverse clones in the hVM1 line might have different abilities to differentiate. In the present study, we have analyzed 9 hVM1 clones selected on the basis of their TH generation potential and, based on the number of v-myc copies, v-myc down-regulation after in vitro differentiation, in vivo cell-cycle exit, TH+ neuron generation and expression of a mature neuronal marker (hNSE), we selected two clones for further in vivo PD cell replacement studies. The conclusion is that homogeneity and clonality of characterized NSCs allow transplantation of cells with controlled properties, which should help in the design of long-term in vivo experiments. This work was supported by grants from the Spanish Ministry of Economy and Competitiveness (formerly Science and Innovation; PLE2009-0101,
SAF2010-17167), Comunidad Autónoma de Madrid (S2011-BMD-2336), Instituto de Salud Carlos III (RETICS TerCel, RD06/0010/0009) and the European Union (Excell, NMP4-SL-2008-214706). This work was also supported by an institutional grant from Foundation Ramón Areces to the Center of Molecular Biology Severo Ochoa.
Determination of the Bending Rigidity of Graphene via Electrostatic Actuation of Buckled Membranes
The small mass and atomic-scale thickness of graphene membranes make them
highly suitable for nanoelectromechanical devices such as mass sensors,
high-frequency resonators or memory elements. Although only atomically thick,
many of the mechanical properties of graphene membranes can be described by
classical continuum mechanics. An important parameter for predicting the
performance and linearity of graphene nanoelectromechanical devices, as well as
for describing ripple formation and other properties such as electron
scattering mechanisms, is the bending rigidity, κ. In spite of the
importance of this parameter, it has so far only been estimated indirectly for
monolayer graphene from the phonon spectrum of graphite, estimated from AFM
measurements, or predicted from ab initio calculations or bond-order potential
models. Here, we employ a new approach to the experimental determination of
κ by exploiting the snap-through instability in pre-buckled graphene
membranes. We demonstrate the reproducible fabrication of convex buckled
graphene membranes by controlling the thermal stress during the fabrication
procedure, and show the abrupt switching from convex to concave geometry that
occurs when electrostatic pressure is applied via an underlying gate electrode.
The bending rigidity of bilayer graphene membranes under ambient conditions was
determined to be eV. Monolayers have significantly lower
κ than bilayers.
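For context (standard continuum plate theory, not spelled out in the abstract), the bending rigidity κ enters the elastic energy of a membrane with out-of-plane deflection w(x, y) as

```latex
E_{\mathrm{bend}} = \frac{\kappa}{2} \int_A \left( \nabla^2 w \right)^2 \, \mathrm{d}A ,
\qquad
P_{\mathrm{es}} \approx \frac{\varepsilon_0 V^2}{2 d^2} ,
```

where the second expression is the parallel-plate estimate of the electrostatic pressure exerted by a gate electrode at distance d and voltage V; snap-through occurs once this pressure overcomes the elastic barrier separating the convex and concave configurations.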
Some Findings Concerning Requirements in Agile Methodologies
Agile methods have appeared as an attractive alternative to conventional methodologies. These methods try to reduce time to market and, indirectly, the cost of the product through flexible development and deep customer involvement. The processes related to requirements have been extensively studied in the literature, in most cases in the frame of conventional methods. However, conclusions drawn for conventional methodologies are not necessarily valid for Agile; on some issues, conventional and Agile processes are radically different. As recent surveys report, inadequate project requirements are one of the most conflictive issues in Agile approaches, and a better understanding of this is needed. This paper describes some findings concerning requirements activities in a project developed under an Agile methodology. The project intended to evolve an existing product and, therefore, some background information was available. The major difficulties encountered were related to non-functional needs and to the management of requirements dependencies.
Scaling Limits for Internal Aggregation Models with Multiple Sources
We study the scaling limits of three different aggregation models on Z^d:
internal DLA, in which particles perform random walks until reaching an
unoccupied site; the rotor-router model, in which particles perform
deterministic analogues of random walks; and the divisible sandpile, in which
each site distributes its excess mass equally among its neighbors. As the
lattice spacing tends to zero, all three models are found to have the same
scaling limit, which we describe as the solution to a certain PDE free boundary
problem in R^d. In particular, internal DLA has a deterministic scaling limit.
We find that the scaling limits are quadrature domains, which have arisen
independently in many fields such as potential theory and fluid dynamics. Our
results apply both to the case of multiple point sources and to the
Diaconis-Fulton smash sum of domains. (74 pages, 4 figures; to appear in
J. d'Analyse Math.)
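Of the three models, the divisible sandpile is the easiest to simulate directly. The sketch below is a one-dimensional illustration on a segment of Z (names, tolerances and the sweep scheme are ours, not the paper's d-dimensional setting): a mass m starts at the origin, and any site holding more than one unit keeps one unit and splits its excess equally between its two neighbours.

```python
def divisible_sandpile_1d(mass, radius, tol=1e-9, max_sweeps=100000):
    """Divisible sandpile on sites -radius..radius (origin at index
    `radius`). Sweep until every interior site holds at most 1 + tol."""
    sites = [0.0] * (2 * radius + 1)
    sites[radius] = float(mass)          # all mass starts at the origin
    for _ in range(max_sweeps):
        stable = True
        for i in range(1, 2 * radius):           # boundary sites never topple
            excess = sites[i] - 1.0
            if excess > tol:
                stable = False
                sites[i] = 1.0                   # keep exactly one unit
                sites[i - 1] += excess / 2.0     # split the excess equally
                sites[i + 1] += excess / 2.0
        if stable:
            break
    return sites
```

For mass m much smaller than radius, the final configuration is an interval of fully occupied sites around the origin with at most partial mass at its two ends, the one-dimensional analogue of the ball-shaped scaling limit.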
Monitoring of northern climate exposure
Currently, facility managers are faced with many advanced decisions
regarding when and how to inspect, maintain, repair or renew existing facilities in a
cost-effective manner. The evolution of the deterioration of reinforced
concrete road structures depends on the exposure of the elements to water, in liquid or
vapour form, and to other aggressive agents such as chloride. Current models of ionic
transport neglect the real ionic concentration in contact with concrete structures:
boundary conditions are simplified, for example to a uniform concentration during the
winter period, and model parameters are derived by fitting. This results in
ineffective prediction models of deterioration, i.e. steel rebar corrosion by chloride
ingress or carbonation, alkali-aggregate reaction, acid attacks, etc. Structures are
sensitive to their environment, and their interaction with it is directly related to
the processes of deterioration. The degradation of structures exposed to salt-laden
mist is faster in wetter areas; conversely, deterioration caused by salt spray in
drier zones is slower. Structures exposed to splashing (precipitation, wind, splash,
etc.) have a slower rate of degradation in wetter regions. The amount of rain has an
indirect effect on the deterioration of structures exposed to salt-laden mist because
it changes the contact time of chloride on the surface of the structures. For this
purpose, a unique exposure monitoring station was developed. This mobile station,
named MExStUL, contains an atmospheric sensor and offers new possibilities for
detecting chloride contained in splashes, mist and static water near the road,
improving the characterization of the real exposure of structures and of the boundary
conditions. First results highlight the real influence of environmental parameters on
the durability of highway structures: salt concentration is not uniform during the
winter period, and water film thickness measurements reveal important drying periods.