Nonparametric Estimation of the Cumulative Intensity Function for a Nonhomogeneous Poisson Process from Overlapping Realizations
A nonparametric technique for estimating the cumulative intensity function of a nonhomogeneous Poisson process from one or more realizations on an interval is extended here to include realizations that overlap. This technique does not require any arbitrary parameters from the modeler, and the estimated cumulative intensity function can be used to generate a point process for simulation by inversion.
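The inversion step mentioned above can be sketched as follows. This is an illustrative piecewise-linear estimator for k fully overlapping realizations on [0, T]; the function names, and the restriction to fully overlapping intervals, are our simplifications, not the paper's exact estimator:

```python
import random

def estimate_cumulative_intensity(realizations, T):
    # Superpose the event times of k realizations on [0, T]; the estimated
    # cumulative intensity rises by 1/k at each superposed event time,
    # interpolated linearly between events (no modeler-chosen parameters).
    k = len(realizations)
    times = sorted(t for r in realizations for t in r)
    knots = [(0.0, 0.0)]
    knots += [(t, (i + 1) / k) for i, t in enumerate(times)]
    knots.append((T, len(times) / k))
    return knots

def invert(knots, y):
    # Piecewise-linear inverse of the estimated cumulative intensity.
    for (t0, y0), (t1, y1) in zip(knots, knots[1:]):
        if y <= y1:
            return t0 if y1 == y0 else t0 + (y - y0) * (t1 - t0) / (y1 - y0)
    return knots[-1][0]

def simulate_nhpp(knots, rng):
    # Generate a point process by inversion: feed unit-rate Poisson
    # arrival times through the inverse cumulative intensity.
    total, y, events = knots[-1][1], rng.expovariate(1.0), []
    while y < total:
        events.append(invert(knots, y))
        y += rng.expovariate(1.0)
    return events
```

For example, two realizations [1.0] and [3.0] on [0, 4] give knots (0, 0), (1, 0.5), (3, 1), (4, 1), and invert(knots, 0.75) returns 2.0.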
Adaptive finite element method assisted by stochastic simulation of chemical systems
Stochastic models of chemical systems are often analysed by solving the corresponding Fokker-Planck equation, which is a drift-diffusion partial differential equation for the probability distribution function. Efficient numerical solution of the Fokker-Planck equation requires adaptive mesh refinement. In this paper, we present a mesh refinement approach which makes use of a stochastic simulation of the underlying chemical system. By observing the stochastic trajectory for a relatively short amount of time, the areas of the state space with non-negligible probability density are identified. By refining the finite element mesh in these areas, and coarsening it elsewhere, a suitable mesh is constructed and used for the computation of the probability density.
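A minimal sketch of the idea, assuming a toy birth-death chemical system (production at rate k_prod, degradation at rate k_deg·X) simulated with Gillespie's stochastic simulation algorithm. The occupancy tally stands in for the "areas of the state space with non-negligible probability density", and the boolean flags mark where a finite element mesh would be refined; names and thresholds are illustrative:

```python
import random

def gillespie_birth_death(k_prod, k_deg, x0, t_end, rng):
    # SSA trajectory for X -> X+1 (rate k_prod) and X -> X-1 (rate k_deg*X),
    # recording how long each state x is occupied up to time t_end.
    t, x, visits = 0.0, x0, {}
    while t < t_end:
        a1, a2 = k_prod, k_deg * x
        a0 = a1 + a2
        dt = rng.expovariate(a0)
        visits[x] = visits.get(x, 0.0) + min(dt, t_end - t)
        t += dt
        if rng.random() * a0 < a1:
            x += 1
        else:
            x -= 1
    return visits

def refinement_flags(visits, n_states, threshold):
    # Refine mesh cells whose empirical occupancy fraction is non-negligible;
    # everything else would be coarsened.
    total = sum(visits.values())
    return [visits.get(i, 0.0) / total > threshold for i in range(n_states)]
```

Running a short trajectory concentrates occupancy near the steady-state copy number k_prod/k_deg, so only a small band of states gets flagged for refinement.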
Lambda-prophage induction modeled as a cooperative failure mode of lytic repression
We analyze a system-level model for lytic repression of lambda-phage in E.
coli using reliability theory, showing that the repressor circuit comprises 4
redundant components whose failure mode is prophage induction. Our model
reflects the specific biochemical mechanisms involved in regulation, including
long-range cooperative binding, and its detailed predictions for prophage
induction in E. coli under ultra-violet radiation are in good agreement with
experimental data.
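The reliability-theory view of the repressor circuit can be illustrated in a few lines: a system of redundant components in parallel fails (here, the prophage induces) only when every component fails. The probabilities below are illustrative, not the paper's fitted values:

```python
def parallel_failure_prob(component_fail_probs):
    # In a redundant (parallel) system, repression is maintained while at
    # least one component still works, so the system-level failure
    # probability is the product of the per-component failure probabilities.
    p = 1.0
    for q in component_fail_probs:
        p *= q
    return p

# Four redundant components, each failing with (illustrative) probability
# 0.1, give a much rarer system-level induction event.
induction = parallel_failure_prob([0.1] * 4)
```

Four components at 10% individual failure yield an induction probability of 10^-4: redundancy makes spontaneous induction rare, while correlated damage to all components (as under UV irradiation) can still switch the circuit.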
When Can You Fold a Map?
We explore the following problem: given a collection of creases on a piece of
paper, each assigned a folding direction of mountain or valley, is there a flat
folding by a sequence of simple folds? There are several models of simple
folds; the simplest one-layer simple fold rotates a portion of paper about a
crease in the paper by +-180 degrees. We first consider the analogous questions
in one dimension lower -- bending a segment into a flat object -- which lead to
interesting problems on strings. We develop efficient algorithms for the
recognition of simply foldable 1D crease patterns, and reconstruction of a
sequence of simple folds. Indeed, we prove that a 1D crease pattern is
flat-foldable by any means precisely if it is by a sequence of one-layer simple
folds.
Next we explore simple foldability in two dimensions, and find a surprising
contrast: "map" folding and variants are polynomial, but slight
generalizations are NP-complete. Specifically, we develop a linear-time
algorithm for deciding foldability of an orthogonal crease pattern on a
rectangular piece of paper, and prove that it is (weakly) NP-complete to decide
foldability of (1) an orthogonal crease pattern on an orthogonal piece of paper,
(2) a crease pattern of axis-parallel and diagonal (45-degree) creases on a
square piece of paper, and (3) crease patterns without a mountain/valley
assignment.
Comment: 24 pages, 19 figures. Version 3 includes several improvements thanks to referees, including formal definitions of simple folds, more figures, a table summarizing results, new open problems, and additional references.
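In one dimension, the one-layer simple fold described above is just a reflection: the paper to one side of the crease rotates by ±180°, mapping a point x to 2c − x. A minimal sketch (the function name is ours, and it models only a single fold, not the full foldability-recognition algorithm):

```python
def one_layer_fold(points, crease):
    # A one-layer simple fold at crease position c rotates the part of the
    # 1D "paper" to the right of c by +-180 degrees, i.e. reflects it:
    # every point x > c maps to 2*c - x; points at or left of c stay put.
    return [2 * crease - x if x > crease else x for x in points]

# Folding the segment [0, 3] at the crease x = 1 sends 2 -> 0 and 3 -> -1,
# stacking the right half over (and past) the left half.
folded = one_layer_fold([0, 1, 2, 3], 1)
```

Repeating such folds, one crease at a time, is exactly the "sequence of one-layer simple folds" whose reach the paper characterizes.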
Experimental Biological Protocols with Formal Semantics
Both experimental and computational biology are becoming increasingly
automated. Laboratory experiments are now performed automatically on
high-throughput machinery, while computational models are synthesized or
inferred automatically from data. However, integration between automated tasks
in the process of biological discovery is still lacking, largely due to
incompatible or missing formal representations. While theories are expressed
formally as computational models, existing languages for encoding and
automating experimental protocols often lack formal semantics. This makes it
challenging to extract novel understanding by identifying when theory and
experimental evidence disagree due to errors in the models or the protocols
used to validate them. To address this, we formalize the syntax of a core
protocol language, which provides a unified description for the models of
biochemical systems being experimented on, together with the discrete events
representing the liquid-handling steps of biological protocols. We present both
a deterministic and a stochastic semantics for this language, both defined in
terms of hybrid processes. In particular, the stochastic semantics captures
uncertainties in equipment tolerances, making it a suitable tool for both
experimental and computational biologists. We illustrate how the proposed
protocol language can be used for automated verification and synthesis of
laboratory experiments on case studies from the fields of chemistry and
molecular programming.
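The contrast between the deterministic and stochastic semantics can be caricatured with a tiny interpreter for liquid-handling steps. The step format, the tolerance model (Gaussian noise with standard deviation proportional to nominal volume), and all names are our illustrative assumptions, not the paper's actual protocol language:

```python
import random

def run_protocol(steps, tolerance=0.0, rng=None):
    # Interpret a list of liquid-handling steps ("dispense", volume_uL).
    # Deterministic semantics: tolerance == 0 uses nominal volumes exactly.
    # Stochastic semantics: each dispense is perturbed by equipment noise,
    # modelled here as Gaussian with std = tolerance * nominal volume.
    rng = rng or random.Random()
    total = 0.0
    for op, vol in steps:
        if op != "dispense":
            raise ValueError(f"unknown step: {op}")
        noise = rng.gauss(0.0, tolerance * vol) if tolerance else 0.0
        total += vol + noise
    return total

steps = [("dispense", 50.0), ("dispense", 25.0)]
```

Verification against a model would then ask whether the property of interest (e.g. a target concentration) holds for all realizations within the equipment tolerance, not just for the nominal run.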
3D Geometric Analysis of Tubular Objects based on Surface Normal Accumulation
This paper proposes a simple and efficient method for the reconstruction and
extraction of geometric parameters from 3D tubular objects. Our method
constructs an image that accumulates surface normal information; peaks
within this image are then located by tracking. Finally, the positions of these
peaks are optimized to lie precisely on the centerline of the tubular shape. This method is very
versatile, and is able to process various input data types like full or partial
mesh acquired from 3D laser scans, 3D height map or discrete volumetric images.
The proposed algorithm is simple to implement, contains few parameters and can
be computed in linear time with respect to the number of surface faces. Since
the extracted tube centerline is accurate, we are able to decompose the tube
into rectilinear parts and torus-like parts. This is done with a new linear
time 3D torus detection algorithm, which follows the same principle as a
previous work on 2D circular-arc recognition. Detailed experiments show the
versatility, accuracy and robustness of our new method.
Comment: in 18th International Conference on Image Analysis and Processing, Sep 2015, Genova, Italy.
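The accumulation step can be sketched on a synthetic 2D cross-section: every surface point votes along its inward normal at several candidate depths, and the votes pile up at the centerline (here, the circle's centre). Grid quantization, the radius range, and the names are illustrative choices, not the paper's implementation:

```python
import math
from collections import Counter

def accumulate_normals(points_with_normals, r_min, r_max, step=1.0):
    # Each surface point casts votes along its inward normal at depths
    # r_min..r_max; votes are quantized to an integer grid, and the cell
    # with the most votes approximates a point on the tube centerline.
    votes = Counter()
    for (px, py), (nx, ny) in points_with_normals:
        r = r_min
        while r <= r_max:
            votes[(round(px + r * nx), round(py + r * ny))] += 1
            r += step
    return votes

# Synthetic tube cross-section: 36 points on a circle of radius 5 centred
# at (10, 10), each carrying an inward-pointing unit normal.
samples = []
for k in range(36):
    a = 2 * math.pi * k / 36
    samples.append(((10 + 5 * math.cos(a), 10 + 5 * math.sin(a)),
                    (-math.cos(a), -math.sin(a))))

votes = accumulate_normals(samples, 3.0, 7.0)
center = max(votes, key=votes.get)
```

All 36 votes at depth r = 5 coincide in the centre cell (10, 10), so the peak recovers the centerline point; on a real tubular mesh such peaks are then followed by tracking.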
Promoting tau secretion and propagation by hyperactive p300/CBP via autophagy-lysosomal pathway in tauopathy.
Background: The trans-neuronal propagation of tau has been implicated in the progression of tau-mediated neurodegeneration. There is a critical knowledge gap in understanding how tau is released and transmitted, and how this is dysregulated in disease. Previously, we reported that the lysine acetyltransferase p300/CBP acetylates tau and regulates its degradation and toxicity. However, whether p300/CBP is involved in the regulation of tau secretion and propagation is unknown.
Method: We investigated the relationship between p300/CBP activity, the autophagy-lysosomal pathway (ALP) and tau secretion in mouse models of tauopathy and in cultured rodent and human neurons. Through a high-throughput compound screen, we identified a new p300 inhibitor that promotes autophagic flux and reduces tau secretion. Using fibril-induced tau spreading models in vitro and in vivo, we examined how p300/CBP regulates tau propagation.
Results: Increased p300/CBP activity was associated with aberrant accumulation of ALP markers in a tau transgenic mouse model. p300/CBP hyperactivation blocked autophagic flux and increased tau secretion in neurons. Conversely, inhibiting p300/CBP promoted autophagic flux, reduced tau secretion, and reduced tau propagation in fibril-induced tau spreading models in vitro and in vivo.
Conclusions: We report that p300/CBP, a lysine acetyltransferase aberrantly activated in tauopathies, impairs the ALP, leading to excess tau secretion. This effect, together with increased intracellular tau accumulation, contributes to enhanced spreading of tau. Our findings suggest inhibition of p300/CBP as a novel approach to correct ALP dysfunction and block disease progression in tauopathy.
Modeling Somatic Evolution in Tumorigenesis
Tumorigenesis in humans is thought to be a multistep process where certain mutations confer a selective advantage, allowing lineages derived from the mutated cell to outcompete other cells. Although molecular cell biology has substantially advanced cancer research, our understanding of the evolutionary dynamics that govern tumorigenesis is limited. This paper analyzes the computational implications of cancer progression presented by Hanahan and Weinberg in The Hallmarks of Cancer. We model the complexities of tumor progression as a small set of underlying rules that govern the transformation of normal cells to tumor cells. The rules are implemented in a stochastic multistep model. The model predicts that (i) early-onset cancers proceed through a different sequence of mutation acquisition than late-onset cancers; (ii) tumor heterogeneity varies with acquisition of genetic instability, mutation pathway, and selective pressures during tumorigenesis; (iii) there exists an optimal initial telomere length which lowers cancer incidence and raises time of cancer onset; and (iv) the ability to initiate angiogenesis is an important stage-setting mutation, which is often exploited by other cells. The model offers insight into how the sequence of acquired mutations affects the timing and cellular makeup of the resulting tumor and how the cellular-level population dynamics drive neoplastic evolution.
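The flavor of such a stochastic multistep model can be conveyed with a toy simulation in which each cell lineage must accumulate a fixed number of hallmark mutations before it counts as transformed. The paper's actual model additionally tracks telomere length, genetic instability, angiogenesis, and selection, all of which are omitted from this sketch:

```python
import random

def time_to_transformation(n_lineages, n_hallmarks, mut_prob, max_gens, rng):
    # Each generation, every lineage independently acquires its next
    # hallmark mutation with probability mut_prob. Returns the first
    # generation at which some lineage carries all hallmarks (a "tumor
    # cell"), or None if none transforms within max_gens generations.
    hits = [0] * n_lineages
    for gen in range(1, max_gens + 1):
        for i in range(n_lineages):
            if hits[i] < n_hallmarks and rng.random() < mut_prob:
                hits[i] += 1
        if max(hits) >= n_hallmarks:
            return gen
    return None

onset = time_to_transformation(200, 4, 0.05, 2000, random.Random(7))
```

Raising mut_prob (a crude stand-in for genetic instability) or the number of lineages shortens the expected onset generation, qualitatively echoing the model's predictions about mutation acquisition and timing.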