On small-noise equations with degenerate limiting system arising from volatility models
The one-dimensional SDE with non-Lipschitz diffusion coefficient is widely
studied in mathematical finance. Several works have proposed asymptotic
analysis of densities and implied volatilities in models involving instances of
this equation, based on a careful implementation of saddle-point methods and
(essentially) the explicit knowledge of Fourier transforms. Recent research on
tail asymptotics for heat kernels [J-D. Deuschel, P.~Friz, A.~Jacquier, and
S.~Violante. Marginal density expansions for diffusions and stochastic
volatility, part II: Applications. 2013, arXiv:1305.6765] suggests working
with a rescaled version of the variable: while this allows one to turn a space
asymptotic problem into a small-noise problem with a fixed terminal point, the
rescaled process satisfies an SDE in Wentzell--Freidlin form (i.e. with a
small parameter multiplying the driving noise). We prove a pathwise large
deviation principle for this process as the noise parameter tends to zero. As
it will become clear, the limiting ODE governing the
large deviations admits infinitely many solutions, a non-standard situation in
the Wentzell--Freidlin theory. As for applications, this scaling allows one to
derive exact log-asymptotics for path functionals of the process:
while on the one hand the resulting formulae are confirmed by the CIR-CEV
benchmarks, on the other hand the large deviation approach (i) applies to
equations with a more general drift term and (ii) potentially opens the way to
heat kernel analysis for higher-dimensional diffusions involving such an SDE as
a component.
Comment: 21 pages, 1 figure
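To fix ideas, the equations at stake can be illustrated as follows; the CEV-type diffusion coefficient is an assumption suggested by the CIR-CEV benchmarks mentioned in the abstract, not a quotation of the paper's setup:

```latex
% A CEV-type SDE with non-Lipschitz diffusion coefficient, \gamma \in (1/2,1):
dX_t = b(X_t)\,dt + \sigma X_t^{\gamma}\,dW_t, \qquad X_0 = x_0 > 0.
% An equation in Wentzell--Freidlin form carries a small parameter
% \varepsilon on the driving noise:
dY^{\varepsilon}_t = b(Y^{\varepsilon}_t)\,dt
    + \sqrt{\varepsilon}\,\sigma(Y^{\varepsilon}_t)\,dW_t .
```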
Landscape statistics of the low autocorrelated binary string problem
The statistical properties of the energy landscape of the low autocorrelated
binary string problem (LABSP) are studied numerically and compared with those
of several classic disordered models. Using two global measures of landscape
structure which have been introduced in the Simulated Annealing literature,
namely, depth and difficulty, we find that the landscape of LABSP, except
perhaps for a very large degeneracy of the local minima energies, is
qualitatively similar to some well-known landscapes such as that of the
mean-field 2-spin glass model. Furthermore, we consider a mean-field
approximation to the pure model proposed by Bouchaud and Mezard (1994, J.
Physique I France 4 1109) and show both analytically and numerically that it
describes the statistical properties of LABSP extremely well.
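For concreteness, the energy of the low autocorrelated binary string problem (the Bernasconi model) is the sum of squared aperiodic autocorrelations of a ±1 string; a minimal sketch (the function name is my own):

```python
def labsp_energy(s):
    """LABSP energy of a +/-1 string: E(s) = sum over lags k >= 1 of C_k^2,
    where C_k = sum_i s[i] * s[i+k] is the aperiodic autocorrelation."""
    n = len(s)
    return sum(sum(s[i] * s[i + k] for i in range(n - k)) ** 2
               for k in range(1, n))
```

Landscape statistics are then gathered by evaluating this energy over local-search trajectories on the hypercube of ±1 strings.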
Large Deviations for Stochastic Evolution Equations with Small Multiplicative Noise
The Freidlin-Wentzell large deviation principle is established for the
distributions of stochastic evolution equations with general monotone drift and
small multiplicative noise. As examples, the main results are applied to derive
the large deviation principle for different types of SPDE such as stochastic
reaction-diffusion equations, stochastic porous media equations and fast
diffusion equations, and the stochastic p-Laplace equation in Hilbert space.
The weak convergence approach is employed in the proof to establish the Laplace
principle, which is equivalent to the large deviation principle in our
framework.
Comment: 31 pages, published in Appl. Math. Optim.
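For reference, in the small-noise setting described above the Freidlin--Wentzell rate function typically takes the following standard form (a textbook formulation, not quoted from the paper):

```latex
I(f) = \inf\left\{ \frac{1}{2}\int_0^T \|\dot h(t)\|^2 \, dt
  \;:\; h \in H^1([0,T]),\ f = X^{h} \right\},
```

where $X^{h}$ denotes the solution of the controlled ("skeleton") equation obtained by replacing the driving noise with the control $\dot h$, and $I(f) = +\infty$ when no such $h$ exists.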
Beyond the Fokker-Planck equation: Pathwise control of noisy bistable systems
We introduce a new method that describes slowly time-dependent
Langevin equations through the behaviour of individual paths. This approach
yields considerably more information than the computation of the probability
density. The main idea is to show that for sufficiently small noise intensity
and slow time dependence, the vast majority of paths remain in small space-time
sets, typically in the neighbourhood of potential wells. The size of these sets
often has a power-law dependence on the small parameters, with universal
exponents. The overall probability of exceptional paths is exponentially small,
with an exponent also showing power-law behaviour. The results cover time spans
up to the maximal Kramers time of the system. We apply our method to three
phenomena characteristic for bistable systems: stochastic resonance, dynamical
hysteresis and bifurcation delay, where it yields precise bounds on transition
probabilities, and the distribution of hysteresis areas and first-exit times.
We also discuss the effect of coloured noise.
Comment: 37 pages, 11 figures
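A slowly time-dependent Langevin equation of the kind studied here can be simulated with a plain Euler--Maruyama scheme; the double-well drift and all parameter values below are illustrative assumptions, not the paper's specification:

```python
import math
import random

def simulate(sigma=0.05, eps=0.001, dt=0.01, n=20000, x0=1.0, seed=0):
    """Euler-Maruyama path of dx = (x - x^3 + A cos(eps*t)) dt + sigma dW:
    a double-well potential with a slow periodic modulation (A = 0.1)."""
    rng = random.Random(seed)
    x, path = x0, [x0]
    for k in range(n):
        t = k * dt
        drift = x - x ** 3 + 0.1 * math.cos(eps * t)
        x += drift * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        path.append(x)
    return path
```

With small noise intensity and sub-threshold forcing, a path started in the right-hand well stays in a narrow space-time neighbourhood of that well, as the abstract describes.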
A constructive approach for discovering new drug leads: Using a kernel methodology for the inverse-QSAR problem
<p>Abstract</p> <p>Background</p> <p>The inverse-QSAR problem seeks to find a new molecular descriptor from which one can recover the structure of a molecule that possesses a desired activity or property. Surprisingly, there are very few papers providing solutions to this problem. It is a difficult problem because the molecular descriptors involved in the inverse-QSAR algorithm must adequately address the forward QSAR problem for a given biological activity if the subsequent recovery phase is to be meaningful. In addition, one should be able to construct a feasible molecule from such a descriptor. The difficulty of recovering the molecule from its descriptor is the major limitation of most inverse-QSAR methods.</p> <p>Results</p> <p>In this paper, we describe the reversibility of our previously reported descriptor, the vector space model molecular descriptor (VSMMD), based on a vector space model that is suitable for kernel studies in QSAR modeling. Our inverse-QSAR approach can be described in five steps: (1) generate the VSMMD for the compounds in the training set; (2) map the VSMMD in the input space to the kernel feature space using an appropriate kernel function; (3) design or generate a new point in the kernel feature space using a kernel feature space algorithm; (4) map the feature space point back to the input space of descriptors using a pre-image approximation algorithm; (5) build the molecular structure template using our VSMMD molecule recovery algorithm.</p> <p>Conclusion</p> <p>The empirical results reported in this paper show that our strategy of using kernel methodology for an inverse-Quantitative Structure-Activity Relationship is sufficiently powerful to find a meaningful solution for practical problems.</p>
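Step (4), the pre-image approximation, can be sketched for an RBF kernel using the classical fixed-point iteration of Mika et al.; this is a generic illustration over plain vector descriptors, not the paper's VSMMD-specific algorithm (all names are my own):

```python
import math

def rbf(x, y, gamma=0.5):
    """Gaussian RBF kernel k(x, y) = exp(-gamma * ||x - y||^2)."""
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, y)))

def preimage_fixed_point(alphas, X, gamma=0.5, iters=50):
    """Approximate pre-image of the feature-space point sum_i alphas[i]*phi(X[i]):
    iterate z <- sum_i alphas[i] k(z, x_i) x_i / sum_i alphas[i] k(z, x_i)."""
    dim = len(X[0])
    # Start from the linear combination of the training descriptors.
    z = [sum(a * xi[d] for a, xi in zip(alphas, X)) for d in range(dim)]
    for _ in range(iters):
        w = [a * rbf(z, xi, gamma) for a, xi in zip(alphas, X)]
        s = sum(w)
        if s == 0:
            break  # degenerate weights: keep the current iterate
        z = [sum(wi * xi[d] for wi, xi in zip(w, X)) / s for d in range(dim)]
    return z
```

The recovered descriptor vector would then feed the molecule-recovery step (5).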
Crowdsourced assessment of common genetic contribution to predicting anti-TNF treatment response in rheumatoid arthritis
Rheumatoid arthritis (RA) affects millions worldwide. While anti-TNF treatment is widely used to reduce disease progression, treatment fails in one-third of patients. No biomarker currently exists that identifies non-responders before treatment. A rigorous community-based assessment of the utility of SNP data for predicting anti-TNF treatment efficacy in RA patients was performed in the context of a DREAM Challenge (http://www.synapse.org/RA_Challenge). An open challenge framework enabled the comparative evaluation of predictions developed by 73 research groups using the most comprehensive available data and covering a wide range of state-of-the-art modelling methodologies. Despite a significant genetic heritability estimate of the treatment non-response trait (h² = 0.18, P value = 0.02), no significant genetic contribution to prediction accuracy is observed. The results formally confirm the expectation of the rheumatology community that SNP information does not significantly improve predictive performance relative to standard clinical traits, thereby justifying a refocusing of future efforts on the collection of other types of data.
Design and Development of a Vision Based Leather Trimming Machine
The objective of the work described in this paper is to demonstrate a laboratory prototype for trimming the external part of a hide, assuming that the resulting machine would eventually form part of a completely automatic system in which hides are uploaded, inspected and parts for assembly are downloaded without manual intervention and prior sorting. Detailed literature and international standards are included. The expected advantages of integrating all vision based functions in a single machine, whose basic architecture is proposed in the paper, are also discussed.
The developed system is based on a monochrome camera following the leather contour. This work focuses on the image processing algorithms for defect detection on leather and the NC programming issues related to path following optimization, which have been successfully tested with different leather types.
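The defect-detection idea can be illustrated by a minimal local-contrast test on a grayscale image grid: flag pixels whose intensity deviates strongly from the mean of their neighbourhood. This is a generic sketch, not the machine's actual algorithm; window size and threshold are assumptions:

```python
def detect_defects(img, win=1, thresh=40):
    """Flag pixels deviating from their local mean intensity by more
    than `thresh`; `img` is a 2-D list of grayscale values."""
    h, w = len(img), len(img[0])
    defects = []
    for i in range(h):
        for j in range(w):
            vals = [img[i + di][j + dj]
                    for di in range(-win, win + 1)
                    for dj in range(-win, win + 1)
                    if 0 <= i + di < h and 0 <= j + dj < w
                    and (di, dj) != (0, 0)]
            if abs(img[i][j] - sum(vals) / len(vals)) > thresh:
                defects.append((i, j))
    return defects
```

In a trimming pipeline, flagged regions would be excluded when the cutting path along the hide contour is planned.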
Beware of circularity: A critical assessment of the state of the art in deleteriousness prediction of missense variants
Discrimination between disease-causing missense mutations and neutral polymorphisms is a key challenge in current sequencing studies. It is therefore critical to be able to evaluate fairly and without bias the performance of the many in silico predictors of deleteriousness. However, current analyses of such tools and their combinations are liable to suffer from the effects of circularity, which occurs when predictors are evaluated on data that are not independent from those that were used to build them, and may lead to overly optimistic results. Circularity can first stem from the overlap between training and evaluation datasets, which may result in the well-studied phenomenon of overfitting: a tool that is too tailored to a given dataset will be more likely than others to perform well on that set, but incurs the risk of failing more heavily at classifying novel variants. Second, we find that circularity may result from an investigation bias in the way mutation databases are populated: in most cases, all the variants of the same protein are annotated with the same (neutral or pathogenic) status. Furthermore, proteins containing only deleterious SNVs comprise many more labeled variants than their counterparts containing only neutral SNVs. Ignoring this, we find that assigning a variant the same status as that of its closest variant on the genomic sequence outperforms all state-of-the-art tools. Given these barriers to valid assessment of the performance of deleteriousness prediction tools, we employ approaches that avoid circularity, and hence provide independent evaluation of ten state-of-the-art tools and their combinations. Our detailed analysis provides scientists with critical insights to guide their choice of tool as well as the future development of new methods for deleteriousness prediction.
In particular, we demonstrate that the performance of FatHMM-W relies mostly on the knowledge of the labels of neighboring variants, which may hinder its ability to annotate variants in the less explored regions of the genome. We also find that PolyPhen2 performs as well as or better than all other tools at discriminating between cases and controls in a novel autism-relevant dataset. Based on our findings about the mutation databases available for training deleteriousness prediction tools, we predict that retraining PolyPhen2 features on the Varibench dataset will yield even better performance, and we show that this is true for the autism-relevant dataset.
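The surprisingly strong nearest-variant baseline mentioned above (assigning a variant the same status as its closest labeled neighbor on the genomic sequence) can be sketched in a few lines; the function and data layout are my own illustration:

```python
def nearest_variant_label(query_pos, training):
    """Return the label of the training variant whose genomic position
    is closest to `query_pos`; `training` is a list of (position, label)."""
    return min(training, key=lambda pl: abs(pl[0] - query_pos))[1]
```

Because databases tend to annotate all variants of a protein with the same status, this trivial classifier inherits that investigation bias, which is exactly why it can outperform genuine predictors under a circular evaluation.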