
    Adaptive importance sampling technique for Markov chains using stochastic approximation

    For a discrete-time finite-state Markov chain, we develop an adaptive importance sampling scheme to estimate the expected total cost before hitting a set of terminal states. This scheme updates the change of measure at every transition using constant or decreasing step-size stochastic approximation. The updates are shown to concentrate asymptotically in a neighborhood of the desired zero-variance estimator. Through simulation experiments on simple Markovian queues, we observe that the proposed technique performs very well in estimating performance measures related to rare events associated with queue lengths exceeding prescribed thresholds. We include performance comparisons of the proposed algorithm with existing adaptive importance sampling algorithms on some examples. We also discuss the extension of the technique to estimate the infinite-horizon expected discounted cost and the expected average cost.
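    The core update is easy to sketch. Below is a toy Python program, our construction for illustration rather than the paper's algorithm: the example, the names, and the once-per-path update are our assumptions, whereas the paper updates the change of measure at every transition. It estimates the rare probability that a downward-biased random walk hits an upper barrier B before 0, while a Robbins-Monro step tunes the tilted up-probability theta to reduce the estimator's second moment.

        import math
        import random

        p, B, x0, a0 = 0.3, 20, 1, 0.05  # true up-probability, barrier, start state, step scale

        def sample_path(theta):
            """One path under the tilted chain: returns (hit B?, log dP/dQ, score).

            score is d/dtheta of log Q_theta(path), used for the gradient step.
            """
            x, logL, score = x0, 0.0, 0.0
            while 0 < x < B:
                if random.random() < theta:              # tilted up-step
                    logL += math.log(p / theta)
                    score += 1.0 / theta
                    x += 1
                else:                                    # tilted down-step
                    logL += math.log((1 - p) / (1 - theta))
                    score -= 1.0 / (1 - theta)
                    x -= 1
            return x == B, logL, score

        theta, est = 0.5, 0.0
        for k in range(1, 5001):
            hit, logL, score = sample_path(theta)
            w = math.exp(logL) if hit else 0.0           # importance-sampling weight
            est += (w - est) / k                         # running mean of the probability estimate
            g = -(w ** 2) * score                        # unbiased estimate of d/dtheta E[w^2]
            theta = min(0.95, max(0.05, theta - (a0 / k) * g))

        print(est, theta)  # theta drifts toward 1 - p, the classical exchange of transition rates

    The decreasing step a0/k mirrors the decreasing step-size variant in the abstract; a constant step would instead track a neighborhood of the optimum without converging to it.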

    Turbulence Fluctuations and New Universal Realizability Conditions in Modelling

    General turbulent mean statistics are shown to be characterized by a variational principle. The variational functionals, or "effective actions", have experimental consequences for turbulence fluctuations and are subject to realizability conditions of positivity and convexity. An efficient Rayleigh-Ritz algorithm is available to calculate approximate effective actions within PDF closures. Examples are given for Navier-Stokes and for a 3-mode system of Lorenz. The new realizability conditions succeed at detecting a priori the poor predictions of PDF closures even when the classical second-order moment realizability conditions are satisfied.
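    For orientation, here is the standard construction behind such effective actions, in our notation (a schematic; the paper's precise definitions may differ). For a statistic \hat{z}(t), the effective action is the Legendre transform of the cumulant-generating functional:

        W[h] = \log \left\langle \exp \int dt\, h(t)\, \hat{z}(t) \right\rangle,
        \qquad
        \Gamma[z] = \sup_{h} \left[ \int dt\, h(t)\, z(t) - W[h] \right].

    The realizability conditions referred to above are then that \Gamma be convex and satisfy \Gamma[z] \ge 0, with equality exactly at the true mean history z = \langle \hat{z} \rangle.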

    Fluctuations in the Irreversible Decay of Turbulent Energy

    A fluctuation law of the energy in freely decaying, homogeneous and isotropic turbulence is derived within standard closure hypotheses for 3D incompressible flow. In particular, a fluctuation-dissipation relation is derived which relates the strength of a stochastic backscatter term in the energy decay equation to the mean of the energy dissipation rate. The theory is based on the so-called "effective action" of the energy history and illustrates a Rayleigh-Ritz method recently developed to evaluate the effective action approximately within probability density function (PDF) closures. These effective actions generalize the Onsager-Machlup action of nonequilibrium statistical mechanics to turbulent flow. They yield detailed, concrete predictions for fluctuations, such as multi-time correlation functions of arbitrary order, which cannot be obtained by direct PDF methods. They also characterize the mean histories by a variational principle.
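    To make the Onsager-Machlup analogy concrete, here is the generic form such an action takes for a Langevin model of the energy history (a schematic in our notation, not the paper's specific closure). If the energy obeys dE/dt = -\bar{\varepsilon}(E) + \sqrt{2\Delta(E)}\, \eta(t) with white noise \eta, then the action of a history E(t) is

        \Gamma[E] = \int dt\, \frac{\left[ \dot{E}(t) + \bar{\varepsilon}(E(t)) \right]^2}{4\, \Delta(E(t))},

    and a fluctuation-dissipation relation of the kind derived in the paper ties the backscatter strength \Delta to the mean dissipation \bar{\varepsilon}.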

    BioThings Explorer: a query engine for a federated knowledge graph of biomedical APIs

    Knowledge graphs are an increasingly common data structure for representing biomedical information. These knowledge graphs can easily represent heterogeneous types of information, and many algorithms and tools exist for querying and analyzing graphs. Biomedical knowledge graphs have been used in a variety of applications, including drug repurposing, identification of drug targets, prediction of drug side effects, and clinical decision support. Typically, knowledge graphs are constructed by centralization and integration of data from multiple disparate sources. Here, we describe BioThings Explorer, an application that can query a virtual, federated knowledge graph derived from the aggregated information in a network of biomedical web services. BioThings Explorer leverages semantically precise annotations of the inputs and outputs for each resource, and automates the chaining of web service calls to execute multi-step graph queries. Because there is no large, centralized knowledge graph to maintain, BioThings Explorer is distributed as a lightweight application that dynamically retrieves information at query time. More information can be found at https://explorer.biothings.io, and code is available at https://github.com/biothings/biothings_explorer.
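    The chaining idea can be sketched generically. The Python below illustrates the concept only; it is not the BioThings Explorer code or API, and the registry and all names are hypothetical. Each service advertises the semantic types of its input and output, a planner searches for a chain whose types line up, and an executor fans the query out one hop at a time.

        from dataclasses import dataclass
        from typing import Callable

        @dataclass
        class Service:                      # hypothetical registry entry
            name: str
            input_type: str                 # e.g. "Gene"
            output_type: str                # e.g. "ChemicalSubstance"
            call: Callable[[str], list]     # wraps one web-service request

        def plan(registry, start_type, goal_type):
            """Breadth-first search for a chain of services from start to goal type."""
            frontier = [(start_type, [])]
            seen = {start_type}
            while frontier:
                t, chain = frontier.pop(0)
                if t == goal_type:
                    return chain
                for s in registry:
                    if s.input_type == t and s.output_type not in seen:
                        seen.add(s.output_type)
                        frontier.append((s.output_type, chain + [s]))
            return None                     # no chain of annotations connects the types

        def execute(chain, entity):
            results = [entity]
            for svc in chain:               # fan out one hop at a time
                results = [hit for e in results for hit in svc.call(e)]
            return results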

    A framework for Operational Security Metrics Development for industrial control environment

    Security metrics are crucial for providing insight when measuring security states and susceptibilities in industrial operational environments. Obtaining practical security metrics depends on effective security metrics development approaches. To be effective, a security metrics development framework should be scope-definitive, objective-oriented, reliable, simple, adaptable, and repeatable (SORSAR). A framework for Operational Security Metrics Development (OSMD) for industrial control environments is presented, which combines concepts and characteristics from existing approaches and adds the new characteristic of adaptability. The OSMD framework is broken down into three phases: target definition, objective definition, and metrics synthesis. A case-study scenario is used to show how to implement and apply the proposed framework and to demonstrate its usability and workability. Expert elicitation has also been used to consolidate the validity of the proposed framework. Both validation approaches have helped to show that the proposed framework can help create an effective and efficient ICS-centric security-metrics taxonomy that can be used to evaluate capabilities or vulnerabilities. The understanding gained from this can help enhance security assurance within industrial operational environments.
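    As a reading aid only (the field names here are hypothetical, not the paper's), the three OSMD phases map naturally onto a simple data model in which every metric traces back through an objective to a defined target:

        from dataclasses import dataclass

        @dataclass
        class Target:            # phase 1: target definition (scope-definitive)
            name: str            # e.g. "PLC firmware update process"
            scope: str

        @dataclass
        class Objective:         # phase 2: objective definition (objective-oriented)
            target: Target
            goal: str            # e.g. "bound the rate of unauthorized changes"

        @dataclass
        class Metric:            # phase 3: metrics synthesis
            objective: Objective
            measure: str         # what is observed
            unit: str            # e.g. "incidents per month"

        taxonomy: list[Metric] = []   # an ICS-centric security-metrics taxonomy grows here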

    Schroedinger equation for joint bidirectional motion in time

    The conventional, time-dependent Schroedinger equation describes only unidirectional time evolution of the state of a physical system, i.e., forward or, less commonly, backward. This paper proposes a generalized quantum dynamics for the description of joint, and interactive, forward and backward time evolution within a physical system. [...] Three applications are studied: (1) a formal theory of collisions in terms of perturbation theory; (2) a relativistically invariant quantum field theory for a system that kinematically comprises the direct sum of two quantized real scalar fields, such that one field evolves forward and the other backward in time, and such that there is dynamical coupling between the subfields; (3) an argument that in the latter field theory, the dynamics predicts that in a range of values of the coupling constants, the expectation value of the vacuum energy of the universe is forced to be zero to high accuracy. [...]
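    For reference, the conventional equation the paper generalizes, together with its time-reversed counterpart (standard textbook forms; the paper's coupled forward-backward dynamics is not reproduced here):

        i\hbar\, \frac{\partial \psi}{\partial t} = H \psi \quad \text{(forward evolution)},
        \qquad
        -\, i\hbar\, \frac{\partial \phi}{\partial t} = H \phi \quad \text{(backward evolution)}.

    The proposal is, roughly, a dynamics in which forward- and backward-evolving components coexist and interact within a single system.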

    Systemic Risk and Default Clustering for Large Financial Systems

    As is known in the financial risk and macroeconomics literature, risk-sharing in large portfolios may increase the probability of creation of default clusters and of systemic risk. We review recent developments on mathematical and computational tools for the quantification of such phenomena. Limiting analysis, such as laws of large numbers and central limit theorems, allows us to approximate the distribution in large systems and to study quantities such as the loss distribution in large portfolios. Large deviations analysis allows us to study the tail of the loss distribution and to identify pathways to default clustering. Sensitivity analysis allows us to understand the most likely ways in which different effects, such as contagion and systematic risk, combine to lead to large default rates. Such results could give useful insights into how to optimally safeguard against such events. Published in: Large Deviations and Asymptotic Methods in Finance (eds. P. Friz, J. Gatheral, A. Gulisashvili, A. Jacquier, J. Teichmann), Springer Proceedings in Mathematics and Statistics, Vol. 110, 2015.
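    A toy Monte Carlo makes the clustering mechanism tangible (our construction for illustration, not a model from the survey): each default raises the default intensity of every surviving firm, so defaults arrive in bursts and the loss distribution develops a heavy tail.

        import random

        def pool_defaults(n_firms=100, base=0.02, contagion=0.01, horizon=1.0):
            """Count defaults by the horizon in a Markovian contagion model."""
            t, defaults = 0.0, 0
            while defaults < n_firms:
                # total intensity: survivors times (baseline + contagion feedback)
                rate = (n_firms - defaults) * (base + contagion * defaults)
                t += random.expovariate(rate)
                if t > horizon:
                    break
                defaults += 1
            return defaults

        samples = [pool_defaults() for _ in range(20000)]
        tail = sum(1 for d in samples if d >= 20) / len(samples)
        print("P(at least 20% of the pool defaults) ~", tail)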

    La incuestionabilidad del riesgo (The unquestionability of risk)

    Before the 1980s, the specialized literature on risk analysis and management was dominated by the so-called technocratic or dominant view. This view held that natural disasters are extreme physical events produced by a capricious nature, external to society, and requiring technological and management solutions devised by experts. This article develops a new explanation for the hegemonic persistence of the technocratic view, based on the concept of the unquestionability of risk. This concept refers to the general incapacity and neglect of experts, scientists, and decision makers (claim-makers) when it comes to identifying and acting upon the root causes of risk production, since doing so would mean questioning the normative imperatives, the demands of the elites, and the lifestyles of the current globalized socioeconomic system.

    Hölder exponents of irregular signals and local fractional derivatives

    It has been recognized recently that fractional calculus is useful for handling scaling structures and processes. We begin this survey by pointing out the relevance of the subject to physical situations. Then the essential definitions and formulae from fractional calculus are summarized, and their immediate use in the study of scaling in physical systems is given. This is followed by a brief summary of classical results. The main theme of the review rests on the notion of local fractional derivatives. There is a direct connection between local fractional differentiability properties and the dimensions and local Hölder exponents of nowhere differentiable functions. It is argued that local fractional derivatives provide a powerful tool to analyse the pointwise behaviour of irregular signals and functions.
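    The central definitions, as we understand them from the local-fractional-derivative literature (stated schematically; conventions may differ from the paper's): starting from the Riemann-Liouville derivative of order 0 < q < 1 based at y,

        D^{q}_{y} f(x) = \frac{1}{\Gamma(1-q)}\, \frac{d}{dx} \int_{y}^{x} \frac{f(t)}{(x-t)^{q}}\, dt,

    the local fractional derivative at y is

        \mathbb{D}^{q} f(y) = \lim_{x \to y} D^{q}_{y} \big[ f(x) - f(y) \big],

    and the connection announced in the abstract is that the critical order \sup \{ q : \mathbb{D}^{q} f(y) \ \text{exists} \} coincides with the local Hölder exponent of f at y.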