
    A distributionally robust perspective on uncertainty quantification and chance constrained programming

    The objective of uncertainty quantification is to certify that a given physical, engineering, or economic system satisfies multiple safety conditions with high probability. A more ambitious goal is to actively influence the system so as to guarantee and maintain its safety, a scenario that can be modeled through a chance constrained program. In this paper we assume that the parameters of the system are governed by an ambiguous distribution that is only known to belong to an ambiguity set characterized through generalized moment bounds and structural properties such as symmetry, unimodality or independence patterns. We delineate the watershed between tractability and intractability in ambiguity-averse uncertainty quantification and chance constrained programming. Using tools from distributionally robust optimization, we derive explicit conic reformulations for tractable problem classes and suggest efficiently computable conservative approximations for intractable ones.
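A minimal sketch of the moment-based idea behind such ambiguity sets (illustrative only, not the paper's conic reformulation): when only the mean and variance of an uncertain quantity are known, Cantelli's one-sided inequality bounds the violation probability uniformly over every distribution in the moment ambiguity set, yielding a distribution-free safety certificate. All numbers below are invented.

```python
# Cantelli's inequality: for any law with mean mu and variance sigma^2,
#   P(X >= mu + t) <= sigma^2 / (sigma^2 + t^2)   for t > 0.
# A chance constraint P(X <= threshold) >= 1 - eps therefore holds for
# EVERY distribution in the moment ambiguity set whenever the bound
# below is at most eps.

def cantelli_violation_bound(mu: float, sigma: float, threshold: float) -> float:
    """Worst-case P(X > threshold) over all laws with given mean and std."""
    t = threshold - mu
    if t <= 0:
        return 1.0  # threshold at or below the mean: no nontrivial guarantee
    return sigma**2 / (sigma**2 + t**2)

# Example: load X with mean 1.0, std 0.5; safety limit 3.5; risk budget 5%.
bound = cantelli_violation_bound(mu=1.0, sigma=0.5, threshold=3.5)
print(bound <= 0.05)  # True: certified for every distribution in the set
```

The certificate is conservative by design: it guards against the worst distribution consistent with the moments, which is exactly the ambiguity-averse stance the abstract describes.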

    Distributionally Robust Optimization for Sequential Decision Making

    The distributionally robust Markov decision process (MDP) approach asks for a distributionally robust policy that achieves the maximal expected total reward under the most adversarial distribution of the uncertain parameters. In this paper we study distributionally robust MDPs whose ambiguity sets for the uncertain parameters can easily incorporate both generalized moment information and statistical-distance information in their description. In this way, we generalize existing work on distributionally robust MDPs with generalized-moment-based and statistical-distance-based ambiguity sets, combining information from the former class, such as moments and dispersions, with the latter class, which critically depends on empirical observations of the uncertain parameters. We show that, under this format of ambiguity sets, the resulting distributionally robust MDP remains tractable under mild technical conditions. More specifically, a distributionally robust policy can be constructed by solving a sequence of one-stage convex optimization subproblems.
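The worst-case Bellman recursion underlying this approach can be sketched in a few lines (this is a toy illustration, not the paper's algorithm; the ambiguity sets here are finite lists of candidate transition distributions, and all states, actions, rewards, and probabilities are invented):

```python
# Distributionally robust value iteration: each backup evaluates the
# expected next-state value under the MOST ADVERSARIAL transition
# distribution in the ambiguity set, then maximises over actions.

GAMMA = 0.9
STATES = [0, 1]
ACTIONS = [0, 1]
# ambiguity[(s, a)] = candidate transition distributions over STATES
ambiguity = {
    (0, 0): [[0.8, 0.2], [0.6, 0.4]],
    (0, 1): [[0.5, 0.5], [0.3, 0.7]],
    (1, 0): [[0.9, 0.1], [0.7, 0.3]],
    (1, 1): [[0.2, 0.8], [0.4, 0.6]],
}
reward = {(0, 0): 1.0, (0, 1): 0.5, (1, 0): 0.0, (1, 1): 2.0}

def robust_backup(v):
    """One distributionally robust value-iteration sweep."""
    new_v = []
    for s in STATES:
        q = []
        for a in ACTIONS:
            worst = min(sum(p[t] * v[t] for t in STATES)
                        for p in ambiguity[(s, a)])  # adversarial distribution
            q.append(reward[(s, a)] + GAMMA * worst)
        new_v.append(max(q))  # best action against the worst distribution
    return new_v

v = [0.0, 0.0]
for _ in range(500):  # contraction: converges to the robust value function
    v = robust_backup(v)
print([round(x, 3) for x in v])
```

With richer (moment- or distance-based) ambiguity sets, the inner `min` becomes the one-stage convex subproblem the abstract refers to, rather than enumeration over a finite list.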

    Dark Matter Constraints from a Joint Analysis of Dwarf Spheroidal Galaxy Observations with VERITAS

    We present constraints on the annihilation cross section of WIMP dark matter based on the joint statistical analysis of four dwarf galaxies with VERITAS. These results are derived from an optimized photon weighting statistical technique that improves on standard imaging atmospheric Cherenkov telescope (IACT) analyses by utilizing the spectral and spatial properties of individual photon events. We report on the results of ~230 hours of observations of five dwarf galaxies and the joint statistical analysis of four of them. We find no evidence of gamma-ray emission from any individual dwarf nor in the joint analysis. The derived upper limit on the dark matter annihilation cross section from the joint analysis is $1.35\times 10^{-23}\,\mathrm{cm^3\,s^{-1}}$ at 1 TeV for the bottom quark ($b\bar{b}$) final state, $2.85\times 10^{-24}\,\mathrm{cm^3\,s^{-1}}$ at 1 TeV for the tau lepton ($\tau^{+}\tau^{-}$) final state, and $1.32\times 10^{-25}\,\mathrm{cm^3\,s^{-1}}$ at 1 TeV for the gauge boson ($\gamma\gamma$) final state. Comment: 14 pages, 9 figures, published in PRD. ASCII tables containing the annihilation cross-section limits are available for download as ancillary files, with a readme.txt file describing the limits.

    Data-Driven Chance Constrained Programs over Wasserstein Balls

    We provide an exact deterministic reformulation for data-driven chance constrained programs over Wasserstein balls. For individual chance constraints as well as joint chance constraints with right-hand-side uncertainty, our reformulation amounts to a mixed-integer conic program. In the special case of a Wasserstein ball with the $1$-norm or the $\infty$-norm, the cone is the nonnegative orthant, and the chance constrained program can be reformulated as a mixed-integer linear program. Our reformulation compares favourably to several state-of-the-art data-driven optimization schemes in our numerical experiments. Comment: 25 pages, 9 figures.
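The adversary inside a Wasserstein ball can be made concrete with a small one-dimensional computation (an illustrative sketch, not the paper's mixed-integer reformulation): for an "unsafe" half-line $\{\xi : \xi > b\}$, the worst-case violation probability over a 1-Wasserstein ball around the empirical distribution is found greedily, by moving the samples closest to the boundary first until the transport budget is exhausted. All data and values below are invented.

```python
# Greedy worst-case P(xi > b) over a 1-Wasserstein ball of radius eps
# centred at the empirical distribution of `samples`.  The adversary
# pays |move| per unit of (average) mass moved, so pushing sample x
# past the boundary b costs max(0, b - x) per unit mass.

def worst_case_violation(samples, b, eps):
    """Sup of P(xi > b) over the 1-Wasserstein ball of radius eps."""
    n = len(samples)
    dists = sorted(max(0.0, b - x) for x in samples)  # cost to push each past b
    budget = eps * n        # total transport budget, mass-weighted
    violating = 0.0
    for d in dists:
        if d == 0.0:
            violating += 1.0          # already in the unsafe set
        elif d <= budget:
            budget -= d
            violating += 1.0          # whole sample moved past the boundary
        else:
            violating += budget / d   # only a fraction of its mass fits
            break
    return violating / n

samples = [0.0, 0.5, 1.0, 2.5]
print(worst_case_violation(samples, b=2.0, eps=0.0))  # 0.25: empirical frequency
print(worst_case_violation(samples, b=2.0, eps=0.3))  # ≈ 0.533: robustified
```

As the radius eps grows, the certified violation probability increases from the empirical frequency toward 1, which is the conservatism the exact reformulation prices into the chance constraint.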

    The evolution of auditory contrast

    This paper reconciles the standpoint that language users do not aim at improving their sound systems with the observation that languages seem to improve their sound systems. Computer simulations of inventories of sibilants show that Optimality-Theoretic learners who optimize their perception grammars automatically introduce a so-called prototype effect, i.e. the phenomenon that the learner’s preferred auditory realization of a certain phonological category is more peripheral than the average auditory realization of this category in her language environment. In production, however, this prototype effect is counteracted by an articulatory effect that limits the auditory form to something that is not too difficult to pronounce. If the prototype effect and the articulatory effect are of a different size, the learner must end up with an auditorily different sound system from that of her language environment. The computer simulations show that, independently of the initial auditory sound system, a stable equilibrium is reached within a small number of generations. In this stable state, the dispersion of the sibilants of the language strikes an optimal balance between articulatory ease and auditory contrast. The important point is that this is derived within a model without any goal-oriented elements such as dispersion constraints.
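The generational dynamic the abstract describes can be caricatured in a few lines (a toy sketch with invented parameters, not the paper's Optimality-Theoretic simulation): each generation's preferred auditory value is the environment mean pushed outward by a prototype effect, then pulled back toward an articulatorily easy rest position, and iterating this map reaches the same stable equilibrium from any starting point.

```python
# Toy generational map: perception adds a peripheral prototype shift,
# production pulls a fraction of the way back toward the easiest
# articulation.  The map is a contraction, so it converges to a stable
# equilibrium regardless of the initial sound system.

PROTOTYPE_SHIFT = 0.4   # peripheral bias of the learned prototype (invented)
ARTIC_PULL = 0.3        # fraction pulled toward easy articulation (invented)
REST = 0.0              # articulatorily easiest auditory value

def next_generation(mean):
    """Auditory mean produced by learners exposed to `mean`."""
    prototype = mean + PROTOTYPE_SHIFT                   # perception
    return prototype + ARTIC_PULL * (REST - prototype)   # production

m = 5.0                 # arbitrary initial sound system
for _ in range(100):
    m = next_generation(m)
print(round(m, 4))      # → 0.9333, the stable equilibrium
```

The equilibrium balances the outward prototype shift against the inward articulatory pull, with no goal-oriented dispersion term anywhere in the update, which mirrors the abstract's main point.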

    Reducing conservatism in robust optimization


    The Quasar Pair Q 1634+267 A, B and the Binary QSO vs. Dark Lens Hypotheses

    Deep HST/NICMOS H (F160W) band observations of the z=1.96 quasar pair Q 1634+267 A,B reveal no signs of a lens galaxy to a 1-sigma threshold of approximately 22.5 mag. The minimum luminosity for a normal lens galaxy would be a 6L_* galaxy at z > 0.5, which is 650 times greater than our detection threshold. Our observation constrains the infrared mass-to-light ratio of any putative, early-type, lens galaxy to (M/L)_H > 690h_65 (1200h_65) for Omega_0=0.1 (1.0) and H_0=65h_65 km/s/Mpc. We would expect to detect a galaxy somewhere in the field because of the very strong Mg II absorption lines at z=1.1262 in the Q 1634+267 A spectrum, but the HST H-band, I-band (F785LP) and V-band (F555W) images require that any associated galaxy be very under-luminous, fainter than 0.1 L^*_H (1.0 L^*_I), if it lies within 40 h^{-1} (100 h^{-1}) kpc of Q 1634+267 A,B. While the large image separation (3.86 arcsec) and the lack of a lens galaxy strongly favor interpreting Q 1634+267 A,B as a binary quasar system, the spectral similarity remains a puzzle. We estimate that at most 0.06% of randomly selected quasar pairs would have spectra as similar to each other as the spectra of Q 1634+267 A and B. Moreover, the spectral similarities observed for the 14 quasar pairs are significantly greater than would be expected for an equivalent sample of randomly selected field quasars. Depending on how strictly we define similarity, we estimate that only 0.01--3% of randomly drawn samples of 14 quasar pairs would have as many similar pairs as the observational sample. Comment: 24 pages, including 4 figures, LaTeX, ApJ accepted, comments from the editor included, minor editorial changes.