
    A Bibliography on Fuzzy Automata, Grammars and Languages

    This bibliography contains references to papers on fuzzy formal languages, the generation of fuzzy languages by means of fuzzy grammars, the recognition of fuzzy languages by fuzzy automata and machines, as well as some applications of fuzzy set theory to syntactic pattern recognition, linguistics and natural language processing.
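
    To make the recognition side concrete, the sketch below implements a small fuzzy finite automaton with hypothetical transition degrees, using the common max-min composition to assign each word a degree of acceptance; it illustrates the general idea only and is not drawn from any particular entry of the bibliography.

```python
# Minimal sketch of a fuzzy finite automaton (hypothetical states and degrees).
# Max-min composition: a word's acceptance degree is the maximum over runs of
# the minimum degree encountered along the run.
from itertools import product

states = ["q0", "q1"]
initial = {"q0": 1.0, "q1": 0.0}   # fuzzy set of initial states
final = {"q0": 0.2, "q1": 0.9}     # fuzzy set of final states

# delta[(state, symbol, next_state)] = membership degree of the transition
delta = {
    ("q0", "a", "q0"): 0.3, ("q0", "a", "q1"): 0.8,
    ("q1", "a", "q1"): 0.6, ("q1", "b", "q0"): 0.5,
    ("q0", "b", "q1"): 0.4,
}

def membership(word):
    """Degree to which the automaton accepts `word`."""
    degrees = dict(initial)
    for symbol in word:
        new_degrees = {q: 0.0 for q in states}
        for p, q in product(states, states):
            d = delta.get((p, symbol, q), 0.0)
            new_degrees[q] = max(new_degrees[q], min(degrees[p], d))
        degrees = new_degrees
    return max(min(degrees[q], final[q]) for q in states)

print(membership("ab"))  # 0.3 for this toy automaton
```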

    Using mathematical programming to solve Factored Markov Decision Processes with Imprecise Probabilities

    This paper investigates Factored Markov Decision Processes with Imprecise Probabilities (MDPIPs); that is, Factored Markov Decision Processes (MDPs) where transition probabilities are imprecisely specified. We derive efficient approximate solutions for Factored MDPIPs based on mathematical programming. To do this, we extend previous linear programming approaches for linear approximations in Factored MDPs, resulting in a multilinear formulation for robust “maximin” linear approximations in Factored MDPIPs. By exploiting the factored structure in MDPIPs, we are able to demonstrate orders of magnitude reduction in solution time over standard exact non-factored approaches, in exchange for relatively low approximation errors, on a difficult class of benchmark problems with millions of states.
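
    The “maximin” criterion above can be illustrated on a tiny, non-factored example with interval-valued transition probabilities; the states, rewards and bounds below are invented, and the toy does not attempt the paper's multilinear, factored formulation.

```python
# Toy maximin Bellman backup under imprecise (interval) transition
# probabilities -- a hypothetical 3-state, 2-action example, not the
# factored multilinear formulation of the paper.
import numpy as np
from scipy.optimize import linprog

gamma = 0.9
value = np.array([10.0, 5.0, 0.0])   # current value estimate per next state
reward = {"stay": 1.0, "move": 0.0}

# Interval bounds on the next-state distribution for each action (credal sets).
bounds = {
    "stay": [(0.6, 0.9), (0.05, 0.3), (0.05, 0.3)],
    "move": [(0.1, 0.4), (0.3, 0.6), (0.2, 0.5)],
}

def worst_case_expectation(v, interval_bounds):
    """Nature minimizes p @ v over all distributions within the intervals."""
    res = linprog(c=v, A_eq=np.ones((1, len(v))), b_eq=[1.0],
                  bounds=interval_bounds, method="highs")
    return res.fun

backups = {a: reward[a] + gamma * worst_case_expectation(value, b)
           for a, b in bounds.items()}
best_action = max(backups, key=backups.get)  # agent maximizes the worst case
print(backups, best_action)
```

    The inner minimization here is a small linear program per action; in the factored, approximate setting the analogous robust backup involves both the basis-function weights and the imprecise transition parameters, which is where multilinear terms can arise.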

    Bayesian subset simulation

    We consider the problem of estimating a probability of failure $\alpha$, defined as the volume of the excursion set of a function $f:\mathbb{X} \subseteq \mathbb{R}^{d} \to \mathbb{R}$ above a given threshold, under a given probability measure on $\mathbb{X}$. In this article, we combine the popular subset simulation algorithm (Au and Beck, Probab. Eng. Mech. 2001) and our sequential Bayesian approach for the estimation of a probability of failure (Bect, Ginsbourger, Li, Picheny and Vazquez, Stat. Comput. 2012). This makes it possible to estimate $\alpha$ when the number of evaluations of $f$ is very limited and $\alpha$ is very small. The resulting algorithm is called Bayesian subset simulation (BSS). A key idea, as in the subset simulation algorithm, is to estimate the probabilities of a sequence of excursion sets of $f$ above intermediate thresholds, using a sequential Monte Carlo (SMC) approach. A Gaussian process prior on $f$ is used to define the sequence of densities targeted by the SMC algorithm, and drive the selection of evaluation points of $f$ to estimate the intermediate probabilities. Adaptive procedures are proposed to determine the intermediate thresholds and the number of evaluations to be carried out at each stage of the algorithm. Numerical experiments illustrate that BSS achieves significant savings in the number of function evaluations with respect to other Monte Carlo approaches.
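
    For reference, a minimal sketch of plain (non-Bayesian) subset simulation on a toy problem is given below; the limit-state function, sample sizes, fixed conditional level p0 and single Metropolis step per stage are assumptions made for brevity, and the Gaussian process prior and adaptive procedures that characterize BSS are not reproduced.

```python
# Minimal subset simulation sketch (in the spirit of Au & Beck 2001),
# illustrative only: estimates alpha = P(f(X) > t) for a toy f and
# standard normal input X.
import numpy as np

rng = np.random.default_rng(0)
dim, n, p0, threshold = 2, 1000, 0.1, 6.0
f = lambda x: x.sum(axis=-1)   # toy limit-state function

x = rng.standard_normal((n, dim))
y = f(x)
alpha = 1.0
for _ in range(20):                      # at most 20 intermediate levels
    level = np.quantile(y, 1 - p0)       # adaptive intermediate threshold
    if level >= threshold:               # final level reached
        alpha *= np.mean(y > threshold)
        break
    alpha *= p0
    seeds = x[y > level]                 # conditional samples used as MCMC seeds
    x = seeds[rng.integers(len(seeds), size=n)]
    # One random-walk Metropolis step per chain, targeting N(0, I)
    # restricted to the current excursion set {f > level}.
    prop = x + 0.5 * rng.standard_normal((n, dim))
    ratio = np.exp(0.5 * (np.sum(x**2, axis=-1) - np.sum(prop**2, axis=-1)))
    accept = (rng.random(n) < ratio) & (f(prop) > level)
    x = np.where(accept[:, None], prop, x)
    y = f(x)
print("estimated failure probability:", alpha)
```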

    Disaster Management Cycle-Based Integrated Humanitarian Supply Network Management

    While logistics research recently has placed increased focus on disruption management, few studies have examined the response and recovery phases in post-disaster operations. We present a multiple-objective, integrated network optimization model for making strategic decisions in the supply distribution and network restoration phases of humanitarian logistics operations. Our model provides an equity- or fairness-based solution for constrained capacity, budget, and resource problems in post-disaster logistics management. We then generate efficient Pareto frontiers to understand the trade-off between the objectives of interest. Next, we present a goal programming-based multiple-objective integrated response and recovery model. The model prescribes fairness-based compromise solutions for user-desired goals, given limited capacity, budget, and available resources. An experimental study demonstrates how different decision-making strategies can be formulated to understand important dimensions of decision making. Given the multiple, conflicting objectives of the model, quickly generating a Pareto-optimal front with ample, diverse solutions is important for a decision maker to make a final decision. Thus, we adapt the well-known Non-dominated Sorting Genetic Algorithm II (NSGA-II) by integrating an evolutionary heuristic with optimization-based techniques, called the Hybrid NSGA-II, for this NP-hard problem. A Hypervolume-based technique is used to assess the algorithm’s effectiveness. Hazards U.S. Multi-Hazard (Hazus)-generated regional case studies based on earthquake scenarios are used to demonstrate the applicability of our proposed models in post-disaster operations.
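
    One building block of NSGA-II referenced above is fast non-dominated sorting of candidate solutions; a generic sketch on invented two-objective data (say, unmet demand versus restoration cost, both to be minimized) is shown below. It does not implement the paper's Hybrid NSGA-II or its hypervolume-based assessment.

```python
# Generic fast non-dominated sorting for a two-objective minimization problem.
# Objective values are hypothetical, not taken from the paper's model.
import numpy as np

def non_dominated_fronts(objs):
    """Return a list of fronts (index lists); front 0 is the Pareto-optimal set."""
    n = len(objs)
    dominates = [set() for _ in range(n)]    # solutions dominated by i
    dom_count = np.zeros(n, dtype=int)       # number of solutions dominating j
    for i in range(n):
        for j in range(n):
            if np.all(objs[i] <= objs[j]) and np.any(objs[i] < objs[j]):
                dominates[i].add(j)
                dom_count[j] += 1
    fronts, current = [], [i for i in range(n) if dom_count[i] == 0]
    while current:
        fronts.append(current)
        nxt = []
        for i in current:
            for j in dominates[i]:
                dom_count[j] -= 1
                if dom_count[j] == 0:
                    nxt.append(j)
        current = nxt
    return fronts

# Columns: (unmet demand, restoration cost) -- purely illustrative numbers.
objs = np.array([[3, 9], [5, 4], [4, 6], [7, 2], [6, 5], [8, 8]])
print(non_dominated_fronts(objs))  # [[0, 1, 2, 3], [4], [5]]
```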

    On Measure Quantifiers in First-Order Arithmetic

    We study the logic obtained by endowing the language of first-order arithmetic with second-order measure quantifiers. This new kind of quantification allows us to express that the argument formula is true in a certain portion of all possible interpretations of the quantified variable. We show that first-order arithmetic with measure quantifiers is capable of formalizing simple results from probability theory and, most importantly, of representing every recursive random function. Moreover, we introduce a realizability interpretation of this logic in which programs have access to an oracle from the Cantor space.
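
    The intended reading of a measure quantifier can be written out as follows; the symbol C^q and this exact formulation are assumptions made for illustration and may differ from the paper's own syntax.

```latex
% Hypothetical notation: a measure quantifier over the Cantor space
% 2^{\mathbb{N}} equipped with its uniform measure \mu.
% C^{q} x. F(x) says that F holds for at least a q-portion of the
% possible interpretations of x.
\[
  \mathbf{C}^{q} x.\, F(x)
  \quad\Longleftrightarrow\quad
  \mu\bigl(\{\, \omega \in 2^{\mathbb{N}} \;:\; F(\omega) \text{ holds} \,\}\bigr) \;\geq\; q,
  \qquad q \in \mathbb{Q} \cap [0,1].
\]
```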

    An Exploratory Study of Patient Falls

    Debate continues over the respective contributions of education level and clinical expertise in the nursing practice environment. Research suggests a link between Baccalaureate of Science in Nursing (BSN) nurses and positive patient outcomes such as lower mortality, decreased falls, and fewer medication errors. Purpose: To examine whether there is a negative correlation between patient falls and the level of nurse education at an urban hospital located in Midwest Illinois during the years 2010-2014. Methods: A retrospective cross-sectional cohort analysis was conducted using data from the National Database of Nursing Quality Indicators (NDNQI) from the years 2010-2014. Sample: Inpatients aged ≥ 18 years who experienced an unintentional sudden descent, with or without injury, that resulted in the patient striking the floor or an object and that occurred on inpatient nursing units. Results: The regression model was constructed with annual patient falls as the dependent variable and formal education and a log-transformed variable for percentage of certified nurses as the independent variables. The model overall is a good fit, F(2,22) = 9.014, p = .001, adj. R² = .40. Conclusion: Annual patient falls will decrease by increasing the number of nurses with baccalaureate degrees and/or certifications from a professional nursing board-governing body.
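
    The model described under Results can be sketched with statsmodels on synthetic placeholder data; the variable names and numbers below are invented for illustration, and neither the NDNQI data nor the reported statistics are reproduced.

```python
# Sketch of the reported regression: annual patient falls on percentage of
# BSN-prepared nurses and the log of percentage of certified nurses.
# The data frame below is synthetic, NOT NDNQI data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n_obs = 25  # e.g. unit-year observations
df = pd.DataFrame({
    "pct_bsn": rng.uniform(30, 90, n_obs),        # % of nurses holding a BSN
    "pct_certified": rng.uniform(5, 60, n_obs),   # % of certified nurses
})
df["log_pct_certified"] = np.log(df["pct_certified"])
# Synthetic outcome with a built-in negative association, for illustration only.
df["annual_falls"] = (60 - 0.3 * df["pct_bsn"]
                      - 5 * df["log_pct_certified"]
                      + rng.normal(0, 5, n_obs)).round()

model = smf.ols("annual_falls ~ pct_bsn + log_pct_certified", data=df).fit()
print(model.summary())  # coefficients, F-statistic, adjusted R-squared
```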

    On Counting Propositional Logic and Wagner's Hierarchy

    We introduce an extension of classical propositional logic with counting quantifiers. These forms of quantification make it possible to express that a formula is true in a certain portion of the set of all its interpretations. Beyond providing a sound and complete proof system for this logic, we show that validity problems for counting propositional logic can be used to capture counting complexity classes. More precisely, we show that the complexity of the decision problems for the validity of prenex formulas of this logic perfectly matches the appropriate levels of Wagner's counting hierarchy.
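
    A brute-force sketch of the counting-quantifier semantics is given below, under the assumed reading that C^q φ holds when at least a fraction q of all assignments to φ's variables satisfy φ; the logic's proof system and the complexity results are of course not captured by such an enumeration.

```python
# Brute-force semantics sketch for a counting quantifier over propositional
# formulas: "C^q phi" holds iff at least a fraction q of all assignments
# satisfies phi. The notation is assumed for illustration.
from itertools import product

def satisfying_fraction(variables, phi):
    """phi maps an assignment (dict from variable name to bool) to bool."""
    assignments = [dict(zip(variables, bits))
                   for bits in product([False, True], repeat=len(variables))]
    return sum(phi(a) for a in assignments) / len(assignments)

def counting_quantifier(q, variables, phi):
    return satisfying_fraction(variables, phi) >= q

phi = lambda a: a["x"] or a["y"]                    # example formula x ∨ y
print(satisfying_fraction(["x", "y"], phi))         # 0.75
print(counting_quantifier(0.5, ["x", "y"], phi))    # True: a majority satisfies phi
```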

    Conglomerate Multi-Fidelity Gaussian Process Modeling, with Application to Heavy-Ion Collisions

    In an era where scientific experimentation is often costly, multi-fidelity emulation provides a powerful tool for predictive scientific computing. While there has been notable work on multi-fidelity modeling, existing models do not incorporate an important “conglomerate” property of multi-fidelity simulators, where the accuracies of different simulator components (modeling separate physics) are controlled by different fidelity parameters. Such conglomerate simulators are widely encountered in complex nuclear physics and astrophysics applications. We thus propose a new CONglomerate multi-FIdelity Gaussian process (CONFIG) model, which embeds this conglomerate structure within a novel non-stationary covariance function. We show that the proposed CONFIG model can capture prior knowledge on the numerical convergence of conglomerate simulators, which allows for cost-efficient emulation of multi-fidelity systems. We demonstrate the improved predictive performance of CONFIG over state-of-the-art models in a suite of numerical experiments and two applications, the first for emulation of cantilever beam deflection and the second for emulating the evolution of the quark-gluon plasma, which was theorized to have filled the Universe shortly after the Big Bang.
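
    For context, the sketch below shows classical two-level multi-fidelity GP emulation with a fixed autoregressive (Kennedy-O'Hagan style) covariance; the toy simulators and hand-picked hyperparameters are assumptions, and this is not the conglomerate, non-stationary covariance of the CONFIG model.

```python
# Two-level multi-fidelity GP emulation with a fixed autoregressive
# covariance (Kennedy & O'Hagan style). Toy simulators and hand-picked
# hyperparameters; NOT the CONFIG model's conglomerate covariance.
import numpy as np

def rbf(a, b, lengthscale, variance):
    d2 = (a[:, None] - b[None, :]) ** 2
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

rho, noise = 0.8, 1e-6
f_low = lambda x: np.sin(2 * np.pi * x)              # cheap simulator (toy)
f_high = lambda x: rho * f_low(x) + 0.3 * (x - 0.5)  # expensive simulator (toy)

x_low = np.linspace(0, 1, 15)   # many cheap runs
x_high = np.linspace(0, 1, 4)   # few expensive runs
y = np.concatenate([f_low(x_low), f_high(x_high)])

k1 = lambda a, b: rbf(a, b, 0.2, 1.0)  # covariance of the low-fidelity GP
k2 = lambda a, b: rbf(a, b, 0.3, 0.1)  # covariance of the discrepancy GP

def joint_cov(xl, xh):
    # AR(1) structure: f_high(x) = rho * f_low(x) + delta(x)
    top = np.hstack([k1(xl, xl), rho * k1(xl, xh)])
    bottom = np.hstack([rho * k1(xh, xl), rho**2 * k1(xh, xh) + k2(xh, xh)])
    return np.vstack([top, bottom])

x_new = np.linspace(0, 1, 50)
K = joint_cov(x_low, x_high) + noise * np.eye(len(y))
# Cross-covariance between high-fidelity predictions and all observations.
K_star = np.hstack([rho * k1(x_new, x_low),
                    rho**2 * k1(x_new, x_high) + k2(x_new, x_high)])
pred_high = K_star @ np.linalg.solve(K, y)  # posterior mean of f_high at x_new
print(pred_high[:5])
```

    The conglomerate setting described in the abstract generalizes this picture by letting separate fidelity parameters control separate simulator components, which a single scalar coefficient like rho above cannot express.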