
    Multi-player games with LDL goals over finite traces

    Linear Dynamic Logic on finite traces (LDLF) is a powerful logic for reasoning about the behaviour of concurrent and multi-agent systems. In this paper, we investigate techniques for both the characterisation and verification of equilibria in multi-player games with goals/objectives expressed using logics based on LDLF. This study builds upon a generalisation of Boolean games, a logic-based game model of multi-agent systems in which players have goals succinctly represented in a logical way. Because LDLF goals are considered, in the settings we study (Reactive Modules games and iterated Boolean games with goals over finite traces) players' goals can be defined as regular properties achieved over finite, but arbitrarily large, traces. In particular, using alternating automata, the paper investigates automata-theoretic approaches to the characterisation and verification of (pure-strategy Nash) equilibria, shows that the set of Nash equilibria in multi-player games with LDLF objectives is regular, and provides complexity results for the associated automata constructions.
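The equilibrium notion at the heart of the abstract can be illustrated on a much simpler model than the LDLF setting: a one-shot Boolean game, where each player controls one propositional variable and has a goal formula over the full valuation. The sketch below (a toy, not the paper's automata-theoretic construction; all names and the example goals are illustrative) checks pure-strategy Nash equilibria by testing unilateral deviations.

```python
from itertools import product

# Hedged sketch: a minimal one-shot Boolean game, not the iterated/LDLF
# setting of the paper. A profile is a pure-strategy Nash equilibrium if no
# player whose goal is unsatisfied can satisfy it by flipping only the
# variable she controls.

def is_nash(profile, goals, owners):
    """profile: dict var -> bool; goals: dict player -> predicate over profiles;
    owners: dict player -> the variable that player controls."""
    for player, goal in goals.items():
        if goal(profile):
            continue  # goal already met: no incentive to deviate
        deviation = dict(profile)
        deviation[owners[player]] = not deviation[owners[player]]
        if goal(deviation):
            return False  # profitable unilateral deviation exists
    return True

def nash_equilibria(goals, owners):
    """Enumerate all valuations and keep the Nash equilibria."""
    variables = sorted(owners.values())
    result = []
    for bits in product([False, True], repeat=len(variables)):
        profile = dict(zip(variables, bits))
        if is_nash(profile, goals, owners):
            result.append(profile)
    return result

# Toy instance: player 1 wants p <-> q, player 2 wants p xor q.
# This is matching pennies, so no pure Nash equilibrium exists.
owners = {1: "p", 2: "q"}
goals = {1: lambda v: v["p"] == v["q"], 2: lambda v: v["p"] != v["q"]}
print(nash_equilibria(goals, owners))  # -> []
```

The brute-force enumeration is exponential in the number of variables; the point of the automata-theoretic approach in the paper is precisely to represent and decide such (regular) sets of equilibria symbolically.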

    Synthesis with rational environments

    Synthesis is the automated construction of a system from its specification. The system has to satisfy its specification in all possible environments. The environment often consists of agents that have objectives of their own. Thus, it makes sense to soften the universal quantification on the behavior of the environment and take the objectives of its underlying agents into account. Fisman et al. introduced rational synthesis: the problem of synthesis in the context of rational agents. The input to the problem consists of temporal logic formulas specifying the objectives of the system and of the agents that constitute the environment, and a solution concept (e.g., Nash equilibrium). The output is a profile of strategies, for the system and the agents, such that the objective of the system is satisfied in the computation that is the outcome of the strategies, and the profile is stable according to the solution concept; that is, the agents that constitute the environment have no incentive to deviate from the strategies suggested to them. In this paper we continue the study of rational synthesis. First, we suggest an alternative definition of rational synthesis, in which the agents are rational but not cooperative. We call this problem strong rational synthesis. In the strong rational synthesis setting, one cannot assume that the agents that constitute the environment take into account the strategies suggested to them. Accordingly, the output is a strategy for the system only, and the objective of the system has to be satisfied in all the computations that are the outcome of a stable profile in which the system follows this strategy. We show that strong rational synthesis is 2ExpTime-complete, and thus no more complex than traditional synthesis or rational synthesis. Second, we study a richer specification formalism, in which the objectives of the system and the agents are not Boolean but quantitative. In this setting, the objective of the system and the agents is to maximize their outcome. The quantitative setting significantly extends the scope of rational synthesis, making the game-theoretic approach much more relevant. Finally, we enrich the setting to one that allows coalitions of agents that constitute the system or the environment.
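On a finite toy instance, the cooperative rational-synthesis recipe reduces to a search: enumerate joint strategy profiles, keep those where the environment agents have no profitable unilateral deviation, and among those pick one satisfying the system's objective. The sketch below follows that recipe (the action set, payoff functions, and the example instance are illustrative assumptions, not from the paper).

```python
from itertools import product

# Hedged sketch of cooperative rational synthesis on a finite toy: the
# solution concept used for stability here is Nash equilibrium among the
# environment agents, with numeric payoffs as in the quantitative setting.

ACTIONS = [0, 1]

def stable_for_environment(profile, env_payoffs):
    """No environment agent can raise her payoff by deviating alone."""
    for agent, payoff in env_payoffs.items():
        current = payoff(profile)
        for alt in ACTIONS:
            deviation = dict(profile)
            deviation[agent] = alt
            if payoff(deviation) > current:
                return False
    return True

def rational_synthesis(system_objective, env_payoffs, players):
    """Return a profile that is environment-stable and satisfies the system,
    or None if no such profile exists."""
    for choice in product(ACTIONS, repeat=len(players)):
        profile = dict(zip(players, choice))
        if stable_for_environment(profile, env_payoffs) and system_objective(profile):
            return profile
    return None

# Toy: the system wants both agents to play 1; agent a is paid for matching
# b, and agent b is paid for playing 1.
system_objective = lambda v: v["a"] == 1 and v["b"] == 1
env_payoffs = {"a": lambda v: int(v["a"] == v["b"]),
               "b": lambda v: v["b"]}
print(rational_synthesis(system_objective, env_payoffs, ["a", "b"]))
```

In the strong (non-cooperative) variant discussed in the abstract, the quantifier structure changes: the system commits to a strategy first, and its objective must hold in every stable response of the environment, which is what pushes the decision problem to 2ExpTime.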

    Nash equilibrium and bisimulation invariance

    Game theory provides a well-established framework for the analysis of concurrent and multi-agent systems. The basic idea is that concurrent processes (agents) can be understood as corresponding to players in a game; plays represent the possible computation runs of the system; and strategies define the behaviour of agents. Typically, strategies are modelled as functions from sequences of system states to player actions. Analysing a system in such a setting involves computing the set of (Nash) equilibria in the concurrent game. However, we show that, with respect to the above model of strategies (arguably, the "standard" model in the computer science literature), bisimilarity does not preserve the existence of Nash equilibria. Thus, two concurrent games which are behaviourally equivalent from a semantic perspective, and which from a logical perspective satisfy the same temporal logic formulae, may nevertheless have fundamentally different properties (solutions) from a game-theoretic perspective. Our aim in this paper is to explore the issues raised by this discovery. After illustrating the issue by way of a motivating example, we present three models of strategies with respect to which the existence of Nash equilibria is preserved under bisimilarity. We use some of these models of strategies to provide new semantic foundations for logics for strategic reasoning, and investigate restricted scenarios where bisimilarity can be shown to preserve the existence of Nash equilibria with respect to the conventional model of strategies in the computer science literature.
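The behavioural equivalence at issue, bisimilarity, can be computed on finite systems as a greatest fixpoint: start from label-equality and repeatedly discard pairs of states that cannot match each other's transitions. The sketch below (a naive unlabelled-transition version with state observations; the example systems are illustrative) shows the check the abstract's "behaviourally equivalent" claim rests on.

```python
# Hedged sketch: naive bisimilarity check on finite transition systems with
# state labels, computed as a greatest fixpoint. Quadratic in pairs per
# refinement round; real tools use partition refinement instead.

def bisimilar_pairs(states, label, trans):
    """states: iterable of states; label: dict state -> observation;
    trans: dict state -> set of successor states."""
    # Start from label-equality and refine until stable.
    rel = {(s, t) for s in states for t in states if label[s] == label[t]}
    changed = True
    while changed:
        changed = False
        for (s, t) in list(rel):
            # Every move of s must be matched by t into a related pair,
            # and vice versa (the back-and-forth conditions).
            forth = all(any((s2, t2) in rel for t2 in trans[t]) for s2 in trans[s])
            back = all(any((s2, t2) in rel for s2 in trans[s]) for t2 in trans[t])
            if not (forth and back):
                rel.discard((s, t))
                changed = True
    return rel

# Toy: a one-state self-loop and a two-state loop with identical labels are
# bisimilar, even though their state spaces differ.
states = ["u", "v1", "v2"]
label = {"u": "a", "v1": "a", "v2": "a"}
trans = {"u": {"u"}, "v1": {"v2"}, "v2": {"v1"}}
rel = bisimilar_pairs(states, label, trans)
print(("u", "v1") in rel)  # -> True
```

The paper's observation is exactly that this equivalence on systems does not transfer to equilibria once strategies are functions of state histories, since bisimilar systems can have different history sets.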

    Advantages of the recursive operability analysis in updating the risk assessment

    With the introduction of new regulations and sustainable technologies, revamping and upgrading existing chemical plants has become an important element of process engineering. Such modifications must be accompanied by parallel improvements in process safety. In this sense, risk assessment is a tool that should by definition be versatile and easy to update. However, even the most common methods currently used to identify accidental scenarios and estimate risk (such as HazOp) can prove very time-consuming when the safety implications of process modifications are assessed. The availability of a reliable, easy-to-update tool for safety engineering is crucial for the process industries. In this work, we compare risk analyses of a chemical plant subject to modifications performed with two different tool sets: HazOp with FTA versus Recursive Operability Analysis (ROA) with FTA. Both techniques were applied to a dust-mixing tank that underwent process modifications. Both methods reach the same conclusions, highlighting new failure modes and process criticalities associated with the introduction of flow alarms and interlocks in case of excessive depressurization. It is shown that the Recursive Operability Analysis, with its cause-and-consequence structure tied to process-variable interactions, is much more effective for updating a risk assessment.
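The FTA step shared by both workflows (HazOp with FTA and ROA with FTA) ultimately evaluates a top-event probability from basic-event probabilities through AND/OR gates. A minimal sketch, assuming independent basic events; the gate structure and numbers below are illustrative, not taken from the case study:

```python
# Hedged sketch of fault tree evaluation under an independence assumption.

def fta_or(probs):
    """OR gate: P(at least one event) = 1 - prod(1 - p_i)."""
    result = 1.0
    for p in probs:
        result *= (1.0 - p)
    return 1.0 - result

def fta_and(probs):
    """AND gate: P(all events) = prod(p_i)."""
    result = 1.0
    for p in probs:
        result *= p
    return result

# Illustrative tree: top event occurs if (relief valve fails AND alarm
# fails) OR operator error.
p_top = fta_or([fta_and([0.01, 0.05]), 0.002])
print(round(p_top, 6))  # -> 0.002499
```

Updating the tree after a plant modification, e.g. inserting a new interlock as an extra input to an AND gate, then only requires re-evaluating the affected branch, which is where the easy-to-update structure of ROA pays off.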

    Dysmenorrhea and related disorders

    Dysmenorrhea is a common symptom secondary to various gynecological disorders, but in most women it presents as a primary form of the disease. The pain associated with dysmenorrhea is caused by hypersecretion of prostaglandins and increased uterine contractility. Primary dysmenorrhea is quite frequent in young women and carries a good prognosis, even though it is associated with reduced quality of life. The secondary forms of dysmenorrhea are associated with endometriosis and adenomyosis, of which dysmenorrhea may represent the key symptom. The diagnosis is suspected on the basis of the clinical history and physical examination and can be confirmed by ultrasound, which is very useful for excluding secondary causes of dysmenorrhea such as endometriosis and adenomyosis. Treatment options include non-steroidal anti-inflammatory drugs, alone or combined with oral contraceptives or progestins.

    Hazardous Area Classification Due to Combustible Dust Atmospheres and Layers: Avoiding Common Mistakes

    In areas and workplaces where combustible dusts are produced, handled or stored, Hazardous Area Classification is required to assess the likelihood of formation of an explosive dust atmosphere. The resulting dust zoning is of paramount importance in deciding the type and protection modes of the electrical and non-electrical apparatus to be installed in those areas. Dust zoning is a widespread and well-known technique, covered by dedicated technical standards such as IEC 60079-10-2 and NFPA 499. As such, it also represents the first step of a dust explosion risk assessment, and its quality and completeness are therefore of the utmost importance in achieving high-value, robust explosion risk management. The behavior of fires and explosions arising from dust clouds or layers is strictly dependent on the chemical-physical characteristics of the dust: the first section of this paper analyzes those characteristics and how their variations affect the classification of areas. After this overview, the paper illustrates the most common misconceptions and mistakes encountered in Hazardous Area Classifications and provides insights and suggestions on how to avoid them.
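The core of dust zoning per IEC 60079-10-2 is the mapping from grade of release to zone. A minimal sketch of that mapping follows; the grade labels and zone numbers follow the standard, but the helper itself is an illustrative simplification (a real classification also fixes zone extents and accounts for housekeeping, layer thickness, and ventilation):

```python
# Hedged sketch: grade-of-release to dust zone mapping per IEC 60079-10-2.

ZONE_BY_GRADE = {
    "continuous": 20,  # explosive dust atmosphere present continuously or frequently
    "primary": 21,     # likely to occur occasionally in normal operation
    "secondary": 22,   # not likely in normal operation; short-lived if it occurs
}

def dust_zone(grade_of_release):
    """Return the zone number for a given grade of release."""
    try:
        return ZONE_BY_GRADE[grade_of_release]
    except KeyError:
        raise ValueError(f"unknown grade of release: {grade_of_release!r}")

print(dust_zone("primary"))  # -> 21
```

A typical mistake the paper warns against is exactly a mismatch in this step, e.g. classifying the inside of a frequently filled hopper as Zone 22 when the continuous presence of a dust cloud calls for Zone 20.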

    Prompt gamma activation studies on archaeological objects at a pulsed neutron source

    The potential of Prompt Gamma Activation Analysis (PGAA) for non-destructive quantitative investigation of archaeological objects at a pulsed neutron spallation source was studied. Experiments were performed on the ROTAX time-of-flight diffractometer of the ISIS neutron source on a chalcolithic copper axe, a limestone sample from the ancient Quarry of Masarah (Egypt), a Roman bronze fibula and two fragments of glass from the Roman Villa Adriana. For reference and comparison, measurements were also performed at the PGAA station of the Budapest research reactor. It is found that the performance of a PGAA analysis at a pulsed source, with a makeshift set-up on an instrument designed for diffraction studies, cannot match the results achievable at a dedicated PGAA facility at a reactor source. However, the possibility of performing different investigations, e.g., neutron diffraction for structure analysis and PGAA for elemental analysis, at a single facility on one and the same object remains attractive and offers useful applications in the field of cultural heritage.
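The quantitative step behind PGAA elemental analysis is commonly a relative (comparator) one: the mass ratio of two elements follows from the ratio of their prompt-gamma peak areas, scaled by element-specific analytical sensitivities obtained from calibration. A minimal sketch, with purely illustrative numbers (the function name and the sensitivity values are assumptions, not from the study):

```python
# Hedged sketch of relative quantification in PGAA: with A the net peak
# area and S the analytical sensitivity (counts per unit mass, from a
# calibration under the same conditions), m_x / m_ref = (A_x/S_x) / (A_ref/S_ref).

def mass_ratio(peak_area_x, peak_area_ref, sensitivity_x, sensitivity_ref):
    """Mass ratio of element x to the reference element."""
    return (peak_area_x / sensitivity_x) / (peak_area_ref / sensitivity_ref)

# E.g. tin relative to copper in a bronze object: areas in counts,
# sensitivities in counts per gram (placeholder calibration values).
ratio = mass_ratio(peak_area_x=1200.0, peak_area_ref=50000.0,
                   sensitivity_x=8.0e4, sensitivity_ref=5.0e5)
print(round(ratio, 4))  # -> 0.15
```

At a pulsed source the sensitivities depend strongly on the local spectrum and geometry, which is one reason a makeshift set-up on a diffractometer cannot match a dedicated, well-calibrated reactor station.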