
    Temporal Landscapes: A Graphical Temporal Logic for Reasoning

    We present an elementary introduction to a new logic for reasoning about behaviors that occur over time. This logic is based on temporal type theory. The syntax of the logic is similar to that of ordinary first-order logic; what differs is the notion of truth value. Instead of reasoning about whether formulas are true or false, our logic reasons about temporal landscapes. A temporal landscape may be thought of as representing the set of durations over which a statement is true. To help understand the practical implications of this approach, we give a wide variety of examples in which this logic is used to reason about autonomous systems.
    Comment: 20 pages, lots of figures
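    The abstract's core idea, a truth value that is a set of durations rather than a bit, can be sketched concretely. Below, a "temporal landscape" is modeled as a finite set of half-open intervals, with conjunction as intersection and disjunction as union. This interval representation and the state names are illustrative assumptions, not the paper's type-theoretic construction.

```python
# Sketch: a "temporal landscape" as a finite list of half-open intervals
# [start, end) over which a statement holds. Illustrative assumption only;
# the paper's actual construction is based on temporal type theory.

def intersect(a, b):
    """Conjunction: durations where both statements hold."""
    out = []
    for (s1, e1) in a:
        for (s2, e2) in b:
            lo, hi = max(s1, s2), min(e1, e2)
            if lo < hi:
                out.append((lo, hi))
    return sorted(out)

def union(a, b):
    """Disjunction: merge overlapping intervals from either landscape."""
    out = []
    for s, e in sorted(a + b):
        if out and s <= out[-1][1]:
            out[-1] = (out[-1][0], max(out[-1][1], e))
        else:
            out.append((s, e))
    return out

# Hypothetical autonomous-system statements and when they hold:
battery_low = [(0, 3), (7, 10)]
docked = [(2, 8)]
print(intersect(battery_low, docked))  # [(2, 3), (7, 8)]
print(union(battery_low, docked))      # [(0, 10)]
```

    The point of the sketch: asking "is `battery_low AND docked` true?" has no single yes/no answer; the answer is the set of durations on which it holds.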

    Fuzzy inequational logic

    We present a logic for reasoning about graded inequalities which generalizes the ordinary inequational logic used in universal algebra. The logic deals with atomic predicate formulas in the form of inequalities between terms and formalizes their semantic entailment and provability in a graded setting, which allows partially true conclusions to be drawn from partially true assumptions. We follow the Pavelka approach and define general degrees of semantic entailment and provability using complete residuated lattices as structures of truth degrees. We prove that the logic is Pavelka-style complete. Furthermore, we present a logic for reasoning about graded if-then rules which is obtained as a particular case of the general result.
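    The "complete residuated lattice of truth degrees" mentioned above can be illustrated with the standard Łukasiewicz structure on [0, 1]. This is a minimal sketch of that one lattice, not the paper's general construction; the numeric degrees are made up for illustration.

```python
# The Łukasiewicz residuated lattice on [0, 1], a common structure of truth
# degrees in Pavelka-style fuzzy logic. Example degrees are illustrative.

def tnorm(a, b):
    """Łukasiewicz conjunction (t-norm)."""
    return max(0.0, a + b - 1.0)

def residuum(a, b):
    """Truth degree of the implication a -> b."""
    return min(1.0, 1.0 - a + b)

# Adjointness, the defining property of a residuated lattice:
#   tnorm(a, b) <= c   iff   b <= residuum(a, c)
a, b, c = 0.8, 0.7, 0.5
assert (tnorm(a, b) <= c) == (b <= residuum(a, c))

# An assumption true to degree 0.8, fed through a rule held to degree 0.9,
# yields a conclusion guaranteed only to degree tnorm(0.8, 0.9).
print(round(tnorm(0.8, 0.9), 6))  # 0.7
```

    The last line shows the "partially true conclusions from partially true assumptions" behavior: degrees compose through the t-norm rather than collapsing to true/false.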

    Remarks on Inheritance Systems

    We offer a conceptual analysis of inheritance diagrams, first in abstract terms, and then compare it to the "normality" and the "small/big sets" of preferential and related reasoning. The main ideas concern nodes as truth values and information sources, truth comparison by paths, accessibility or relevance of information by paths, relative normality, and prototypical reasoning.

    Deconstructing climate misinformation to identify reasoning errors

    Misinformation can have significant societal consequences. For example, misinformation about climate change has confused the public and stalled support for mitigation policies. When people lack the expertise and skill to evaluate the science behind a claim, they typically rely on heuristics such as replacing judgment about something complex (i.e. climate science) with judgment about something simple (i.e. the character of people who speak about climate science) and are therefore vulnerable to misleading information. Inoculation theory offers one approach to effectively neutralize the influence of misinformation. Typically, inoculations confer resistance by providing people with information that counters misinformation. In contrast, we propose inoculating against misinformation by explaining the fallacious reasoning within misleading denialist claims. We offer a strategy based on critical thinking methods to analyse and detect poor reasoning within denialist claims. This strategy includes detailing argument structure, determining the truth of the premises, and checking for validity, hidden premises, or ambiguous language. Focusing on argument structure also facilitates the identification of reasoning fallacies by locating them in the reasoning process. Because this reason-based form of inoculation is based on general critical thinking methods, it offers the distinct advantage of being accessible to those who lack expertise in climate science. We applied this approach to 42 common denialist claims and found that they all demonstrate fallacious reasoning and fail to refute the scientific consensus regarding anthropogenic global warming. This comprehensive deconstruction and refutation of the most common denialist claims about climate change is designed to act as a resource for communicators and educators who teach climate science and/or critical thinking.
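    The deconstruction checklist described in this abstract (argument structure, premise truth, validity, hidden premises) can be sketched as a small record type. The example claim and field names below are illustrative assumptions, not taken from the paper's actual analysis of the 42 claims.

```python
# Sketch of the deconstruction checklist: structure, premise truth,
# validity, hidden premises. The example claim is illustrative only.

from dataclasses import dataclass, field

@dataclass
class Argument:
    premises: list
    conclusion: str
    hidden_premises: list = field(default_factory=list)
    valid_form: bool = False          # does the conclusion follow?
    premises_all_true: bool = False   # are the stated premises true?

    def fallacies(self):
        """List the reasoning problems the checklist detects."""
        issues = []
        if not self.valid_form:
            issues.append("invalid form (conclusion does not follow)")
        if not self.premises_all_true:
            issues.append("at least one false premise")
        if self.hidden_premises:
            issues.append("hidden premise(s): " + "; ".join(self.hidden_premises))
        return issues

claim = Argument(
    premises=["Climate has changed naturally in the past"],
    conclusion="Current climate change is natural",
    hidden_premises=["Whatever caused past change must cause present change"],
    valid_form=False,
    premises_all_true=True,
)
for issue in claim.fallacies():
    print("-", issue)
```

    Making the hidden premise explicit is what exposes the fallacy here: the stated premise is true, but the conclusion only follows via an unstated, false assumption.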

    Reasoning about real-time systems with temporal interval logic constraints on multi-state automata

    Models of real-time systems using a single paradigm often turn out to be inadequate, whether the paradigm is based on states, rules, event sequences, or logic. A model-based approach to reasoning about real-time systems is presented in which a temporal interval logic called TIL is employed to define constraints on a new type of high-level automata. The combination, called hierarchical multi-state (HMS) machines, can be used to formally model a real-time system, a dynamic set of requirements, the environment, heuristic knowledge about planning-related problem solving, and the computational states of the reasoning mechanism. In this framework, mathematical techniques were developed for: (1) proving the correctness of a representation; (2) planning of concurrent tasks to achieve goals; and (3) scheduling of plans to satisfy complex temporal constraints. HMS machines allow reasoning about a real-time system from a model of how truth arises rather than merely from what is true in a system.
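    The flavor of a temporal-interval constraint over automaton states can be shown with a toy checker: a bounded-response property over a timed trace. The constraint shape, state names, and trace are assumptions for illustration; they are not the actual TIL syntax or HMS semantics.

```python
# Toy check of a temporal-interval constraint against a timed state trace:
# every `trigger` at time t must be followed by `response` at some t' with
# lo <= t' - t <= hi. Illustrative only; not the actual TIL notation.

def holds_within(trace, trigger, response, lo, hi):
    """Return True iff the bounded-response constraint holds on the trace."""
    for t, state in trace:
        if state == trigger:
            if not any(s == response and lo <= t2 - t <= hi
                       for t2, s in trace):
                return False
    return True

# A timed trace of (time, state) pairs from a hypothetical controller.
trace = [(0, "idle"), (3, "alarm"), (5, "valve_open"), (9, "idle")]
print(holds_within(trace, "alarm", "valve_open", 1, 4))  # True
print(holds_within(trace, "alarm", "valve_open", 0, 1))  # False
```

    The same trace satisfies one interval bound and violates a tighter one, which is the kind of distinction an interval logic makes that a plain state-based model cannot.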

    Experimental philosophy and moral responsibility

    Can experimental philosophy help us answer central questions about the nature of moral responsibility, such as the question of whether moral responsibility is compatible with determinism? Specifically, can folk judgments in line with a particular answer to that question provide support for that answer? Based on reasoning familiar from Condorcet's Jury Theorem, such support could be had if individual judges track the truth of the matter independently and with some modest reliability: such reliability quickly aggregates as the number of judges goes up. In this chapter, however, I argue, partly based on empirical evidence, that although non-specialist judgments might on average be more likely than not to get things right, their individual likelihoods fail to aggregate because they do not track truth with sufficient independence.
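    The Jury Theorem aggregation the abstract relies on is easy to compute directly: with n independent judges who are each right with probability p > 0.5, the majority is right with probability approaching 1 as n grows. The reliability value 0.6 below is an arbitrary illustration.

```python
# Condorcet Jury Theorem: probability that a majority of n independent
# judges is correct, when each judge is correct with probability p.

from math import comb

def majority_correct(n, p):
    """P(strict majority of n judges is right); n assumed odd."""
    k_min = n // 2 + 1  # smallest strict majority
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(k_min, n + 1))

for n in (1, 11, 101):
    print(n, round(majority_correct(n, 0.6), 4))
```

    The computed values rise steadily toward 1, which is the aggregation effect. The abstract's point is that this calculation presupposes independence; when judges' errors are correlated, the binomial sum above no longer describes the majority's reliability.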

    Probabilities on Sentences in an Expressive Logic

    Automated reasoning about uncertain knowledge has many applications. One difficulty when developing such systems is the lack of a completely satisfactory integration of logic and probability. We address this problem directly. Expressive languages like higher-order logic are ideally suited for representing and reasoning about structured knowledge. Uncertain knowledge can be modeled by using graded probabilities rather than binary truth values. The main technical problem studied in this paper is the following: Given a set of sentences, each having some probability of being true, what probability should be ascribed to other (query) sentences? A natural wish-list, among others, is that the probability distribution (i) is consistent with the knowledge base, (ii) allows for a consistent inference procedure and in particular (iii) reduces to deductive logic in the limit of probabilities being 0 and 1, (iv) allows (Bayesian) inductive reasoning and (v) learning in the limit and in particular (vi) allows confirmation of universally quantified hypotheses/sentences. We translate this wish-list into technical requirements for a prior probability and show that probabilities satisfying all our criteria exist. We also give explicit constructions and several general characterizations of probabilities that satisfy some or all of the criteria and various (counter)examples. We also derive necessary and sufficient conditions for extending beliefs about finitely many sentences to suitable probabilities over all sentences, and in particular least dogmatic or least biased ones. We conclude with a brief outlook on how the developed theory might be used and approximated in autonomous reasoning agents. Our theory is a step towards a globally consistent and empirically satisfactory unification of probability and logic.
    Comment: 52 LaTeX pages, 64 definitions/theorems/etc., presented at conference Progic 2011 in New York
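    Wish-list items (i) and (iii) can be illustrated in miniature: if sentence probabilities come from a distribution over truth assignments ("worlds"), then entailment automatically forces P(A) <= P(B) whenever A entails B, and probabilities 0 and 1 behave like classical falsity and truth. The propositional atoms and the prior below are illustrative assumptions, a drastic simplification of the paper's higher-order setting.

```python
# Miniature illustration: probabilities on sentences induced by a prior
# over truth assignments ("worlds"). Atoms and prior are illustrative.

from itertools import product

atoms = ["rain", "wet"]
worlds = list(product([False, True], repeat=len(atoms)))

# An assumed prior over the four worlds (must sum to 1).
prior = dict(zip(worlds, [0.3, 0.1, 0.0, 0.6]))

def prob(formula):
    """P(formula) = total prior mass of worlds where it is true."""
    return sum(p for w, p in prior.items()
               if formula(dict(zip(atoms, w))))

A = lambda v: v["rain"] and v["wet"]  # "rain AND wet"
B = lambda v: v["wet"]                # "wet"

# A entails B, so any world-based prior satisfies P(A) <= P(B).
assert prob(A) <= prob(B)
print(prob(A), prob(B))  # 0.6 0.7
```

    Concentrating the prior on a single world pushes every sentence's probability to 0 or 1, recovering classical two-valued logic as the limiting case, which is wish-list item (iii) in this tiny setting.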