7 research outputs found

    Inference Rules in Nelson’s Logics, Admissibility and Weak Admissibility

    © 2015, Springer Basel. Our paper investigates inference rules for Nelson's logics and discusses possible ways of determining the admissibility of inference rules in such logics. We use the technique originally developed for intuitionistic logic and for Johansson's paraconsistent minimal logic. The adaptation is not straightforward, however, since Nelson's logics do not enjoy the rule of replacement of equivalents. We therefore consider and compare standard admissibility and weak admissibility. The paper provides algorithms for recognizing weak admissibility, and admissibility itself in restricted cases, in order to exhibit the problems that arise in the course of this study.
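
    As a point of reference (an illustration added here, not taken from the paper), admissibility of a rule can be stated as follows, with Harrop's rule as a classic example of a rule that is admissible but not derivable in intuitionistic logic:

        \[
          \frac{A_1 \;\; \cdots \;\; A_n}{B} \ \text{ is admissible in } L
          \quad\Longleftrightarrow\quad
          \text{for every substitution } \sigma:\ \
          \vdash_L \sigma(A_1),\ \ldots,\ \vdash_L \sigma(A_n)
          \ \text{ jointly imply }\ \vdash_L \sigma(B)
        \]
        \[
          \text{Harrop's rule:}\qquad
          \frac{\neg A \rightarrow (B \vee C)}{(\neg A \rightarrow B) \vee (\neg A \rightarrow C)}
        \]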

    States in flux: logics of change, dynamic semantics, and dialogue


    Intuitionism and logical revision.

    The topic of this thesis is logical revision: should we revise the canons of classical reasoning in favour of a weaker logic, such as intuitionistic logic? In the first part of the thesis, I consider two metaphysical arguments against the classical Law of Excluded Middle, arguments whose main premise is the metaphysical claim that truth is knowable. I argue that the first argument, the Basic Revisionary Argument, validates a parallel argument for a conclusion that is unwelcome to classicists and intuitionists alike: that the dual of the Law of Excluded Middle, the Law of Non-Contradiction, is either unknown, or both known and not known to be true. As for the second argument, the Paradox of Knowability, I offer new reasons for thinking that adopting intuitionistic logic does not go to the heart of the matter. In the second part of the thesis, I motivate an inferentialist framework for assessing competing logics, one on which the meaning of the logical vocabulary is determined by the rules for its correct use. I defend the inferentialist account of understanding from the contention that it is inadequate in principle, and I offer reasons for thinking that the inferentialist approach to logic can help model theorists and proof theorists alike justify their logical choices. I then scrutinize the main meaning-theoretic principles on which the inferentialist approach to logic rests: the requirements of harmony and separability. I show that these principles are motivated by the assumption that inference rules are complete, and that the kind of completeness that is necessary for imposing separability is strictly stronger than the completeness needed for requiring harmony. This allows me to reconcile the inferentialist assumption that inference rules are complete with the inherent incompleteness of higher-order logics, an apparent tension that has sometimes been thought to undermine the entire inferentialist project. I finally turn to the question whether the inferentialist framework is inhospitable in principle to classical logical principles. I compare three different regimentations of classical logic: two old ones, the multiple-conclusion and the bilateralist regimentations, and one new. Each of them satisfies the requirements of harmony and separability, but each of them also invokes structural principles that are not accepted by the intuitionist logician. I offer reasons for dismissing multiple-conclusion and bilateralist formalizations of logic, and I argue that we can nevertheless be in harmony with classical logic, if we are prepared to adopt classical rules for disjunction, and if we are willing to treat absurdity as a logical punctuation sign.
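
    For reference (again an illustration added here, not part of the abstract), the two laws discussed in the first part can be stated schematically; the second is the dual of the first under the usual interchange of conjunction and disjunction:

        \[
          \mathrm{LEM}:\ A \vee \neg A
          \qquad\qquad
          \mathrm{LNC}:\ \neg(A \wedge \neg A)
        \]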

    LDS - Labelled Deductive Systems: Volume 1 - Foundations

    Traditional logics manipulate formulas. The message of this book is to manipulate pairs: formulas and labels. The labels annotate the formulas. This sounds very simple, but it turned out to be a big step which makes a serious difference, like the difference between using one hand only and allowing the coordinated use of two hands. Of course the idea has to be made precise, and its advantages and limitations clearly demonstrated. 'Precise' means a good mathematical definition, and 'advantages demonstrated' means case studies and applications in pure logic and in AI. To achieve that we need to address the following:
    1. Define the notion of LDS, its proof theory and semantics, and relate it to traditional logics.
    2. Explain what form the traditional concepts of cut elimination, deduction theorem, negation, inconsistency, update, etc. take in LDS.
    3. Formulate major known logics in LDS: for example, modal and temporal logics, substructural logics, default and nonmonotonic logics, etc.
    4. Show new results and solve long-standing problems using LDS.
    5. Demonstrate practical applications.
    This is what I am trying to do in this book. Part I of the book is an intuitive presentation of LDS in the context of traditional current views of monotonic and nonmonotonic logics. It is less oriented towards the pure logician and more towards the practical consumer of logic. It has two tasks, addressed in two chapters:
    Chapter 1: Formally motivates LDS by starting from the traditional notion of 'what is a logical system' and slowly adding features to it until it becomes essentially an LDS.
    Chapter 2: Intuitively motivates LDS by showing many examples where labels are used, as well as some case studies of familiar logics (e.g. modal logic) formulated as an LDS.
    The second part of the book presents the formal theory of LDS for the formal logician. I have tried to avoid the style of definition-lemma-theorem and put in some explanations. What is basically needed here is the formulation of the mathematical machinery capable of doing the following:
    - Define LDS algebra, proof theory and semantics.
    - Show how an arbitrary (or fairly general) logic, presented traditionally, say as a Hilbert system or as a Gentzen system, can be turned into an LDS formulation.
    - Show how to obtain a traditional formulation (e.g. Hilbert) for an arbitrary LDS-presented logic.
    - Define and study major logical concepts intrinsic to LDS formalisms.
    - Give a detailed study of the LDS formulation of some major known logics (e.g. modal logics, resource logics) and demonstrate its advantages.
    - Translate LDS into classical logic (reduce the 'new' to the 'old'), and explain LDS in the context of classical logic (two-sorted logic, metalevel aspects, etc.).
    Chapter 3: Gives fairly general definitions of some basic concepts of LDS theory, mainly to cater for the needs of the practical consumer of logic who may wish to apply it, with a detailed study of the metabox system. The presentation of Chapter 3 is a bit tricky: it may be too formal for the intuitive reader, but not sufficiently clear and elegant for the mathematical logician. I would be very grateful for comments from the readers for the next draft.
    Chapter 4: Presents the basic notions of algebraic LDS. The reader may wonder why we introduce algebraic LDS in Chapter 3 and then again in Chapter 4. Our aim in Chapter 3 is to give a general definition and formal machinery for the applied consumer of logic; Chapter 4, on the other hand, studies LDS as formal logics. It turns out that to formulate an arbitrary logic as an LDS one needs some specific labelling algebras, and these need to be studied in detail (Chapter 4). For general applications it is more convenient to have general labelling algebras and possibly mathematically redundant formulations (Chapter 3). In a sense Chapter 4 continues the topic of the second section of Chapter 3.
    Chapter 5: Presents the full theory of LDS, where labels can be databases from possibly another LDS. It also presents fibred semantics for LDS.
    Chapter 6: Presents a theory of quantifiers for LDS. The material for this chapter is still under research.
    Chapter 7: Studies structured consequence relations. These are logical systems where the structure is not described through labels but through some geometry such as lists, multisets, trees, etc. Thus the label of a wff A is implicit, given by the place of A in the structure.
    Chapter 8: Deals with metalevel features of LDS and its translation into two-sorted classical logic.
    Parts 3 and 4 of the book deal in detail with some specific families of logics. Chapters 9-11 essentially deal with substructural logics and their variants.
    Chapter 9: Studies resource and substructural logics in general.
    Chapter 10: Develops detailed proof theory for some systems, as well as studying particular features such as negation.
    Chapter 11: Deals with many-valued logics.
    Chapter 12: Studies the Curry-Howard formulas-as-types view and how it compares with labelling.
    Chapter 13: Deals with modal and temporal logics.
    Part 5 of the book deals with LDS metatheory.
    Chapter 14: Deals with labelled tableaux.
    Chapter 15: Deals with combining logics.
    Chapter 16: Deals with abduction.
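
    To give a flavour of the 'formulas plus labels' idea described above, the following is a minimal Python sketch, ours rather than the book's, of labelled formulas and a labelled implication-elimination step in which the labels of the premises are combined into a compound label; all names are illustrative:

        # Minimal sketch of a labelled deductive system (LDS) step: declarative
        # units are pairs (label, formula), and inference rules act on both parts.
        # All names here are illustrative, not taken from the book.
        from dataclasses import dataclass

        @dataclass(frozen=True)
        class Implies:
            antecedent: object
            consequent: object

        @dataclass(frozen=True)
        class Labelled:
            label: str        # e.g. a possible world, a resource, or a proof term
            formula: object   # an atom (str) or an Implies

        def implies_elim(major: Labelled, minor: Labelled) -> Labelled:
            # From t : A -> B and s : A derive (t*s) : B. How labels combine is
            # fixed by the labelling algebra; different algebras give different logics.
            assert isinstance(major.formula, Implies)
            assert major.formula.antecedent == minor.formula
            return Labelled(label=f"({major.label}*{minor.label})",
                            formula=major.formula.consequent)

        # Example: from t : A -> B and s : A we obtain (t*s) : B.
        print(implies_elim(Labelled("t", Implies("A", "B")), Labelled("s", "A")))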

    Modeling epistemic propositions


    Shortest Route at Dynamic Location with Node Combination-Dijkstra Algorithm

    Abstract— Online transportation has become a basic need of the general public, supporting everyday activities such as travelling to work, to school, or to tourist sights. Transportation services compete to provide the best service so that consumers feel comfortable using them, and one important aspect of this is finding the shortest route when picking up a customer or delivering them to their destination. The Node Combination method can minimize memory usage and is more efficient than A* and Ant Colony approaches for shortest-route search in the style of Dijkstra's algorithm, but it cannot store the history of nodes already passed. As a result, the node combination algorithm is well suited to finding the shortest distance, but not the shortest route itself. This paper modifies the node combination algorithm to solve the problem of finding the shortest route to a dynamic location obtained from the transport fleet, by displaying the nodes that yield the shortest distance; the approach is implemented in a geographic information system in the form of a map to facilitate use of the system. Keywords— Shortest Path, Dijkstra Algorithm, Node Combination, Dynamic Location
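
    For orientation, the sketch below shows a plain Dijkstra shortest-distance computation in Python; it is a baseline illustration under our own assumptions (the road network and all names are made up) and does not reproduce the paper's Node Combination modification:

        import heapq

        def dijkstra(graph, source):
            # Plain Dijkstra: shortest distances from `source` to every reachable node.
            # `graph` maps each node to a list of (neighbour, weight) pairs.
            # The Node Combination variant described in the abstract instead saves
            # memory by repeatedly merging the nearest node into the source; this
            # baseline keeps an explicit distance table and a priority queue.
            dist = {source: 0}
            heap = [(0, source)]
            while heap:
                d, u = heapq.heappop(heap)
                if d > dist.get(u, float("inf")):
                    continue  # stale queue entry
                for v, w in graph.get(u, []):
                    if d + w < dist.get(v, float("inf")):
                        dist[v] = d + w
                        heapq.heappush(heap, (d + w, v))
            return dist

        # Hypothetical road network: nodes are pickup points, weights are distances in km.
        roads = {
            "depot": [("A", 4), ("B", 1)],
            "B": [("A", 2), ("C", 5)],
            "A": [("C", 1)],
            "C": [],
        }
        print(dijkstra(roads, "depot"))  # shortest distances from the depot to every node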