Modalities, Cohesion, and Information Flow
It is informally understood that the purpose of modal type constructors in
programming calculi is to control the flow of information between types. In
order to lend rigorous support to this idea, we study the category of
classified sets, a variant of a denotational semantics for information flow
proposed by Abadi et al. We use classified sets to prove multiple
noninterference theorems for modalities of a monadic and comonadic flavour. The
common machinery behind our theorems stems from the fact that classified
sets are a (weak) model of Lawvere's theory of axiomatic cohesion. In the
process, we show how cohesion can be used for reasoning about multi-modal
settings. This leads to the conclusion that cohesion is a particularly useful
setting for the study of both information flow, but also modalities in type
theory and programming languages at large
Structural damage detection based on cloud model and Dempster-Shafer evidence theory
Cloud model and D-S theory have been widely used in uncertainty reasoning. Meanwhile, modal strain energy and the Inner Product Vector are also utilized as damage-sensitive features to detect structural damage. In this paper, a new structural damage identification approach is proposed based on Dempster-Shafer theory and the cloud model. Cloud models were created to perform uncertainty reasoning about damaged structures using modal strain energy and the Inner Product Vector of acceleration. The results of the two methods were then combined using Dempster-Shafer theory. Because the classical D-S theory exhibits counter-intuitive behaviour when highly conflicting evidence exists, a distance function was introduced to correct the conflict factor K before combining the evidence. Moreover, a simple-beam model was created to verify the feasibility and accuracy of the approach for both single-damage and multiple-damage cases. The effects of noise on damage detection were investigated simultaneously. The results show that the method has strong anti-noise ability and high accuracy.
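The classical Dempster combination step and the conflict factor K mentioned above can be sketched as follows. This is an illustrative sketch only: the two-hypothesis frame (damaged vs. undamaged), the mass values, and the function name are our own assumptions, and the paper's distance-based correction of K is not shown.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions (dicts: frozenset -> mass) by
    Dempster's rule; returns (combined masses, conflict factor K)."""
    # K is the total mass assigned to pairs of focal elements whose
    # intersection is empty, i.e. the conflicting evidence.
    K = sum(m1[a] * m2[b] for a, b in product(m1, m2) if not (a & b))
    if K >= 1.0:
        raise ValueError("total conflict: evidence cannot be combined")
    combined = {}
    for a, b in product(m1, m2):
        c = a & b
        if c:  # redistribute non-conflicting mass, renormalised by 1 - K
            combined[c] = combined.get(c, 0.0) + m1[a] * m2[b] / (1.0 - K)
    return combined, K

# Hypothetical masses from the two features: D = damaged, U = undamaged
m_strain = {frozenset('D'): 0.7, frozenset('U'): 0.2, frozenset('DU'): 0.1}
m_ipv    = {frozenset('D'): 0.6, frozenset('U'): 0.3, frozenset('DU'): 0.1}
m, K = dempster_combine(m_strain, m_ipv)  # K = 0.33
```

When K approaches 1 the renormalisation by 1 - K produces the counter-intuitive results the abstract refers to, which is what motivates correcting K before combination.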
Ticking clocks as dependent right adjoints: Denotational semantics for clocked type theory
Clocked Type Theory (CloTT) is a type theory for guarded recursion useful for
programming with coinductive types, allowing productivity to be encoded in
types, and for reasoning about advanced programming language features using an
abstract form of step-indexing. CloTT has previously been shown to enjoy a
number of syntactic properties including strong normalisation, canonicity and
decidability of the equational theory. In this paper we present a denotational
semantics for CloTT useful, e.g., for studying future extensions of CloTT with
constructions such as path types.
The main challenge for constructing this model is to model the notion of
ticks on a clock used in CloTT for coinductive reasoning about coinductive
types. We build on a category previously used to model guarded recursion with
multiple clocks. In this category there is an object of clocks but no object of
ticks, and so tick-assumptions in a context cannot be modelled using standard
tools. Instead we model ticks using dependent right adjoint functors, a
generalisation of the category theoretic notion of adjunction to the setting of
categories with families. Dependent right adjoints are known to model
Fitch-style modal types, but in the case of CloTT, the modal operators
constitute a family indexed internally in the type theory by clocks. We model
this family using a dependent right adjoint on the slice category over the
object of clocks. Finally we show how to model the tick constant of CloTT using
a semantic substitution.
This work improves on a previous model by the first two named authors which
not only had a flaw but was also considerably more complicated.
Multimodal Analogical Reasoning over Knowledge Graphs
Analogical reasoning is fundamental to human cognition and holds an important
place in various fields. However, previous studies mainly focus on single-modal
analogical reasoning and neglect to exploit structural knowledge.
Notably, research in cognitive psychology has demonstrated that information
from multimodal sources always brings more powerful cognitive transfer than
single-modality sources. To this end, we introduce the new task of multimodal
analogical reasoning over knowledge graphs, which requires multimodal reasoning
ability with the help of background knowledge. Specifically, we construct a
Multimodal Analogical Reasoning dataSet (MARS) and a multimodal knowledge graph
MarKG. We evaluate with multimodal knowledge graph embedding and pre-trained
Transformer baselines, illustrating the potential challenges of the proposed
task. We further propose a novel model-agnostic Multimodal analogical reasoning
framework with Transformer (MarT) motivated by the structure mapping theory,
which can obtain better performance. Code and datasets are available at
https://github.com/zjunlp/MKG_Analogy. Comment: Accepted by ICLR 202
Epistemic Modals in Hypothetical Reasoning
Data involving epistemic modals suggest that some classically valid argument forms, such as reductio, are invalid in natural language reasoning as they lead to modal collapses. We adduce further data showing that the classical argument forms governing the existential quantifier are similarly defective, as they lead to a de re–de dicto collapse. We observe a similar problem for disjunction. But if the classical argument forms for negation, disjunction and existential quantification are invalid, what are the correct forms that govern the use of these items? Our diagnosis is that epistemic modals interfere with hypothetical reasoning. We present a modal first-order logic and model theory that characterizes hypothetical reasoning with epistemic modals in a principled manner. One upshot is a sound and complete natural deduction system for reasoning with epistemic modals in first-order logic.
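A standard illustration of the kind of collapse the abstract alludes to, in the style of the literature on epistemic contradictions, can be sketched as follows. This example is our own assumption, not taken from the paper: suppose one reasons hypothetically from the premise "it might be that not-$p$".

```latex
\[
\begin{array}{ll}
1.\ \Diamond\neg p & \text{premise} \\
2.\ \quad p & \text{assumption, for reductio} \\
3.\ \quad p \wedge \Diamond\neg p & \text{from 1, 2: an epistemic contradiction} \\
4.\ \neg p & \text{reductio, discharging 2}
\end{array}
\]
```

If reductio were valid under an epistemic assumption of this kind, every "might not $p$" would entail "not $p$", collapsing the modal; this is the sense in which epistemic modals interfere with hypothetical reasoning.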
Reasoning about Minimal Belief and Negation as Failure
We investigate the problem of reasoning in the propositional fragment of
MBNF, the logic of minimal belief and negation as failure introduced by
Lifschitz, which can be considered as a unifying framework for several
nonmonotonic formalisms, including default logic, autoepistemic logic,
circumscription, epistemic queries, and logic programming. We characterize the
complexity and provide algorithms for reasoning in propositional MBNF. In
particular, we show that entailment in propositional MBNF lies at the third
level of the polynomial hierarchy, hence it is harder than reasoning in all the
above mentioned propositional formalisms for nonmonotonic reasoning. We also
prove the exact correspondence between negation as failure in MBNF and negative
introspection in Moore's autoepistemic logic.
Modal logics are coalgebraic
Applications of modal logics are abundant in computer science, and a large number of structurally different modal logics have been successfully employed in a diverse spectrum of application contexts. Coalgebraic semantics, on the other hand, provides a uniform and encompassing view on the large variety of specific logics used in particular domains. The coalgebraic approach is generic and compositional: tools and techniques simultaneously apply to a large class of application areas and can moreover be combined in a modular way. In particular, this facilitates a pick-and-choose approach to domain specific formalisms, applicable across the entire scope of application areas, leading to generic software tools that are easier to design, to implement, and to maintain. This paper substantiates the authors' firm belief that the systematic exploitation of the coalgebraic nature of modal logic will not only have impact on the field of modal logic itself but also lead to significant progress in a number of areas within computer science, such as knowledge representation and concurrency/mobility.
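For the most familiar instance, Kripke semantics, the coalgebraic reading can be sketched in a few lines: a frame is a coalgebra for the powerset functor, and the modalities arise as predicate liftings. The state names, transition structure, and valuation below are our own assumptions, chosen purely for illustration.

```python
# A Kripke frame presented as a coalgebra c : X -> P(X) for the
# (finite) powerset functor: each state maps to its set of successors.
c = {
    's0': {'s1', 's2'},
    's1': {'s1'},
    's2': set(),
}

def box(phi):
    """Predicate lifting for box: x satisfies box(phi) iff
    every successor of x lies in the extension phi."""
    return {x for x in c if c[x] <= phi}

def diamond(phi):
    """Dual lifting for diamond: some successor of x lies in phi."""
    return {x for x in c if c[x] & phi}

p = {'s1'}                    # extension of an atomic proposition p
# box(p)     -> {'s1', 's2'}  (s2 satisfies box(p) vacuously: no successors)
# diamond(p) -> {'s0', 's1'}
```

Swapping the powerset functor for another functor (distributions, neighbourhoods, multisets) changes the logic while the generic machinery stays the same, which is the modularity the abstract emphasises.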
Designing Normative Theories for Ethical and Legal Reasoning: LogiKEy Framework, Methodology, and Tool Support
A framework and methodology---termed LogiKEy---for the design and engineering
of ethical reasoners, normative theories and deontic logics is presented. The
overall motivation is the development of suitable means for the control and
governance of intelligent autonomous systems. LogiKEy's unifying formal
framework is based on semantical embeddings of deontic logics, logic
combinations and ethico-legal domain theories in expressive classical
higher-order logic (HOL). This meta-logical approach enables the provision of
powerful tool support in LogiKEy: off-the-shelf theorem provers and model
finders for HOL are assisting the LogiKEy designer of ethical intelligent
agents to flexibly experiment with underlying logics and their combinations,
with ethico-legal domain theories, and with concrete examples---all at the same
time. Continuous improvements of these off-the-shelf provers, without further
ado, leverage the reasoning performance in LogiKEy. Case studies, in which the
LogiKEy framework and methodology have been applied and tested, give evidence
that HOL's undecidability often does not hinder efficient experimentation.