    Spy Game: Verifying a Local Generic Solver in Iris

    We verify the partial correctness of a "local generic solver", that is, an on-demand, incremental, memoizing least fixed point computation algorithm. The verification is carried out in Iris, a modern breed of concurrent separation logic. The specification is simple: the solver computes the optimal least fixed point of a system of monotone equations. Although the solver relies on mutable internal state for memoization and for "spying", a form of dynamic dependency discovery, it is apparently pure: no side effects are mentioned in its specification. As auxiliary contributions, we provide several illustrations of the use of prophecy variables, a novel feature of Iris; we establish a restricted form of the infinitary conjunction rule; and we provide a specification and proof of Longley's modulus function, an archetypical example of spying.
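The abstract's "local generic solver" can be sketched in a few lines. The following Python sketch is an illustrative assumption, not the paper's verified implementation: each equation is a function that reads other variables through a callback, and those reads are recorded ("spying") so that when a variable's value grows, exactly the equations that depended on it are re-evaluated. Evaluation starts from a single queried variable, so the computation is on-demand and local.

```python
# Hypothetical sketch of an on-demand, memoizing least-fixed-point solver
# with dynamic dependency discovery ("spying"). All names are illustrative.

def solve(equations, bottom, join, query_var):
    """Compute the least fixed point of `query_var`, on demand.

    equations: dict mapping each variable to a monotone function that
               takes a `get` callback for reading other variables
    bottom:    least element of the value lattice
    join:      least upper bound of two lattice values
    """
    value = {}    # current approximation (memo table), defaults to bottom
    deps = {}     # var -> set of variables whose equations read it (spied)
    dirty = set()

    def get(caller, v):
        deps.setdefault(v, set()).add(caller)  # record who is reading v
        if v not in value:
            value[v] = bottom
            dirty.add(v)                       # discover v lazily
        return value[v]

    value.setdefault(query_var, bottom)
    dirty.add(query_var)
    while dirty:
        v = dirty.pop()
        new = equations[v](lambda w: get(v, w))
        if new != value[v]:
            value[v] = join(value[v], new)     # values only grow
            dirty |= deps.get(v, set())        # re-run the spies of v
    return value[query_var]
```

As a usage example over the two-point lattice False < True (a reachability-style system), querying `"c"` pulls in only the variables it transitively depends on:

```python
eqs = {
    "a": lambda get: True,                     # a holds
    "b": lambda get: get("a") or get("c"),     # b <- a or c
    "c": lambda get: get("b"),                 # c <- b
}
solve(eqs, False, lambda x, y: x or y, "c")    # -> True
```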

    Recursive program schemes: semantics and proof theory

    Morphisms of ANN and the Computation of Least Fixed Points of Semantic Operators


    On the Integration of Connectionist and Logic-Based Systems (MFCSIT 2004, preliminary version)

    It is a long-standing and important problem to integrate logic-based systems and connectionist systems. In brief, this problem is concerned with how each of these two paradigms interacts with the other and how each complements the other: how one may give a logical interpretation of neural networks, how one may interpret connectionism within a logical framework, and how one may combine the advantages of each within a single integrated system. In this paper, the computation and approximate computation by neural networks of semantic operators TP determined by logic programs P are studied; the converse of this problem, namely, the extraction of logic programs from given neural networks, is also briefly considered. The foundations of the relevant notions employed in this problem are revisited and clarified, and new definitions are presented which avoid embedding spaces of interpretations in the real line. In particular, such definitions are formulated relating to (1) pointwise and uniform approximation of TP, and (2) approximation and computation of (least) fixed points of TP. There are related notions of approximation and convergence of neural networks, and of approximation and convergence of programs; these are discussed briefly, although the focus here is on (1) and (2). Necessary and sufficient conditions for uniform approximation of TP by neural networks are given in terms of continuity. Finally, the class of programs for which these methods can be employed to compute fixed points is greatly extended from the rather small class of acyclic programs to the (computationally adequate) class of all definite programs.
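The operator TP discussed in the abstract is the immediate-consequence operator of a logic program: applied to an interpretation I, it derives every clause head whose body already holds in I, and for a definite program its least fixed point (reached by iterating from the empty interpretation) is the program's least model. A minimal Python sketch for a finite ground program, with illustrative atoms and clause encoding assumed here rather than taken from the paper:

```python
# Hypothetical sketch of the immediate-consequence operator T_P for a
# finite ground definite logic program. A program is a list of clauses
# (head, body), where body is a list of atoms; a fact has an empty body.

def t_p(program, interp):
    """One application of T_P: derive each head whose body holds in interp."""
    return {head for head, body in program if set(body) <= interp}

def least_fixed_point(program):
    """Kleene iteration T_P^0(empty) <= T_P^1(empty) <= ... ;
    for a finite ground definite program this stabilizes at the least model."""
    interp = set()
    while True:
        nxt = t_p(program, interp)
        if nxt == interp:
            return interp
        interp = nxt
```

For example, the program { p. ; q :- p. ; r :- q, s. } has least model {p, q}: the fact p appears after one step, q after two, and r is never derivable because s has no clause.

```python
prog = [("p", []), ("q", ["p"]), ("r", ["q", "s"])]
least_fixed_point(prog)   # -> {"p", "q"}
```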