
    Graph easy sets of mute lambda terms

    Among the unsolvable terms of the lambda calculus, the mute ones are those having the highest degree of undefinedness. In this paper, we define, for each natural number n, an infinite and recursive set M_n of mute terms, and show that it is graph-easy: for any closed term t of the lambda calculus there exists a graph model equating all the terms of M_n to t. Along the way, we provide a brief survey of the notion of undefinedness in the lambda calculus.
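
    A concrete anchor for the notion of muteness (our illustration, not the paper's M_n construction): the classic term Omega = (\x. x x)(\x. x x) is mute, since every reduct is again a redex. A minimal sketch in Haskell with de Bruijn indices:

    ```haskell
    -- A minimal sketch, not the paper's construction: de Bruijn lambda
    -- terms with one-step beta reduction at the root, illustrating the
    -- classic mute term Omega = (\x. x x)(\x. x x).
    data Term = Var Int | Lam Term | App Term Term deriving (Eq, Show)

    shift :: Int -> Int -> Term -> Term
    shift d c (Var k)   = Var (if k >= c then k + d else k)
    shift d c (Lam t)   = Lam (shift d (c + 1) t)
    shift d c (App f a) = App (shift d c f) (shift d c a)

    subst :: Int -> Term -> Term -> Term
    subst j s (Var k)   = if k == j then s else Var k
    subst j s (Lam t)   = Lam (subst (j + 1) (shift 1 0 s) t)
    subst j s (App f a) = App (subst j s f) (subst j s a)

    -- one root beta step, if the term is a redex
    step :: Term -> Maybe Term
    step (App (Lam b) a) = Just (shift (-1) 0 (subst 0 (shift 1 0 a) b))
    step _               = Nothing

    omega :: Term
    omega = App delta delta
      where delta = Lam (App (Var 0) (Var 0))
    ```

    Here `step omega == Just omega`: reduction never escapes the redex position, which is the defining property of a mute (root-active) term.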

    On the safety of Nöcker's strictness analysis

    This paper proves the correctness of Nöcker's method of strictness analysis, implemented for Clean, which is an effective way to perform strictness analysis in lazy functional languages based on their operational semantics. We improve upon the work of Clark, Hankin and Hunt, which addresses the correctness of the abstract reduction rules. Our method also addresses the cycle detection rules, which are the main strength of Nöcker's strictness analysis. We reformulate Nöcker's strictness analysis algorithm in a higher-order lambda calculus with case, constructors, letrec, and a nondeterministic choice operator used as a union operator. Furthermore, the calculus is expressive enough to represent abstract constants like Top or Inf. The operational semantics is a small-step semantics, and equality of expressions is defined by a contextual semantics that observes termination of expressions. The correctness of several reductions is proved using a context lemma and complete sets of forking and commuting diagrams. The proof is based mainly on an exact analysis of the lengths of normal order reductions. However, there remains a small gap: currently, the proof of correctness of strictness analysis requires the conjecture that our behavioral preorder is contained in the contextual preorder. The proof is valid without referring to the conjecture if no abstract constants are used in the analysis.
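
    For orientation, the syntax of such a calculus (lambda, case, constructors, letrec, and a nondeterministic choice operator used as union) can be rendered as a plain Haskell data type. This sketch is ours; the constructor names are illustrative, not the paper's:

    ```haskell
    -- Illustrative syntax only; names are ours, not the paper's.
    data Exp = Var String
             | Lam String Exp
             | App Exp Exp
             | Con String [Exp]                    -- constructor application
             | Case Exp [(String, [String], Exp)]  -- alternatives: (constructor, binders, body)
             | Letrec [(String, Exp)] Exp
             | Choice Exp Exp                      -- nondeterministic choice, read as union
    ```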

    A complete proof of the safety of Nöcker's strictness analysis

    This paper proves the correctness of Nöcker's method of strictness analysis, implemented in the Clean compiler, which is an effective way to perform strictness analysis in lazy functional languages based on their operational semantics. We improve upon the work of Clark, Hankin and Hunt on the correctness of the abstract reduction rules. Our method fully considers the cycle detection rules, which are the main strength of Nöcker's strictness analysis. Our algorithm SAL is a reformulation of Nöcker's strictness analysis algorithm in a higher-order call-by-need lambda calculus with case, constructors, letrec, and seq, extended by set constants like Top or Inf denoting sets of expressions. It is also possible to define new set constants by recursive equations with a greatest fixpoint semantics. The operational semantics is a small-step semantics. Equality of expressions is defined by a contextual semantics that observes termination of expressions. Basically, SAL is a non-termination checker. The proof of its correctness, and hence of Nöcker's strictness analysis, is based mainly on an exact analysis of the lengths of normal order reduction sequences, the main measure being the number of 'essential' reductions in such a sequence. Our tools and results provide new insights into call-by-need lambda calculi, the role of sharing in functional programming languages, and strictness analysis in general. The correctness result provides a foundation for Nöcker's strictness analysis in Clean, and also for its use in Haskell.
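
    To make "strictness" itself concrete: a function is strict in an argument iff feeding divergence into that argument forces the result to diverge. The following is the classical two-point abstract interpretation over a toy expression language, deliberately much simpler than Nöcker's abstract reduction with cycle detection, and ours rather than the paper's:

    ```haskell
    -- Two-point strictness domain: Bot = certainly divergent, Top = unknown.
    data Abs = Bot | Top deriving (Eq, Show)

    glb, lub :: Abs -> Abs -> Abs
    glb Bot _ = Bot            -- meet: diverges if either does
    glb _   b = b
    lub Top _ = Top            -- join: may converge if either may
    lub _   b = b

    -- toy expression language, variables as argument indices
    data Expr = V Int | Lit Int | Add Expr Expr | IfZ Expr Expr Expr

    absEval :: [Abs] -> Expr -> Abs
    absEval env (V i)       = env !! i
    absEval _   (Lit _)     = Top
    absEval env (Add a b)   = glb (absEval env a) (absEval env b)  -- + is strict in both
    absEval env (IfZ c t e) = glb (absEval env c)
                                  (lub (absEval env t) (absEval env e))

    -- strict in argument i iff passing Bot there forces Bot out
    strictIn :: Int -> Int -> Expr -> Bool
    strictIn arity i body =
      absEval [ if j == i then Bot else Top | j <- [0 .. arity - 1] ] body == Bot
    ```

    For example, f x y = ifz x then y else x + y comes out strict in both arguments: the condition needs x, and both branches need y.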

    From Proof Nets to the Free *-Autonomous Category

    In the first part of this paper we present a theory of proof nets for full multiplicative linear logic, including the two units. It naturally extends the well-known theory of unit-free multiplicative proof nets. A linking is no longer a set of axiom links but a tree in which the axiom links are subtrees. These trees are identified according to an equivalence relation based on a simple form of graph rewriting. We show the standard results of sequentialization and strong normalization of cut elimination. In the second part of the paper we show that the identifications enforced on proofs are such that the class of two-conclusion proof nets defines the free *-autonomous category. (Comment: LaTeX, 44 pages, final version for LMCS; v2: updated bibliography.)
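
    The shift from linkings-as-sets to linkings-as-trees can be pictured with two toy Haskell types. These are shapes only, our guess at a rendering; the paper's actual definitions carry more structure (attachment of units, the rewriting-based equivalence):

    ```haskell
    -- MLL formulas with the two units, as in the abstract.
    data Formula = Atom String | NegAtom String
                 | Tensor Formula Formula | Par Formula Formula
                 | One | Bot
                 deriving (Eq, Show)

    -- A linking as a tree whose leaves are axiom links (and unit
    -- occurrences); in the unit-free theory this would collapse to a
    -- flat set of AxLink leaves.
    data Linking = AxLink String        -- axiom link on a / a-perp
                 | UnitLeaf Formula     -- occurrence of One or Bot
                 | Branch [Linking]
                 deriving (Eq, Show)
    ```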

    Polynomial functors and polynomial monads

    We study polynomial functors over locally cartesian closed categories. After setting up the basic theory, we show how polynomial functors assemble into a double category, in fact a framed bicategory. We show that the free monad on a polynomial endofunctor is polynomial. The relationship with operads and other related notions is explored. (Comment: 41 pages, LaTeX, 2 PS figures generated at runtime by the texdraw package; does not compile with pdflatex. v2: removed assumptions on sums, added a short discussion of generalisation, and more details on tensorial strength.)
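
    In Haskell terms (our gloss, flattening the dependent indexing of the categorical definition P X = Σ_{s∈S} X^{ar(s)}), a polynomial functor is a shape together with a fibre of positions, and the free monad on it is the wellfounded-tree monad, which is visibly polynomial again:

    ```haskell
    {-# LANGUAGE DeriveFunctor #-}

    -- Crude encoding of a polynomial functor: a shape plus a list of
    -- payloads standing in for the positions fibre (list length plays
    -- the role of the arity of the shape).
    data Poly s x = Poly s [x] deriving (Functor, Show)

    -- The free monad on a polynomial endofunctor is again polynomial
    -- (one of the abstract's results); concretely, its elements are
    -- wellfounded trees with s-labelled nodes and a-carrying leaves.
    data Free s a = Pure a | Roll s [Free s a] deriving (Functor, Show)

    instance Applicative (Free s) where
      pure = Pure
      Pure f    <*> m = fmap f m
      Roll s ks <*> m = Roll s (map (<*> m) ks)

    instance Monad (Free s) where
      Pure a    >>= k = k a
      Roll s ks >>= k = Roll s (map (>>= k) ks)  -- graft trees at the leaves
    ```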

    A Functorial Bridge between the Infinitary Affine Lambda-Calculus and Linear Logic

    It is a well-known intuition that the exponential modality of linear logic may be seen as a form of limit. Recently, Melliès, Tabareau and Tasson gave a categorical account of this intuition, whereas the first author provided a topological account, based on an infinitary syntax. We relate these two different views by giving a categorical version of the topological construction, yielding two benefits: on the one hand, we obtain canonical models of the infinitary affine lambda calculus introduced by the first author; on the other hand, we find an alternative to the formula of Melliès et al. for computing free commutative comonoids in models of linear logic.
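
    For reference, the formula of Melliès, Tabareau and Tasson alluded to here computes the free commutative comonoid (the exponential) as a sequential limit of symmetrized tensor powers; schematically, in our paraphrase rather than the paper's notation:

    ```latex
    % Our paraphrase: A^{\leq n} is the equalizer of the symmetric group
    % action on (A & 1)^{\otimes n}, and !A is their sequential limit.
    {!A} \;\cong\; \varprojlim_{n}\, A^{\leq n},
    \qquad
    A^{\leq n} \;=\; \mathrm{Eq}\bigl(\mathfrak{S}_n \curvearrowright (A \mathbin{\&} 1)^{\otimes n}\bigr)
    ```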

    Finding the Needle in a Haystack: On the Automatic Identification of Accessibility User Reviews

    In recent years, mobile accessibility has become an important trend, with the goal of allowing all users to use any app without major limitations. User reviews include insights that are useful for app evolution. However, as the number of received reviews grows, manually analyzing them is tedious and time-consuming, especially when searching for accessibility reviews. The goal of this paper is to support the automated identification of accessibility concerns in user reviews, to help technology professionals prioritize their handling and thus create more inclusive apps. In particular, we design a model that takes user reviews as input and learns keyword-based features in order to make a binary decision on whether a given review is about accessibility or not. The model is evaluated on a total of 5,326 mobile app reviews. The findings show that (1) our model can accurately identify accessibility reviews, outperforming two baselines, namely a keyword-based detector and a random classifier; and (2) our model achieves an accuracy of 85% with a relatively small training dataset, and the accuracy improves as the size of the training dataset increases.
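
    Of the two baselines, the keyword-based detector is simple enough to sketch. The keyword list below is our guess, not the paper's:

    ```haskell
    import Data.Char (isAlpha, toLower)

    -- Hypothetical keyword list; the paper does not publish its exact terms here.
    accessibilityKeywords :: [String]
    accessibilityKeywords =
      [ "accessibility", "accessible", "voiceover", "talkback"
      , "blind", "contrast", "font", "zoom", "magnifier" ]

    -- lowercase and split on non-letters
    tokens :: String -> [String]
    tokens = words . map (\c -> if isAlpha c then toLower c else ' ')

    -- flag a review as accessibility-related if any keyword occurs
    isAccessibilityReview :: String -> Bool
    isAccessibilityReview review =
      any (`elem` accessibilityKeywords) (tokens review)
    ```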

    Constructive formal methods and protocol standardization

    This research is part of the NWO project "Improving the Quality of Protocol Standards". In this project we have cooperated with industrial standardization committees that are developing protocol standards. Thus we have contributed to these international standards, and we have generated relevant research questions in the field of formal methods. The first part of this thesis is related to the ISO/IEEE 1073.2 standard, which addresses medical device communication. The protocols in this standard were developed from a couple of MSC scenarios that describe typical intended behavior. When a protocol is synthesized from such scenarios, interference between the scenarios may be introduced, leading to undesired behaviors. This is called the realizability problem. To address the realizability problem, we have introduced a formal framework that is based on partial orders. In this way the problem that causes the interference can be clearly pointed out. We have provided a complete characterization of realizability criteria that can be used to determine whether interference problems are to be expected. Moreover, we have provided a new constructive approach to solve the undesired interference in practical situations. These techniques have been used to improve the protocol standard under consideration.

    The second part of this thesis is related to the IEEE 1394.1-2004 standard, which addresses High Performance Serial Bus Bridges. This is an extension of the IEEE 1394-1995 standard, also known as FireWire. The development of the distributed spanning tree algorithm turned out to be a serious problem. To address this problem, we first developed and proposed a much simpler algorithm. We have also studied the algorithm proposed by the developers of the standard, namely by formally reconstructing a version of it, starting from the specification. Such a constructive approach to verification and analysis uses mathematical techniques, or formal methods, to reveal the essential mechanisms that play a role in the algorithm. We have shown the need for different levels of abstraction, and we have illustrated that the algorithm is in fact distributed at two levels. These techniques are usually applied manually, but we have also developed an approach to automate parts of this process using state-of-the-art theorem provers.
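
    The partial-order view of an MSC scenario can be sketched as follows (our simplified rendering, not the thesis's formal framework): happens-before is generated by per-process order plus send-before-receive edges, and two events left unordered by that relation may be interleaved in unintended ways when a protocol is synthesized:

    ```haskell
    import Data.List (nub)

    -- an event is (process name, local position on that process's lifeline)
    type Event = (String, Int)

    data MSC = MSC { events :: [Event]
                   , msgs   :: [(Event, Event)] }  -- (send, receive) pairs

    -- immediate happens-before edges: message order plus process order
    edges :: MSC -> [(Event, Event)]
    edges (MSC evs ms) =
      ms ++ [ (a, b) | a@(p, i) <- evs, b@(q, j) <- evs, p == q, j == i + 1 ]

    -- transitive closure by fixpoint iteration (fine for small scenarios)
    closure :: [(Event, Event)] -> [(Event, Event)]
    closure es
      | es' == es = es
      | otherwise = closure es'
      where es' = nub (es ++ [ (a, c) | (a, b) <- es, (b', c) <- es, b == b' ])

    -- unordered events are candidates for unintended interleavings
    unordered :: MSC -> Event -> Event -> Bool
    unordered m a b = (a, b) `notElem` cl && (b, a) `notElem` cl
      where cl = closure (edges m)
    ```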

    Efficient Collection and Processing of Cyber Threat Intelligence from Partner Feeds

    Sharing threat intelligence between organizations and companies in the cyber security industry is a crucial part of proactive defense against security threats. Even though some standardization efforts exist, most publishers of cyber security feeds use their own approach and provide data in varying formats, schemata, compression algorithms, through differing APIs, and so on. This makes every feed unique and complicates automated collection and processing. Furthermore, the published data may contain many irrelevant records, such as duplicates or data about very exotic files or websites, which are not useful. In this work, we present Feed Automation, a cloud-based system for fully automatic collection and processing of cyber threat intelligence from a variety of online feeds. The system provides two means of reducing noise in the data: a smart deduplication service based on a sliding-window technique, which removes only those duplicates with no important changes in the metadata; and efficient rules, easily configurable by malware analysts, to remove records that are not useful. Additionally, we propose a filtering solution based on machine learning, which predicts how useful a record is for the backend systems based on historic data. We demonstrate how this system unifies feed collection, processing, and data noise reduction in one automated system, speeding up development, simplifying maintenance, and reducing the load on the backend systems.
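
    A sliding-window deduplication of the kind described might look as follows. The field names, the window policy, and the assumption that records arrive in time order are ours, not details from the paper:

    ```haskell
    import qualified Data.Map.Strict as Map

    -- Hypothetical record shape: a dedup key (e.g. file hash or URL),
    -- a timestamp, and the metadata whose changes must not be dropped.
    data Record = Record
      { key       :: String
      , seenAt    :: Int       -- seconds since epoch
      , important :: String
      } deriving (Eq, Show)

    windowSecs :: Int
    windowSecs = 3600          -- illustrative window size

    -- Assumes the input stream is ordered by seenAt. A record is dropped
    -- only if one with the same key and identical important metadata was
    -- emitted within the window; changed metadata always passes through.
    dedup :: [Record] -> [Record]
    dedup = go Map.empty
      where
        go _    []       = []
        go seen (r : rs) =
          case Map.lookup (key r) seen of
            Just prev
              | seenAt r - seenAt prev <= windowSecs
              , important prev == important r
              -> go seen rs                          -- pure duplicate: drop
            _ -> r : go (Map.insert (key r) r seen) rs
    ```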