
    Logical Specification and Analysis of Fault Tolerant Systems through Partial Model Checking

    This paper presents a framework for a logical characterisation of fault tolerance and its formal analysis based on partial model checking techniques. The framework requires a fault tolerant system to be modelled in a formal calculus, here the CCS process algebra. To this aim we propose a uniform modelling scheme in which to specify a formal model of the system, its failing behaviour and, possibly, its fault-recovering procedures. Once a formal model is provided in our scheme, fault tolerance with respect to a given property can be formalised as an equational μ-calculus formula. This formula expresses, in a logical formalism, all the fault scenarios satisfying that fault tolerance property. Such a characterisation treats the analysis of fault tolerance as a form of analysis of open systems and, thanks to partial model checking strategies, it can be made independent of any particular fault assumption. Moreover, this logical characterisation allows the fault-tolerance verification problem to be expressed as a general μ-calculus validation problem, for which many theorem proving techniques and tools are available. We present several analysis methods showing the flexibility of our approach.
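    As an illustration only (this specific formula is not taken from the paper), a recoverability property of the kind the abstract describes can be written as a small equational μ-calculus system: a greatest-fixpoint equation stating an invariant, and a least-fixpoint equation stating that after a fault the system inevitably reaches a state where a recover action is enabled. The action names fault and recover are hypothetical.

```latex
\begin{aligned}
X &=_{\nu} [\mathit{fault}]\,Y \;\wedge\; [-]\,X
  && \text{(invariantly, every \textit{fault} leads to a state satisfying } Y\text{)}\\
Y &=_{\mu} \langle \mathit{recover}\rangle\,\mathit{tt} \;\vee\;
      \bigl([-]\,Y \wedge \langle - \rangle\,\mathit{tt}\bigr)
  && \text{(a \textit{recover} is enabled now, or all successors satisfy } Y \text{ and the system is not deadlocked)}
\end{aligned}
```

    In the partial model checking setting sketched in the abstract, such a formula would be evaluated against the system with the faulty environment left open, so the analysis does not commit to a particular fault assumption.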

    Formal Intergovernmental Alliances in the European Union: Disappearing or Still Alive?

    The leading opinion-making newspaper The Economist suggested during the Constitutional Treaty negotiations that ‘there are no more fixed and reliable alliances in the EU. Countries team up with each other, depending on issue and circumstances’ (The Economist, February 6, 2003: 3). This was a daring suggestion in view of the history of long-term strategic relationships within Europe, especially the Franco-German and the Benelux, which have in the past played a leadership role in the establishment and progress of European integration. Former Dutch Minister of Foreign Affairs Jaap de Hoop Scheffer also commented that the Constitutional Treaty negotiations had shown a ‘renaissance of bilateralism’ in the new Europe: ‘With each new issue we are likely to see changing ad hoc coalitions of member states’ (De Hoop Scheffer 2003: 1). Similarly, Lord Kerr, in his address at the Center for European Studies, Harvard University (11 July 2003), suggested that ‘[a]lliances [were] increasingly a matter of convenience; we can expect more of a wide-spread promiscuity among member states’. Do such assertions stand up to scholarly investigation? Is there any empirical evidence to suggest that the existing formal alliances in Europe are disappearing? Analysing the case of the Visegrád Group, this paper answers negatively. It argues that the strength of cooperation within formal alliances should not be evaluated on the basis of coalitional cooperation in the end games of EU negotiations, which tend to attract the most popular attention. Rather, the question of the viability of formal alliances needs to shift from the end game of EU negotiations to the day-to-day interactions at the lower end of the government hierarchy, i.e. the government representatives at the technical and lower political level; this is where the vast majority of the EU policy agenda is set and the majority of policy formulations are agreed upon in the pre-negotiations within the Council working groups. In view of these findings, the paper suggests that the prominent account of ‘two-level games’ by Putnam (1988), which has influenced most of the recent literature on EU negotiations, might need to be revised to take into account the ‘third-level’ negotiations within formal alliances. The argument introduced is that, next to the domestic constituencies and EU-level negotiations depicted by Putnam (1988), governments involved in formal alliances also simultaneously negotiate with their alliance partners.

    Applying Formal Methods to Networking: Theory, Techniques and Applications

    Despite its great importance, modern network infrastructure is remarkable for the lack of rigor in its engineering. The Internet, which began as a research experiment, was never designed to handle the users and applications it hosts today. The lack of formalization of the Internet architecture meant limited abstractions and modularity, especially for the control and management planes, thus requiring a new protocol built from scratch for every new need. This led to an unwieldy, ossified Internet architecture resistant to any attempts at formal verification, and an Internet culture where expediency and pragmatism are favored over formal correctness. Fortunately, recent work in the space of clean-slate Internet design, especially the software-defined networking (SDN) paradigm, offers the Internet community another chance to develop the right kind of architecture and abstractions. This has also led to a great resurgence of interest in applying formal methods to the specification, verification, and synthesis of networking protocols and applications. In this paper, we present a self-contained tutorial of the formidable amount of work that has been done in formal methods, and present a survey of its applications to networking.
    Comment: 30 pages, submitted to IEEE Communications Surveys and Tutorials
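    A minimal sketch, not taken from the survey, of the kind of data-plane check such formal-methods tooling automates: exhaustively verifying a loop-freedom and reachability invariant over a static forwarding table. The topology and node names are hypothetical.

```python
# Minimal sketch (illustrative assumption, not the survey's own tool):
# verify that a toy forwarding table delivers traffic to destination "D"
# from every node, with no forwarding loops and no black holes.

# Hypothetical forwarding table: node -> next hop toward destination "D".
FORWARDING = {"A": "B", "B": "C", "C": "D"}

def check_loop_free_and_reachable(table, dest):
    """Return True iff every node's forwarding path reaches dest without cycles."""
    for start in table:
        seen, node = set(), start
        while node != dest:
            if node in seen or node not in table:
                return False  # forwarding loop or black hole
            seen.add(node)
            node = table[node]
    return True

if __name__ == "__main__":
    assert check_loop_free_and_reachable(FORWARDING, "D")
    print("invariant holds on the toy topology")
```

    Real data-plane verifiers explore symbolic packet headers and much larger state spaces, but the exhaustive-checking flavour is the same.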

    Design and Analysis of Opaque Signatures

    Digital signatures were introduced to guarantee the authenticity and integrity of the underlying messages. A digital signature scheme comprises the key generation, the signature, and the verification algorithms. The key generation algorithm creates the signing and the verifying keys, also called the signer’s private and public keys respectively. The signature algorithm, which is run by the signer, produces a signature on the input message. Finally, the verification algorithm, run by anyone who knows the signer’s public key, checks whether a purported signature on some message is valid or not. The last property, namely the universal verification of digital signatures, is undesirable in situations where the signed data is commercially or personally sensitive. Therefore, mechanisms which share most properties with digital signatures except for universal verification were invented to respond to the aforementioned need; we call such mechanisms “opaque signatures”. In this thesis, we study signatures whose verification cannot be achieved without the cooperation of a specific entity, namely the signer in the case of undeniable signatures, or the confirmer in the case of confirmer signatures; we make three main contributions. We first study the relationship between two security properties important for public key encryption, namely data privacy and key privacy. Our study is motivated by the fact that opaque signatures always involve an encryption layer that ensures their opacity. The properties required for this encryption vary according to whether we want to protect the identity (i.e. the key) of the signer or hide the validity of the signature. Therefore, it would be convenient to use existing work about the encryption scheme in order to derive one notion from the other. Next, we delve into the generic constructions of confirmer signatures from basic cryptographic primitives, e.g. digital signatures, encryption, or commitment schemes. In fact, generic constructions give easy-to-understand and easy-to-prove schemes; however, this convenience is often achieved at the expense of efficiency. In this contribution, which constitutes the core of this thesis, we first analyse the existing constructions; our study concludes that the popular generic constructions of confirmer signatures necessitate strong security assumptions on the building blocks, which impacts negatively the efficiency of the resulting signatures. Next, we show that a small change in these constructions makes these assumptions drop drastically, allowing as a result constructions with instantiations that compete with the dedicated realizations of these signatures. Finally, we revisit two early undeniable signatures which were proposed with conjectural security. We disprove the claimed security of the first scheme, and we provide a fix to it in order to achieve strong security properties. Next, we upgrade the second scheme so that it supports a desirable feature, and we provide a formal security treatment of the new scheme: we prove that it is secure assuming new reasonable assumptions on the underlying constituents.
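    A short sketch of the ordinary, universally verifiable signature interface the thesis takes as its starting point: key generation, signing, and verification. It uses Ed25519 from the third-party `cryptography` package purely as a concrete stand-in and is not one of the undeniable or confirmer schemes analysed in the thesis.

```python
# Sketch of a standard signature scheme's three algorithms (assumption:
# Ed25519 via the `cryptography` package, chosen only for illustration).
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Key generation: produces the signer's private (signing) and public (verifying) keys.
signing_key = Ed25519PrivateKey.generate()
verifying_key = signing_key.public_key()

# Signature algorithm, run by the signer, on the input message.
message = b"commercially sensitive agreement"
signature = signing_key.sign(message)

# Verification algorithm: anyone holding the public key can run it.
# This is exactly the "universal verification" that opaque signatures remove,
# by requiring the signer's or a confirmer's cooperation instead.
try:
    verifying_key.verify(signature, message)
    print("valid signature")
except InvalidSignature:
    print("invalid signature")
```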

    Towards Semantically Enriched Embeddings for Knowledge Graph Completion

    Embedding-based Knowledge Graph (KG) Completion has gained much attention over the past few years. Most of the current algorithms consider a KG as a multidirectional labeled graph and lack the ability to capture the semantics underlying the schematic information. In a separate development, a vast amount of information has been captured within Large Language Models (LLMs), which have revolutionized the field of Artificial Intelligence. KGs could benefit from these LLMs and vice versa. This vision paper discusses the existing algorithms for KG completion based on the variations for generating KG embeddings. It starts by discussing various KG completion algorithms such as transductive and inductive link prediction and entity type prediction algorithms. It then moves on to the algorithms utilizing type information within the KGs, LLMs, and finally to algorithms capturing the semantics represented in different description logic axioms. We conclude the paper with a critical reflection on the current state of work in the community and give recommendations for future directions.
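    As a minimal sketch of the embedding-based completion the paper surveys (not the paper's own method), a classical translational scorer in the TransE family treats a triple (h, r, t) as plausible when the head embedding translated by the relation embedding lands close to the tail embedding. The toy entities, relation, and random vectors below are hypothetical; real systems learn them by training on the KG.

```python
# Illustrative TransE-style scoring sketch (assumption: toy, untrained vectors).
import numpy as np

rng = np.random.default_rng(0)
DIM = 50

# Hypothetical vocabulary; a real KG completion model learns these embeddings.
entities = {e: rng.normal(size=DIM) for e in ["Berlin", "Germany", "Paris", "France"]}
relations = {r: rng.normal(size=DIM) for r in ["capital_of"]}

def score(head, rel, tail):
    """Negative L2 distance ||h + r - t||; higher means a more plausible triple."""
    return -np.linalg.norm(entities[head] + relations[rel] - entities[tail])

# Link prediction: rank candidate tails for the query (Berlin, capital_of, ?).
candidates = sorted(entities, key=lambda t: score("Berlin", "capital_of", t), reverse=True)
print(candidates)
```

    Such purely graph-structural scorers are exactly the ones the paper argues could be enriched with schema semantics and LLM-derived knowledge.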

    Baghera Assessment Project, designing an hybrid and emergent educational society

    Edited by Sophie Soury-Lavergne; research report, available at: http://www-leibniz.imag.fr/LesCahiers/2003/Cahier81/BAP_CahiersLaboLeibniz.PDF
    The Baghera Assessment Project (BAP) has the objective to explore a new avenue for the design of e-Learning environments. The key features of BAP's approach are: (i) the concept of emergence in multi-agent systems as modelling framework, (ii) the shaping of a new theoretical framework for modelling student knowledge, namely the cK¢ model. This new model has been constructed, based on current research in cognitive science and education, to bridge research on education and research on the design of learning environments.

    Investigation, Development, and Evaluation of Performance Proving for Fault-tolerant Computers

    A number of methodologies for verifying systems, and computer-based tools that assist users in verifying their systems, were developed. These tools were applied to verify, in part, the SIFT ultrareliable aircraft computer. Topics covered included: the STP theorem prover; design verification of SIFT; high-level language code verification; assembly language level verification; numerical algorithm verification; verification of flight control programs; and verification of hardware logic.