141 research outputs found

    Applying Formal Methods to Networking: Theory, Techniques and Applications

    Despite its great importance, modern network infrastructure is remarkable for the lack of rigor in its engineering. The Internet, which began as a research experiment, was never designed to handle the users and applications it hosts today. The lack of formalization of the Internet architecture meant limited abstractions and modularity, especially for the control and management planes, so that every new need required a new protocol built from scratch. This led to an unwieldy, ossified Internet architecture resistant to any attempts at formal verification, and an Internet culture in which expediency and pragmatism are favored over formal correctness. Fortunately, recent work on clean-slate Internet design---especially the software-defined networking (SDN) paradigm---offers the Internet community another chance to develop the right kind of architecture and abstractions. This has also led to a great resurgence of interest in applying formal methods to the specification, verification, and synthesis of networking protocols and applications. In this paper, we present a self-contained tutorial of the formidable amount of work that has been done in formal methods, and a survey of its applications to networking.
    Comment: 30 pages, submitted to IEEE Communications Surveys and Tutorials

    Attributing and Referencing (Research) Software: Best Practices and Outlook from Inria

    Software is a fundamental pillar of modern scientific research, not only in computer science, but actually across all fields and disciplines. However, there is a lack of adequate means to cite and reference software, for many reasons. An obvious first reason is software authorship, which can range from a single developer to a whole team, and can even vary in time. The panorama is even more complex than that, because many roles can be involved in software development: software architect, coder, debugger, tester, team manager, and so on. Arguably, the researchers who have invented the key algorithms underlying the software can also claim a part of the authorship. And there are many other reasons that make this issue complex. We provide in this paper a contribution to the ongoing efforts to develop proper guidelines and recommendations for software citation, building upon the internal experience of Inria, the French research institute for digital sciences. As a central contribution, we make three key recommendations. (1) We propose a richer taxonomy for software contributions with a qualitative scale. (2) We claim that it is essential to put the human at the heart of the evaluation. And (3) we propose to distinguish citation from reference.

    Kolmogorov Complexity in perspective. Part II: Classification, Information Processing and Duality

    We survey diverse approaches to the notion of information: from Shannon entropy to Kolmogorov complexity. Two of the main applications of Kolmogorov complexity are presented: randomness and classification. The survey is divided into two parts published in the same volume. Part II is dedicated to the relation between logic and information systems, within the scope of Kolmogorov algorithmic information theory. We present a recent application of Kolmogorov complexity: classification using compression, an idea with provocative implementations by authors such as Bennett, Vitanyi and Cilibrasi. This stresses how Kolmogorov complexity, besides being a foundation for randomness, is also related to classification. Another approach to classification is also considered: the so-called "Google classification". It uses another original and attractive idea which is connected to classification using compression and to Kolmogorov complexity from a conceptual point of view. We present and unify these different approaches to classification in terms of Bottom-Up versus Top-Down operational modes, for which we point out the fundamental principles and the underlying duality. We look at the way these two dual modes are used in different approaches to information systems, particularly the relational model for databases introduced by Codd in the 1970s. This allows us to point out diverse forms of a fundamental duality. These operational modes are also reinterpreted in the context of the comprehension schema of the axiomatic set theory ZF. This leads us to show how Kolmogorov complexity is linked to intensionality, abstraction, classification and information systems.
    Comment: 43 pages
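
    The classification-by-compression idea mentioned above rests on the normalized compression distance of Cilibrasi and Vitanyi, which replaces the uncomputable Kolmogorov complexity with the output length of a real compressor. The sketch below is only an illustration of that idea, assuming Python's zlib as the stand-in compressor; the toy strings are placeholders, not data from the survey.

        import zlib

        def C(x: bytes) -> int:
            # Compressed length as a computable stand-in for Kolmogorov complexity K(x).
            return len(zlib.compress(x, 9))

        def ncd(x: bytes, y: bytes) -> float:
            # Normalized compression distance: (C(xy) - min(C(x), C(y))) / max(C(x), C(y)).
            cx, cy = C(x), C(y)
            return (C(x + y) - min(cx, cy)) / max(cx, cy)

        # Toy usage: texts sharing vocabulary typically compress better together,
        # which tends to give them a smaller NCD than unrelated texts.
        a = b"the cat sat on the mat and the cat purred softly"
        b2 = b"a purring cat sat quietly on the mat"
        c = b"stock prices fell sharply amid renewed inflation fears"
        print(ncd(a, b2), ncd(a, c))

    On short strings a simple compressor such as zlib is a crude approximation; stronger compressors (bzip2, LZMA) are commonly preferred for longer texts.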

    Reduction of dynamical biochemical reaction networks in computational biology

    Biochemical networks are used in computational biology to model the static and dynamical details of systems involved in cell signaling, metabolism, and regulation of gene expression. Parametric and structural uncertainty, as well as combinatorial explosion, are strong obstacles to analyzing the dynamics of large models of this type. Multi-scaleness is another property of these networks, one that can be used to get past some of these obstacles. Networks with many well-separated time scales can be reduced to simpler networks, in a way that depends only on the orders of magnitude and not on the exact values of the kinetic parameters. The main idea used for such robust simplifications of networks is the concept of dominance among model elements, allowing a hierarchical organization of these elements according to their effects on the network dynamics. This concept finds a natural formulation in tropical geometry. We revisit, in the light of these new ideas, the main approaches to model reduction of reaction networks, such as the quasi-steady-state and quasi-equilibrium approximations, and provide practical recipes for model reduction of linear and nonlinear networks. We also discuss the application of model reduction to backward pruning machine learning techniques.
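
    As a concrete instance of the quasi-steady-state approximation discussed above, the textbook Michaelis-Menten mechanism E + S <-> ES -> E + P collapses to a single rate law once the intermediate complex ES is assumed to relax fast. The sympy sketch below is a minimal illustration of that reduction under these standard assumptions; the mechanism and symbol names are the classical example, not a model taken from the paper.

        import sympy as sp

        # Rate constants, total enzyme, substrate, and the fast intermediate complex ES.
        k1, km1, k2, E_T, S, ES = sp.symbols('k1 k_m1 k2 E_T S ES', positive=True)

        # Quasi-steady state for the intermediate: d[ES]/dt = k1*(E_T - ES)*S - (km1 + k2)*ES = 0.
        qss = sp.Eq(k1 * (E_T - ES) * S - (km1 + k2) * ES, 0)
        ES_qss = sp.solve(qss, ES)[0]

        # Reduced production rate of P: v = k2*[ES] = k2*E_T*S / (S + K_M), with K_M = (km1 + k2)/k1.
        v = sp.simplify(k2 * ES_qss)
        print(v)

    The dominance-based view described in the abstract generalises this step: which terms survive in the steady-state balance is decided by the orders of magnitude of the rate constants rather than their exact values.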

    Gradual computerisation and verification of mathematics : MathLang's path into Mizar

    There are many proof checking tools that allow capturing mathematical knowledge in a formal representation. Those proof systems allow further automatic verification of the logical correctness of the captured knowledge. However, the process of encoding common mathematical documents in a chosen proof system is still labour-intensive and requires comprehensive knowledge of such a system. This makes the use of proof checking tools inaccessible to ordinary mathematicians. This thesis provides a solution for the computerisation of mathematical documents via a number of gradual steps using the MathLang framework. We express the full process of formalisation into the Mizar proof checker. The first levels of this gradual computerisation path had been developed well before the course of this PhD started. The whole project, called MathLang, dates back to 2000, when F. Kamareddine and J.B. Wells started expressing their ideas for a novel approach to computerising mathematical texts. They mainly aimed at developing a mathematical framework which is flexible enough to connect existing, in many cases different, approaches to the computerisation of mathematics, which allows various degrees of formalisation (e.g., partial formalisation, full formalisation of chosen parts, or full formalisation of the entire document), which is compatible with different mathematical foundations (e.g., type theory, set theory, category theory, etc.) and proof systems (e.g., Mizar, Isar, Coq, HOL, Vampire). The first two steps of the gradual formalisation were developed by F. Kamareddine, J.B. Wells and M. Maarek, with a small contribution by R. Lamar to the second step. In this thesis we develop the third level of the gradual path, which aims at capturing the rhetorical structure of mathematical documents. We have also integrated further steps of the gradual formalisation, whose final goal is the Mizar system. We present in this thesis a full path of computerisation and formalisation of mathematical documents into the Mizar proof checker using the MathLang framework. The development of this method was driven by the experience of computerising a number of mathematical documents (covering different authoring styles).

    Model Transformation Languages with Modular Information Hiding

    Model transformations, together with models, form the principal artifacts in model-driven software development. Industrial practitioners report that transformations on larger models quickly become large and complex themselves. To alleviate the entailed maintenance effort, this thesis presents a modularity concept with explicit interfaces, complemented by software visualization and clustering techniques. All three approaches are tailored to the specific needs of the transformation domain.

    Good Research Practice in Non-Clinical Pharmacology and Biomedicine

    This open access book, published under a CC BY 4.0 license in the PubMed-indexed book series Handbook of Experimental Pharmacology, provides up-to-date information on best practices for improving experimental design and quality of research in non-clinical pharmacology and biomedicine.

    Engineering automated systems for pharmaceutical manufacturing: quality, regulations and business performance

    The pharmaceutical sector is very heavily regulated, and drug safety regulations form one of the pillars of this regulation. The manufacture of pharmaceuticals is carried out in an environment of onerous regulatory requirements, often from several national and international regulatory bodies. The quality systems operated by drug manufacturers and their regulatory practices have an important impact on product quality. The quality and regulatory requirements apply not only to the handling of the medicinal products, but also to the physical and electronic systems used in the manufacture of those products, and extend to automated systems used to support quality assurance operations. Design, development, building and support of such systems are ultimately the responsibility of the drug manufacturer, and the quality and regulatory requirements for automated systems are passed down the supply chain to suppliers. In the last two decades of the 20th century there was a proliferation in the use of computerised and automated systems in, or in support of, manufacturing, and regulatory requirements were correspondingly imposed on the manufacturing industry. This work used survey research and factor analysis to establish relationships between quality and regulatory practices, and between both sets of practices and business performance, for suppliers of automated systems to the pharmaceutical market. A survey instrument and an administration strategy were developed from a review of the literature. It was established empirically that quality practices and regulatory practices were strongly related. Specific facets of quality and regulatory practices were found to have a significant impact on market share and competitiveness expectations, and also on profit and sales expectations. Differences in practices and performance were established for various levels of automation complexity and criticality, where criticality was a function of the risk the respondent's system posed to the manufacture of their customer's products.
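
    The thesis abstract does not reproduce its survey instrument or data, but the kind of exploratory factor analysis it relies on can be sketched in a few lines. The snippet below uses scikit-learn on randomly generated placeholder responses (respondents by Likert items); the item count, the two-factor structure and the varimax rotation are illustrative assumptions only.

        import numpy as np
        from sklearn.decomposition import FactorAnalysis

        rng = np.random.default_rng(0)
        # Placeholder data: 120 respondents answering 12 five-point Likert items.
        responses = rng.integers(1, 6, size=(120, 12)).astype(float)

        # Extract two latent factors (e.g. a "quality practices" and a "regulatory practices"
        # construct) with a varimax rotation to ease interpretation of the loadings.
        fa = FactorAnalysis(n_components=2, rotation="varimax")
        scores = fa.fit_transform(responses)   # factor scores per respondent
        loadings = fa.components_.T            # item-by-factor loadings
        print(loadings.round(2))

    Items that load strongly on one factor and near zero on the other are the ones that group into a common practice construct in an analysis of this kind.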