
    Canonizing Graphs of Bounded Tree Width in Logspace

    Graph canonization is the problem of computing a unique representative, a canon, from the isomorphism class of a given graph. This implies that two graphs are isomorphic exactly if their canons are equal. We show that graphs of bounded tree width can be canonized by logarithmic-space (logspace) algorithms. This implies that the isomorphism problem for graphs of bounded tree width can be decided in logspace. Since isomorphism for trees is already hard for the complexity class logspace, this makes the ubiquitous class of graphs of bounded tree width one of the few classes of graphs for which the complexity of the isomorphism problem has been exactly determined.
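    The toy sketch below (not from the paper, and nowhere near logspace) only illustrates the notion of a canon: a brute-force canonical form obtained by minimizing over all vertex relabelings, so that isomorphism testing reduces to comparing canons for equality. The paper's contribution is to compute such a canon in logarithmic space for graphs of bounded tree width.

        # Illustrative only: exponential brute-force canon for tiny graphs.
        from itertools import permutations

        def canon(edges, n):
            """Lexicographically smallest edge list over all relabelings of 0..n-1."""
            best = None
            for perm in permutations(range(n)):
                key = tuple(sorted(tuple(sorted((perm[u], perm[v]))) for u, v in edges))
                if best is None or key < best:
                    best = key
            return best

        p1 = [(0, 1), (1, 2)]          # path 0-1-2
        p2 = [(1, 0), (0, 2)]          # the same path under a different labeling
        tri = [(0, 1), (1, 2), (2, 0)] # a triangle
        assert canon(p1, 3) == canon(p2, 3)   # isomorphic graphs share a canon
        assert canon(p1, 3) != canon(tri, 3)  # non-isomorphic graphs do not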

    LNCS

    This paper presents a foundation for refining concurrent programs with structured control flow. The verification problem is decomposed into subproblems that aid interactive program development, proof reuse, and automation. The formalization in this paper is the basis of a new design and implementation of the Civl verifier.

    A method for maintaining new software

    This thesis describes a novel method for perfective maintenance of software which has been developed from specifications using formal transformations. The list of applied transformations provides a suitable derivation history to use when changes are made to the software. The method uses transformations which have been implemented, for the purpose of restructuring code, in a tool called the Maintainer's Assistant, and applies them to refinement. Comparisons are made between sequential transformations, refinement calculi and standard proof-based refinement techniques for providing a suitable derivation history to use when changes are made in the requirements of a system. Two case studies are presented upon which these comparisons are based and on which the method is tested. Criteria such as saleability, speed, ease, design improvements and software quality are used to argue that transformations are a more favourable basis for refinement. Metrics are used to evaluate the complexity of the code developed using the method. Conclusions are presented on how to develop different types of specifications into code and on how best to apply various changes. A recommended approach is to use transformations to split the specification so that the original refinement paths can still be used. Using transformations to refine a specification, and recording this path, produces software of a better structure and of higher maintainability. Having such a path improves the speed and ease of future alterations to the system, and is more cost effective than redeveloping the software from a new specification.
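    A very rough, hypothetical illustration of the bookkeeping idea behind the method is sketched below: each refinement step is recorded so that the derivation path can be replayed when the specification changes. The toy program representation and the two transformations are invented for the example and merely stand in for the WSL transformations provided by the Maintainer's Assistant.

        # Hypothetical sketch: a derivation history kept as an ordered list of
        # behaviour-preserving transformations over a toy program (a string).
        # Only the record-and-replay idea is illustrated here.

        def expand_specification(program):
            # Refine the abstract statement into an explicit loop.
            return program.replace("SPEC: sum the list xs into total",
                                   "total := 0; for x in xs do total := total + x od")

        def fold_addition(program):
            # A purely cosmetic restructuring step.
            return program.replace("total := total + x", "total +:= x")

        derivation_history = [expand_specification, fold_addition]

        def refine(spec, history):
            """Replay a recorded derivation path against a (possibly revised) spec."""
            program = spec
            for step in history:
                program = step(program)
            return program

        code_v1 = refine("SPEC: sum the list xs into total", derivation_history)
        # When the requirements change, the recorded path is replayed on the new
        # specification instead of redeveloping the software from scratch:
        code_v2 = refine("SPEC: sum the list xs into total; print total", derivation_history)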

    SPADE: Verification of Multithreaded Dynamic and Recursive Programs

    The tool SPADE can automatically analyse Boolean programs with parallelism, communication between parallel processes, dynamic process creation, and recursion, all at the same time. As far as we know, this is the first software model-checking tool based on an expressive model that accurately captures all of these aspects of programs.

    A 2007 Model Curriculum for a Liberal Arts Degree in Computer Science

    In 1986, guidelines for a computer science major degree program offered in the context of the liberal arts were developed by the Liberal Arts Computer Science Consortium (LACS) [4]. In 1996 the same group offered a revised curriculum reflecting advances in the discipline, the accompanying technology, and teaching pedagogy [6]. In each case, the LACS models represented, at least in part, a response to the recommendations of the ACM/IEEE-CS [1][2]. Continuing change in the discipline, technology, and pedagogy, coupled with the appearance of Computing Curriculum 2001 [3], has led to the 2007 Model Curriculum described here. This report begins by considering just what computer science is and what goals are appropriate for the study of computer science in the landscape of the liberal arts. A curricular model for this setting follows, updating the 1996 revision. As in previous LACS curricula [4][6], the model is practical; that is, students can schedule it, it can be taught by a relatively small faculty, and it contributes to the foundation of an excellent liberal arts education. Finally, this 2007 Model Curriculum is compared with the recommendations of CC2001 [3].

    Laws of mission-based programming


    Generalized Lattice Boltzmann Method with multi-range pseudo-potential

    The physical behaviour of a class of mesoscopic models for multiphase flows is analyzed in detail near interfaces. In particular, an extended pseudo-potential method is developed, which permits the equation of state and the surface tension to be tuned independently of each other. The spurious velocity contributions of this extended model are shown to vanish in the limit of high grid refinement and/or high-order isotropy. Higher-order schemes to implement self-consistent forcings are rigorously computed for 2d and 3d models. The extended scenario developed in this work clarifies the theoretical foundations of the Shan-Chen methodology for the lattice Boltzmann method and enhances its applicability and flexibility for the simulation of multiphase flows with density ratios up to O(100).
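    For orientation, a schematic of the pseudo-potential forcing that the extended model generalizes is given below. This is only the standard Shan-Chen interaction force with a multi-range coupling sketched in; the precise weights and belt-dependent couplings used in the paper follow from its own isotropy analysis.

        \mathbf{F}(\mathbf{x}) \;=\; -\,\psi\big(\rho(\mathbf{x})\big)\sum_{i} G_{|\mathbf{c}_i|^{2}}\, w_i\, \psi\big(\rho(\mathbf{x}+\mathbf{c}_i)\big)\, \mathbf{c}_i

    Here \psi is the pseudo-potential, and the single coupling constant G of the original Shan-Chen model is replaced by belt-dependent couplings (e.g. G_1 for nearest and G_2 for next-nearest neighbours); it is this extra freedom that allows the equation of state and the surface tension to be tuned independently.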

    Visualization and user interactions in RDF data representation

    The spread of linked data in digital technologies creates the need to develop new approaches to handling this kind of data. Modern trends in information technology encourage the use of human-friendly interfaces and graphical tools, which help users understand the system and speed up work processes. In this study my goal is to develop a set of best practices for solving the problem of visualizing and interacting with linked data, and to create a working prototype based on these practices. My work is part of a project developed by Fail-Safe IT Solutions Oy. During the research process I study various existing products that try to solve the problem of human-friendly interaction with linked data, compare them and, based on the comparison, develop my own approach for solving the problem in the given environment, satisfying the provided specifications. The key findings of the research fall into two categories. The first category is based on the examination of existing solutions and relates to features I find beneficial to the project. The second category is based on the experience acquired during the project development and includes environment-specific and project-related findings.
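    As a minimal sketch of the core visual idiom discussed in the thesis (rendering RDF triples as a node-link diagram), the snippet below parses a few triples and draws them as a labelled directed graph. The choice of rdflib, networkx and matplotlib is an assumption made for illustration and is not necessarily the stack used in the Fail-Safe IT Solutions prototype.

        # Minimal sketch (assumed libraries, not the thesis prototype): draw a
        # few RDF triples as a directed graph with predicate labels on the edges.
        import rdflib
        import networkx as nx
        import matplotlib.pyplot as plt

        turtle = """
        @prefix ex: <http://example.org/> .
        ex:Alice ex:knows    ex:Bob .
        ex:Bob   ex:worksFor ex:Acme .
        """

        g = rdflib.Graph()
        g.parse(data=turtle, format="turtle")

        def short(term):
            # Display label: the last path segment of the URI.
            return str(term).rsplit("/", 1)[-1]

        nxg = nx.DiGraph()
        for s, p, o in g:
            nxg.add_edge(short(s), short(o), label=short(p))

        pos = nx.spring_layout(nxg, seed=1)
        nx.draw(nxg, pos, with_labels=True, node_color="lightsteelblue", node_size=1800)
        nx.draw_networkx_edge_labels(nxg, pos,
                                     edge_labels=nx.get_edge_attributes(nxg, "label"))
        plt.show()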

    Opening the black box of knowledge management mechanisms: exploring knowledge flows at a consultancy

    Purpose – Based on an exploratory case-based approach, the purpose of this paper is to open the KM black box and examine the relationships that link knowledge management (KM) inputs (i.e. knowledge resources and KM practices) via knowledge processes to KM performance. This paper aims to identify the underlying mechanisms and explain how KM performance is enabled.
    Design/methodology/approach – This in-depth case study, conducted at a medium-sized consultancy in the supply chain management industry, empirically examines knowledge flows to uncover the relationships between KM inputs, knowledge processes and KM performance. We adopt the viable system model (VSM) as a theoretical lens to identify KM mechanisms.
    Findings – By identifying six KM mechanisms, we contribute to the theoretical understanding of how KM inputs are interconnected and lead to KM performance via knowledge processes.
    Originality/value – Based on the insights gained, we provide propositions that organizations should consider in designing viable KM. Our findings help organizations understand their KM with the help of knowledge flow analysis and identify how critical KM elements are interconnected.