
    Formalization of the fundamental group in untyped set theory using auto2

    We present a new framework for formalizing mathematics in untyped set theory using auto2. Using this framework, we formalize in Isabelle/FOL the entire chain of development from the axioms of set theory to the definition of the fundamental group for an arbitrary topological space. The auto2 prover is used as the sole automation tool, and enables succinct proof scripts throughout the project. Comment: 17 pages, accepted for ITP 201
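    The fundamental group being formalized is the standard one; as a reminder (the textbook definition, not the paper's Isabelle/FOL rendering):

    ```latex
    % The fundamental group of a topological space X at basepoint x_0:
    % loops based at x_0, taken up to homotopy, under concatenation.
    \[
      \pi_1(X, x_0) \;=\; \{\, \gamma : [0,1] \to X \mid \gamma \text{ continuous},\ \gamma(0) = \gamma(1) = x_0 \,\} \,/\, {\simeq}
    \]
    \[
      [\gamma] \cdot [\delta] = [\gamma * \delta], \qquad
      (\gamma * \delta)(t) =
      \begin{cases}
        \gamma(2t)     & 0 \le t \le \tfrac12,\\
        \delta(2t - 1) & \tfrac12 \le t \le 1.
      \end{cases}
    \]
    ```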

    Effect of age and cytoskeletal elements on the indentation-dependent mechanical properties of chondrocytes.

    Articular cartilage chondrocytes are responsible for the synthesis, maintenance, and turnover of the extracellular matrix, metabolic processes that contribute to the mechanical properties of these cells. Here, we systematically evaluated the effect of age and cytoskeletal disruptors on the mechanical properties of chondrocytes as a function of deformation. We quantified the indentation-dependent mechanical properties of chondrocytes isolated from neonatal (1-day), adult (5-year) and geriatric (12-year) bovine knees using atomic force microscopy (AFM). We also measured the contribution of the actin and intermediate filaments to the indentation-dependent mechanical properties of chondrocytes. By integrating AFM with confocal fluorescent microscopy, we monitored cytoskeletal and biomechanical deformation in transgenic cells (GFP-vimentin and mCherry-actin) under compression. We found that the elastic modulus of chondrocytes in all age groups decreased with increased indentation (15-2000 nm). The elastic modulus of adult chondrocytes was significantly greater than neonatal cells at indentations greater than 500 nm. Viscoelastic moduli (instantaneous and equilibrium) were comparable in all age groups examined; however, the intrinsic viscosity was lower in geriatric chondrocytes than neonatal. Disrupting the actin or the intermediate filament structures altered the mechanical properties of chondrocytes by decreasing the elastic modulus and viscoelastic properties, resulting in a dramatic loss of indentation-dependent response with treatment. Actin and vimentin cytoskeletal structures were monitored using confocal fluorescent microscopy in transgenic cells treated with disruptors, and both treatments had a profound disruptive effect on the actin filaments. Here we show that disrupting the structure of intermediate filaments indirectly altered the configuration of the actin cytoskeleton. 
These findings underscore the importance of the cytoskeletal elements in the overall mechanical response of chondrocytes, indicating that intermediate filament integrity is key to the non-linear elastic properties of chondrocytes. This study improves our understanding of the mechanical properties of articular cartilage at the single-cell level.
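    As an illustrative aside (not the authors' analysis code): elastic moduli from AFM indentation are commonly extracted by fitting a Hertz contact model to the force-indentation curve. A minimal sketch, assuming a spherical tip of radius R and an incompressible cell (Poisson ratio 0.5); the tip radius, modulus, and noise level below are invented for the demo.

    ```python
    import numpy as np

    def hertz_force(delta, E, R=2.5e-6, nu=0.5):
        """Hertzian force (N) for a spherical tip:
        F = 4/3 * E/(1 - nu^2) * sqrt(R) * delta^(3/2)."""
        return (4.0 / 3.0) * (E / (1.0 - nu**2)) * np.sqrt(R) * delta**1.5

    def fit_modulus(delta, force, R=2.5e-6, nu=0.5):
        """Least-squares estimate of E from a force-indentation curve.
        F is linear in E, so this reduces to a one-parameter projection."""
        basis = (4.0 / 3.0) / (1.0 - nu**2) * np.sqrt(R) * delta**1.5
        return float(np.dot(basis, force) / np.dot(basis, basis))

    # Synthetic demo: a curve generated with E = 1 kPa plus noise, then recovered.
    rng = np.random.default_rng(0)
    delta = np.linspace(15e-9, 2000e-9, 100)   # 15-2000 nm, the range used in the study
    force = hertz_force(delta, E=1000.0) + rng.normal(0.0, 1e-12, delta.size)
    E_hat = fit_modulus(delta, force)
    print(f"fitted E = {E_hat:.1f} Pa")
    ```

    In practice the fit window matters: the study's observation that the apparent modulus falls with deeper indentation means a single-E Hertz fit is only a local description over a chosen indentation range.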

    An implementation of Deflate in Coq

    The widely-used compression format "Deflate" is defined in RFC 1951 and is based on prefix-free codings and backreferences. There are unclear points about the way these codings are specified, and several sources of confusion in the standard. We address this problem by giving a rigorous mathematical specification, which we formalized in Coq. We produced a verified implementation in Coq which achieves competitive performance on inputs of several megabytes. In this paper we present the parts of our implementation: a fully verified implementation of canonical prefix-free codings, which can be reused in other compression formats, and an elegant formalism for specifying sophisticated formats, which we used to implement both a compression and a decompression algorithm in Coq which we formally prove inverse to each other -- the first time this has been achieved to our knowledge. Compatibility with other Deflate implementations is shown empirically. We furthermore discuss some of the difficulties, specifically regarding memory and runtime requirements, and our approaches to overcome them.
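    The canonical prefix-free codings mentioned above are the ones RFC 1951 itself is built on: given only the per-symbol code lengths, the codes are fully determined. A sketch of the standard reconstruction algorithm from RFC 1951, section 3.2.2 (plain Python, not the verified Coq code):

    ```python
    def canonical_codes(lengths):
        """Assign canonical prefix-free codes from code lengths (RFC 1951, sec. 3.2.2).
        Returns {symbol: (code, length)}; symbols with length 0 get no code."""
        max_len = max(lengths)
        # 1) Count how many codes there are of each length.
        bl_count = [0] * (max_len + 1)
        for l in lengths:
            if l > 0:
                bl_count[l] += 1
        # 2) Compute the smallest code value for each length.
        next_code = [0] * (max_len + 1)
        code = 0
        for bits in range(1, max_len + 1):
            code = (code + bl_count[bits - 1]) << 1
            next_code[bits] = code
        # 3) Assign codes in symbol order, as the standard prescribes.
        out = {}
        for sym, l in enumerate(lengths):
            if l > 0:
                out[sym] = (next_code[l], l)
                next_code[l] += 1
        return out

    # RFC 1951's own example: lengths (3,3,3,3,3,2,4,4) for symbols A..H.
    codes = canonical_codes([3, 3, 3, 3, 3, 2, 4, 4])
    print(codes[5])  # symbol F, length 2 -> code 0, i.e. bits "00"
    ```

    Because the assignment is canonical, a Deflate stream only transmits the lengths, and both sides reconstruct identical code tables; this determinism is also what makes the coding pleasant to specify formally.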

    Effect of Wood Particle Size on Fungal Growth in a Model Biomechanical Pulping Process

    The pretreatment of aspen wood chips with white-rot fungus has been evaluated as a way of making biomechanical pulp. Our study addressed (1) whether wood particle size (chip size) affects the growth pattern of the attacking organism, and (2) whether the difference in particle size between chips and coarse pulp is related to the availability of wood polymers to the fungus. We qualitatively evaluated the growth of Phanerochaete chrysosporium BKM-F-1767 on aspen wood using standard industrial 6- and 19-mm chips and coarse refiner mechanical pulp. Scanning electron microscopy revealed a slight increase in the number of hyphae in the 19-mm chips compared to that in the 6-mm chips, but no major morphological differences in cellulose or lignin loss. Dense aerial hyphal growth occurred around the chips, but not around the coarse pulp. The fungus appeared to attack the coarse pulp from both outside and within the fiber wall. Hyphae within both the middle lamella and the cell lumina attacked the cell walls. The fungus eroded the chip cell walls and their constituents primarily from the wood cell lumen outward. After only 3 weeks of fungal treatment, both chips and coarse pulp showed marked localized cell-wall thinning and fragmentation as well as generalized swelling and relaxing of the normally rigid cell-wall structure. We conclude that particle size has only a minor effect on fungal growth on wood under conditions such as those likely to be used in a commercial biopulping process.

    Witnessing (co)datatypes

    Datatypes and codatatypes are useful for specifying and reasoning about (possibly infinite) computational processes. The Isabelle/HOL proof assistant has recently been extended with a definitional package that supports both. We describe a complete procedure for deriving nonemptiness witnesses in the general mutually recursive, nested case—nonemptiness being a proviso for introducing types in higher-order logic.
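    The nonemptiness check described above can be pictured as a least-fixpoint computation over constructors: a type is witnessed inhabited once some constructor has all of its recursive arguments already witnessed. A toy sketch over a grammar-like encoding (a hypothetical encoding for illustration, not the Isabelle/HOL package's actual data structures, and it covers only the plain inductive case, not nesting or codatatypes):

    ```python
    def nonempty_types(constructors):
        """constructors: {type: [argument-type list, one entry per constructor]}.
        Returns the set of types for which a finite witness term exists,
        computed as a least fixpoint (cf. useless-nonterminal elimination)."""
        witnessed = set()
        changed = True
        while changed:
            changed = False
            for ty, ctors in constructors.items():
                if ty in witnessed:
                    continue
                # ty is nonempty if some constructor's arguments are all witnessed.
                if any(all(a in witnessed for a in args) for args in ctors):
                    witnessed.add(ty)
                    changed = True
        return witnessed

    # Mutually recursive example: a tree carries a forest of children, and a
    # forest is Nil | Cons tree forest; both are nonempty. 'stream', whose only
    # constructor needs a stream, has no finite witness as a datatype
    # (as a codatatype it would be inhabited -- the distinction the paper treats).
    spec = {
        "tree":   [["forest"]],
        "forest": [[], ["tree", "forest"]],
        "stream": [["stream"]],
    }
    print(sorted(nonempty_types(spec)))  # ['forest', 'tree']
    ```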

    Beyond the benchtop and the benthos: Dataset management planning and design for time series of ocean carbonate chemistry associated with Durafet®-based pH sensors

    To better understand the impact of ocean acidification on marine ecosystems, an important ongoing research priority for marine scientists is to characterize present-day pH variability. Following recent technological advances, autonomous pH sensor deployments in shallow coastal marine environments have revealed that pH dynamics in coastal oceans are more variable in space and time than the discrete, open-ocean measurements that are used for ocean acidification projections. Data from these types of deployments will benefit the research community by facilitating the improved design of ocean acidification studies as well as the identification or evaluation of natural and human-influenced pH variability. Importantly, the collection of ecologically relevant pH data and a cohesive, user-friendly integration of results across sites and regions requires (1) effective sensor operation to ensure high quality pH data collection and (2) efficient data management for accessibility and broad reuse by the marine science community. Here, we review the best practices for deployment, calibration, and data processing and quality control, using our experience with Durafet®-based pH sensors as a model. Next, we describe information management practices for streamlining preservation and distribution of data and for cataloging different types of pH sensor data, developed in collaboration with two U.S. Long Term Ecological Research (LTER) sites. Finally, we assess sensor performance and data recovery from 73 SeaFET deployments in the Santa Barbara Channel using our quality control guidelines and data management tools, and offer recommendations for improved data yields. Our experience provides a template for other groups contemplating using SeaFET technology as well as general steps that may be helpful for the design of data management for other complex sensors. © 2016 The Authors. Published by Elsevier B.V.
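    To illustrate the kind of automated quality control such pipelines apply (a generic sketch, not the authors' actual SeaFET processing code; the range and spike thresholds below are invented for the example): flag readings outside a plausible pH window, and flag sudden jumps between consecutive readings as probable sensor artifacts.

    ```python
    def qc_flags(ph, lo=7.0, hi=8.8, max_jump=0.3):
        """Return one QC flag per reading: 'good', 'range', or 'spike'.
        Thresholds are illustrative, not from any published protocol."""
        flags = []
        prev = None
        for v in ph:
            if not (lo <= v <= hi):
                flags.append("range")
            elif prev is not None and abs(v - prev) > max_jump:
                flags.append("spike")
            else:
                flags.append("good")
            prev = v  # compare each reading against the previous one, good or bad
        return flags

    # A short synthetic series with one out-of-range reading and its aftermath.
    series = [8.05, 8.07, 8.06, 9.40, 8.04, 7.70, 8.03]
    print(qc_flags(series))
    ```

    Note the design choice of comparing against the raw previous reading: a single bad value then contaminates its neighbor's spike test, which is why real pipelines often compare against the last good value or a running median instead.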

    Isabelle/DOF: Design and Implementation

    This is the author accepted manuscript. The final version is available from Springer Verlag via the DOI in this record. 17th International Conference, SEFM 2019, Oslo, Norway, September 18–20, 2019. DOF is a novel framework for defining ontologies and enforcing them during document development and evolution. A major goal of DOF is the integrated development of formal certification documents (e.g., for Common Criteria or CENELEC 50128) that require consistency across both formal and informal arguments. To support a consistent development of formal and informal parts of a document, we provide Isabelle/DOF, an implementation of DOF on top of the formal methods framework Isabelle/HOL. A particular emphasis is put on a deep integration into Isabelle's IDE, which allows for smooth ontology development as well as immediate ontological feedback during the editing of a document. In this paper, we give an in-depth presentation of the design concepts of DOF's Ontology Definition Language (ODL) and key aspects of the technology of its implementation. Isabelle/DOF is the first ontology language supporting machine-checked links between the formal and informal parts in an LCF-style interactive theorem proving environment. Sufficiently annotated, large documents can easily be developed collaboratively, while ensuring their consistency, and the impact of changes (in the formal and the semi-formal content) is tracked automatically. IRT SystemX, Paris-Saclay, France

    Algebraic Principles for Rely-Guarantee Style Concurrency Verification Tools

    We provide simple equational principles for deriving rely-guarantee-style inference rules and refinement laws based on idempotent semirings. We link the algebraic layer with concrete models of programs based on languages and execution traces. We have implemented the approach in Isabelle/HOL as a lightweight concurrency verification tool that supports reasoning about the control and data flow of concurrent programs with shared variables at different levels of abstraction. This is illustrated on two simple verification examples.
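    For readers unfamiliar with the algebraic setting, a standard reminder (not the paper's exact axiomatisation): an idempotent semiring has an idempotent, commutative join and a sequential composition that distributes over it, and refinement laws fall out of the induced order.

    ```latex
    % Idempotent semiring (S, +, ;, 0, 1): + is associative, commutative,
    % idempotent with unit 0; ; is associative with unit 1; ; distributes over +.
    \[
      x + x = x, \qquad
      x\,;(y + z) = x\,;y + x\,;z, \qquad
      (x + y)\,;z = x\,;z + y\,;z
    \]
    % The induced order x <= y iff x + y = y supports refinement reasoning,
    % e.g. monotonicity of sequential composition:
    \[
      x \le y \;\Longrightarrow\; z\,;x \le z\,;y .
    \]
    ```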

    A Unifying Model of Genome Evolution Under Parsimony

    We present a data structure called a history graph that offers a practical basis for the analysis of genome evolution. It conceptually simplifies the study of parsimonious evolutionary histories by representing both substitutions and double cut and join (DCJ) rearrangements in the presence of duplications. The problem of constructing parsimonious history graphs thus subsumes related maximum parsimony problems in the fields of phylogenetic reconstruction and genome rearrangement. We show that tractable functions can be used to define upper and lower bounds on the minimum number of substitutions and DCJ rearrangements needed to explain any history graph. These bounds become tight for a special type of unambiguous history graph called an ancestral variation graph (AVG), which constrains in its combinatorial structure the number of operations required. We finally demonstrate that for a given history graph G, a finite set of AVGs describes all parsimonious interpretations of G, and this set can be explored with a few sampling moves. Comment: 52 pages, 24 figures
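    As background on the DCJ operations mentioned above (the classical distance result of Bergeron, Mixtacki and Stoye, not a contribution of this paper): for two genomes over the same $N$ genes, with an adjacency graph containing $C$ cycles and $I$ odd-length paths, the minimum number of DCJ operations transforming one genome into the other is

    ```latex
    \[
      d_{\mathrm{DCJ}} \;=\; N - \Big(C + \frac{I}{2}\Big).
    \]
    ```

    Duplications break the one-to-one gene correspondence this formula relies on, which is part of what makes the history-graph setting harder than plain rearrangement distance.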

    Efficient Certified Resolution Proof Checking

    We present a novel propositional proof tracing format that eliminates complex processing, thus enabling efficient (formal) proof checking. The benefits of this format are demonstrated by implementing a proof checker in C, which outperforms a state-of-the-art checker by two orders of magnitude. We then formalize the theory underlying propositional proof checking in Coq, and extract a correct-by-construction proof checker for our format from the formalization. An empirical evaluation using 280 unsatisfiable instances from the 2015 and 2016 SAT competitions shows that this certified checker usually performs comparably to a state-of-the-art non-certified proof checker. Using this format, we formally verify the recent 200 TB proof of the Boolean Pythagorean Triples conjecture.
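    The core check in clausal proof formats of this kind is that each learned clause follows by reverse unit propagation (RUP): assume the negation of the clause, propagate unit clauses, and expect a conflict. A minimal, unoptimized sketch of that check (illustrating the idea only; the paper's format and extracted Coq checker avoid exactly this kind of repeated scanning):

    ```python
    def propagate(clauses, assignment):
        """Unit-propagate from an initial assignment; return 'conflict' or the
        extended assignment. Literals are nonzero ints; -l is the negation of l."""
        assign = set(assignment)
        changed = True
        while changed:
            changed = False
            for clause in clauses:
                if any(l in assign for l in clause):
                    continue  # clause already satisfied
                unassigned = [l for l in clause if -l not in assign]
                if not unassigned:
                    return "conflict"  # every literal falsified
                if len(unassigned) == 1:
                    assign.add(unassigned[0])  # unit clause: force the literal
                    changed = True
        return assign

    def is_rup(clauses, lemma):
        """Check that `lemma` follows from `clauses` by reverse unit propagation."""
        return propagate(clauses, [-l for l in lemma]) == "conflict"

    # (x or y), (~x or y), (x or ~y), (~x or ~y) is unsatisfiable:
    # first learn the unit clause (y), then derive the empty clause.
    cnf = [[1, 2], [-1, 2], [1, -2], [-1, -2]]
    print(is_rup(cnf, [2]))          # True: (y) is a RUP lemma
    print(is_rup(cnf + [[2]], []))   # True: empty clause, refutation complete
    ```

    Production checkers make each such step near-linear with watched literals and deletion information; the naive loop above rescans every clause per round.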