    Gradual Certified Programming in Coq

    Expressive static typing disciplines are a powerful way to achieve high-quality software. However, the adoption cost of such techniques should not be underestimated. Just as gradual typing allows for a smooth transition from dynamically-typed to statically-typed programs, it seems desirable to support a gradual path to certified programming. We explore gradual certified programming in Coq, providing the possibility to postpone the proofs of selected properties and to check "at runtime" whether the properties actually hold. Casts can be integrated with the implicit coercion mechanism of Coq to support implicit cast insertion à la gradual typing. Additionally, when extracting Coq functions to mainstream languages, our encoding of casts supports lifting assumed properties into runtime checks. Much to our surprise, it is not necessary to extend Coq in any way to support gradual certified programming. A simple mix of type classes and axioms makes it possible to bring gradual certified programming to Coq in a straightforward manner. (Comment: DLS'15 final version, Proceedings of the ACM Dynamic Languages Symposium, DLS 2015.)
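
    To make the "runtime check" idea concrete, here is a minimal Python sketch of what lifting an assumed property into a runtime check can look like after extraction to a mainstream language; the helper names are hypothetical and this is only an illustration, not the paper's actual extraction scheme.

    ```python
    # Illustrative only: an assumed ("cast") property whose proof was postponed is
    # checked at runtime instead of being established statically.
    def cast(prop, message="assumed property violated"):
        def check(value):
            if not prop(value):
                raise AssertionError(message)
            return value
        return check

    # Hypothetical example: 'head' assumes a non-empty list.
    non_empty = cast(lambda xs: len(xs) > 0, "head: list must be non-empty")

    def head(xs):
        return non_empty(xs)[0]

    print(head([1, 2, 3]))  # 1
    # head([])  # raises AssertionError at runtime rather than failing a static proof
    ```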

    Gunrock: A High-Performance Graph Processing Library on the GPU

    For large-scale graph analytics on the GPU, the irregularity of data access and control flow, and the complexity of programming GPUs, have been two significant challenges for developing a programmable high-performance graph library. "Gunrock", our graph-processing system designed specifically for the GPU, uses a high-level, bulk-synchronous, data-centric abstraction focused on operations on a vertex or edge frontier. Gunrock achieves a balance between performance and expressiveness by coupling high-performance GPU computing primitives and optimization strategies with a high-level programming model that allows programmers to quickly develop new graph primitives with small code size and minimal GPU programming knowledge. We evaluate Gunrock on five key graph primitives and show that Gunrock has on average at least an order of magnitude speedup over Boost and PowerGraph, comparable performance to the fastest GPU hardwired primitives, and better performance than any other GPU high-level graph library. (Comment: 14 pages, accepted by PPoPP'16; removed the text repetition in the previous version v5.)
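
    A minimal CPU-side sketch of the frontier-centric, bulk-synchronous style the abstract describes, written in Python for illustration only (Gunrock itself is a GPU library with its own primitives):

    ```python
    # Toy frontier-based BFS in the spirit of an "advance + filter" model.
    # Illustrative only; this is not Gunrock's API.
    def bfs(adj, source):
        """adj: dict mapping vertex -> list of neighbours."""
        depth = {source: 0}
        frontier = [source]
        level = 0
        while frontier:
            # "advance": expand the current frontier along outgoing edges
            expanded = [v for u in frontier for v in adj.get(u, [])]
            # "filter": keep only vertices not yet visited
            next_frontier = []
            for v in expanded:
                if v not in depth:
                    depth[v] = level + 1
                    next_frontier.append(v)
            frontier = next_frontier
            level += 1
        return depth

    graph = {0: [1, 2], 1: [3], 2: [3], 3: []}
    print(bfs(graph, 0))  # {0: 0, 1: 1, 2: 1, 3: 2}
    ```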

    Trend in ice moistening the stratosphere – constraints from isotope data of water and methane

    Water plays a major role in the chemistry and radiative budget of the stratosphere. Air enters the stratosphere predominantly in the tropics, where the very low temperatures around the tropopause constrain water vapour mixing ratios to a few parts per million. Observations of stratospheric water vapour show a large positive long-term trend which cannot be explained by changes in tropopause temperatures. Trends in the partitioning between vapour and ice of water entering the stratosphere have been suggested to resolve this conundrum. We present measurements of stratospheric H₂O, HDO, CH₄ and CH₃D in the period 1991–2007 to evaluate this hypothesis. Because of fractionation processes during phase changes, the hydrogen isotopic composition of H₂O is a sensitive indicator of changes in the partitioning of vapour and ice. We find that the seasonal variations of H₂O are mirrored in the variation of the ratio of HDO to H₂O, with a slope of the correlation consistent with water entering the stratosphere mainly as vapour. The variability in the fractionation over the entire observation period is well explained by variations in H₂O. The isotopic data allow us to conclude that the trend in ice arising from particulate water is no more than (0.01±0.13) ppmv/decade in the observation period. Our observations suggest that between 1991 and 2007 changes in particulate water transported through the tropopause played only a minor role in altering the amount of water entering the stratosphere.
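
    For readers unfamiliar with isotopic notation, the sketch below shows how a δD value is derived from HDO and H₂O mixing ratios using the standard VSMOW reference ratio; the mixing ratios used are invented for illustration and are not data from this study.

    ```python
    # Standard delta notation for hydrogen isotopic composition.
    # Assumes the VSMOW D/H reference ratio of about 1.5576e-4; input values are
    # invented for illustration, not measurements from the paper.
    R_VSMOW = 1.5576e-4  # D/H ratio of Vienna Standard Mean Ocean Water

    def delta_D_permil(hdo_ppbv, h2o_ppmv):
        """delta-D in permil; D/H of the sample is roughly HDO / (2 * H2O)."""
        r_sample = (hdo_ppbv * 1e-9) / (2.0 * h2o_ppmv * 1e-6)
        return (r_sample / R_VSMOW - 1.0) * 1000.0

    # Illustrative stratospheric-entry values: ~4 ppmv H2O, ~0.6 ppbv HDO
    print(round(delta_D_permil(0.6, 4.0), 1))  # about -518.5 (strongly depleted)
    ```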

    Coronary artery fistulas morphology in coronary computed tomography angiography

    Background: The aim of the study was to evaluate coronary artery fistulas (CAFs) in coronary computed tomography angiography (coronary CTA) and to verify whether there is a correlation between a fistula's morphology and other cardiac functional findings and clinical data. Materials and methods: A group of 14,308 patients examined with coronary CTA was retrospectively analysed. The acquired data were related to the referrals. Results: The frequency of coronary artery fistulas was 0.43% in the examined population. Assessment of coronary artery disease was the most frequent indication for the examination. In 2 out of 3 cases the diagnosis of CAFs was incidental. Fistulas to cardiac chambers were significantly shorter than those to other vascular structures (19.9 vs. 61.8 mm, respectively; p = 0.001). The pulmonary trunk was the most frequent drainage site. Fistulas with a single supply and a single drainage site constituted the majority. A new morphological classification of CAFs was introduced, with linear, spiral, aneurysmal, grid-like and mixed types; the spiral type was the most numerous group. Patients with aneurysmal fistulas tended to have wider aortic and pulmonary trunk diameters. The smallest left ventricle fraction was observed in grid-like fistulas (48.0%, compared with 59.2% for all patients with fistulas; p = 0.001). Concomitant abnormalities were found in 13.1% of CAFs patients. Conclusions: Computed tomography angiography has proven to be a useful tool for CAF detection and morphological assessment. The proposed classification may simplify predicting whether a fistula has a significant influence on cardiac function; however, further studies are needed.

    Automatic Generation of Efficient Linear Algebra Programs

    The level of abstraction at which application experts reason about linear algebra computations and the level of abstraction used by developers of high-performance numerical linear algebra libraries do not match. The former is conveniently captured by high-level languages and libraries such as Matlab and Eigen, while the latter is expressed in terms of the kernels included in the BLAS and LAPACK libraries. Unfortunately, the translation from a high-level computation to an efficient sequence of kernels is a far from trivial task that requires extensive knowledge of both linear algebra and high-performance computing. Internally, almost all high-level languages and libraries use efficient kernels; however, the translation algorithms are too simplistic and thus lead to a suboptimal use of said kernels, with significant performance losses. In order to both achieve the productivity that comes with high-level languages and make use of the efficiency of low-level kernels, we are developing Linnea, a code generator for linear algebra problems. As input, Linnea takes a high-level description of a linear algebra problem and produces as output an efficient sequence of calls to high-performance kernels. On 25 application problems, the code generated by Linnea always outperforms Matlab, Julia, Eigen and Armadillo, with speedups up to and exceeding 10x.
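
    The translation gap the abstract describes can be illustrated with a small, hedged NumPy/SciPy sketch: a naive rendering of x = A⁻¹b materializes an explicit inverse, while a kernel-aware translation maps the same expression to a single factorization-based solve. This is illustrative only and does not reflect Linnea's actual input language or generated code.

    ```python
    # Naive vs. kernel-aware translation of the high-level expression x = inv(A) @ b.
    import numpy as np
    from scipy import linalg

    rng = np.random.default_rng(0)
    n = 500
    A = rng.standard_normal((n, n)) + n * np.eye(n)  # well-conditioned test matrix
    b = rng.standard_normal(n)

    x_naive = np.linalg.inv(A) @ b   # explicit O(n^3) inversion, then a matrix-vector product
    x_kernel = linalg.solve(A, b)    # one LU-based solve (LAPACK gesv); cheaper and more accurate

    print(np.allclose(x_naive, x_kernel))  # True (up to rounding)
    ```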

    Types from Data: Making Structured Data First-class Citizens in F#

    Most modern applications interact with external services and access data in structured formats such as XML, JSON and CSV. Static type systems do not understand such formats, often making data access more cumbersome. Should we give up and leave the messy world of external data to dynamic typing and runtime checks? Of course not! We present F# Data, a library that integrates external structured data into F#. As most real-world data does not come with an explicit schema, we develop a shape inference algorithm that infers a shape from representative sample documents. We then integrate the inferred shape into the F# type system using type providers. We formalize the process and prove a relative type soundness theorem. Our library significantly reduces the amount of data access code and provides additional safety guarantees when contrasted with the widely used weakly typed techniques.
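
    Since the type-provider machinery is specific to F#, the following Python sketch only illustrates the underlying idea of inferring a shape from representative sample documents; the inference rules shown are greatly simplified and are not the library's actual algorithm.

    ```python
    # Toy shape inference over sample JSON-like records: collect a field -> type mapping
    # and mark fields missing from some samples as optional.
    def infer_shape(samples):
        shape = {}
        for record in samples:
            for key, value in record.items():
                shape.setdefault(key, set()).add(type(value).__name__)
        return {
            key: {"types": sorted(types),
                  "optional": any(key not in record for record in samples)}
            for key, types in shape.items()
        }

    samples = [
        {"name": "Ada", "age": 36},
        {"name": "Grace", "age": 45, "email": "grace@example.org"},
    ]
    print(infer_shape(samples))
    # 'name' and 'age' are required strings/ints; 'email' is inferred as optional
    ```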

    Graphs in molecular biology

    Graph theoretical concepts are useful for the description and analysis of interactions and relationships in biological systems. We give a brief introduction to some of these concepts and their areas of application in molecular biology. We discuss software that is available through the Bioconductor project and present a simple example application: the integration of a protein-protein interaction network and a co-expression network.
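
    As a rough illustration of the kind of integration mentioned above, the sketch below intersects the edge sets of a protein-protein interaction network and a co-expression network; the paper's own examples use R/Bioconductor, whereas this Python/NetworkX snippet is only a toy analogue.

    ```python
    # Keep only protein-protein interactions that are also supported by co-expression.
    import networkx as nx

    ppi = nx.Graph([("A", "B"), ("B", "C"), ("C", "D")])       # physical interactions
    coexpr = nx.Graph([("A", "B"), ("C", "D"), ("A", "D")])    # co-expressed gene pairs

    supported = nx.Graph([e for e in ppi.edges if coexpr.has_edge(*e)])
    print(sorted(supported.edges()))  # [('A', 'B'), ('C', 'D')]
    ```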

    Refinement type contracts for verification of scientific investigative software

    Our scientific knowledge is increasingly built on software output. User code which defines data analysis pipelines and computational models is essential for research in the natural and social sciences, but little is known about how to ensure its correctness. The structure of this code and the development process used to build it limit the utility of traditional testing methodology. Formal methods for software verification have seen great success in ensuring code correctness but generally require more specialized training, development time, and funding than is available in the natural and social sciences. Here, we present a Python library which uses lightweight formal methods to provide correctness guarantees without the need for specialized knowledge or substantial time investment. Our package provides runtime verification of function entry and exit condition contracts using refinement types. It allows checking hyperproperties within contracts and offers automated test case generation to supplement online checking. We co-developed our tool with a medium-sized (≈3000 LOC) software package which simulates decision-making in cognitive neuroscience. In addition to helping us locate trivial bugs earlier in the development cycle, our tool was able to locate four bugs which may have been difficult to find using traditional testing methods. It was also able to find bugs in user code which did not contain contracts or refinement type annotations. This demonstrates how formal methods can be used to verify the correctness of scientific software that is difficult to test with mainstream approaches.
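
    A minimal Python sketch of a runtime entry/exit contract in the spirit described above; the decorator and its API are hypothetical stand-ins for illustration and are not the interface of the library presented in the paper.

    ```python
    # Runtime checking of refinement-style pre- and postconditions via a decorator.
    import functools

    def contract(pre=None, post=None):
        def decorate(fn):
            @functools.wraps(fn)
            def wrapper(*args, **kwargs):
                if pre is not None and not pre(*args, **kwargs):
                    raise ValueError(f"{fn.__name__}: entry condition violated")
                result = fn(*args, **kwargs)
                if post is not None and not post(result):
                    raise ValueError(f"{fn.__name__}: exit condition violated")
                return result
            return wrapper
        return decorate

    # Refinement-style spec: weights must be non-negative with positive sum on entry;
    # the returned probabilities must sum to 1 on exit.
    @contract(pre=lambda weights: all(w >= 0 for w in weights) and sum(weights) > 0,
              post=lambda probs: abs(sum(probs) - 1.0) < 1e-9)
    def normalize(weights):
        total = sum(weights)
        return [w / total for w in weights]

    print(normalize([1, 3]))  # [0.25, 0.75]
    # normalize([-1, 2])  # would raise ValueError: entry condition violated
    ```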