8 research outputs found

    Adventures in the (not so) Complex Space

    No full text
    We report on the progress of a constructive mechanization for a small subset of signal processing theory, built upon the SSREFLECT and MATHCOMP libraries. The development was started to provide mechanized semantics for audio programming languages. Currently, we have formalized several standard properties of the Discrete Fourier Transform, such as its unitary matrix form and its power and convolution theorems. Future goals include transfer functions and constant overlap-add processing. At the workshop, we aim to discuss the needs and limits of our current approach, surveying some mathematical concepts not covered by existing libraries, and similar efforts in other frameworks and theorem provers.
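
    For orientation, the textbook statements of the properties mentioned (standard definitions, not quotations from the MATHCOMP development): for $x \in \mathbb{C}^N$, the DFT is

        $$\hat{x}_k = \sum_{n=0}^{N-1} x_n\, e^{-2\pi i k n / N}, \qquad 0 \le k < N.$$

    Its unitary matrix form is $F_N = \tfrac{1}{\sqrt{N}}\,(\omega^{jk})_{0 \le j,k < N}$ with $\omega = e^{-2\pi i / N}$, satisfying $F_N F_N^{\dagger} = I$. The convolution theorem states $\widehat{x \circledast y}_k = \hat{x}_k\, \hat{y}_k$ for circular convolution $\circledast$, and the power (Parseval) theorem states $\sum_n |x_n|^2 = \tfrac{1}{N} \sum_k |\hat{x}_k|^2$.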

    Formalization of Transform Methods using HOL Light

    Full text link
    Transform methods, like Laplace and Fourier, are frequently used to analyze the dynamical behaviour of engineering and physical systems, based on their transfer functions and frequency responses, or on the solutions of their corresponding differential equations. In this paper, we present an ongoing project, which focuses on the higher-order logic formalization of transform methods using the HOL Light theorem prover. In particular, we present the motivation for the formalization, followed by the related work. Next, we present the work completed so far, highlighting some of the challenges faced during the formalization. Finally, we present a roadmap for achieving our objectives, the current status, and the future goals of this project.
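
    For background, the standard definitions behind the two transforms (textbook forms, not the paper's HOL Light encodings):

        $$\mathcal{L}\{f\}(s) = \int_0^{\infty} f(t)\, e^{-st}\, dt, \qquad \mathcal{F}\{f\}(\omega) = \int_{-\infty}^{\infty} f(t)\, e^{-i \omega t}\, dt.$$

    The differentiation rule $\mathcal{L}\{f'\}(s) = s\, \mathcal{L}\{f\}(s) - f(0)$ turns a linear constant-coefficient ODE into an algebraic equation; for example, $a\, y'' + b\, y' + c\, y = x$ with zero initial conditions yields the transfer function $H(s) = Y(s)/X(s) = 1/(a s^2 + b s + c)$, whose frequency response is $H(i\omega)$.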

    A Formalisation of a Fast Fourier Transform

    Get PDF
    This note explains how a standard algorithm that constructs the discrete Fourier transform has been formalised and proved correct in the Coq proof assistant using the SSReflect extension.
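
    For context, the standard divide-and-conquer algorithm is the radix-2 Cooley-Tukey FFT, whose defining recurrence is given below as general background (the note's exact algorithm may differ in detail). With $E$ and $O$ the length-$N/2$ DFTs of the even- and odd-indexed subsequences and $\omega_N = e^{-2\pi i / N}$:

        $$\hat{x}_k = E_k + \omega_N^k\, O_k, \qquad \hat{x}_{k + N/2} = E_k - \omega_N^k\, O_k, \qquad 0 \le k < N/2.$$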

    An approach for the formal verification of DSP designs using theorem proving

    Get PDF
    This paper proposes a framework for incorporating formal methods into the design flow of digital signal processing (DSP) systems in a rigorous way. In the proposed approach, DSP descriptions are modeled and verified at different abstraction levels using higher-order logic, based on the HOL theorem prover. This framework enables the formal verification of DSP designs that in the past could only be verified partially using conventional simulation techniques. To this end, a shallow embedding of DSP descriptions in HOL is provided at the floating-point (FP), fixed-point (FXP), behavioral, register-transfer (RTL), and netlist gate levels. The paper makes use of an existing formalization of FP theory in HOL and a parallel one developed for FXP arithmetic. HOL's high level of abstraction allows a seamless hierarchical verification encompassing the whole DSP design path, starting from top-level FP and FXP algorithmic descriptions down to RTL and gate-level implementations. The paper illustrates the new verification framework on the fast Fourier transform (FFT) algorithm as a case study.
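
    Since the FP/FXP distinction drives the verification hierarchy described above, here is a minimal sketch of what fixed-point quantization means. The names (fracBits, toFxp, fromFxp) are illustrative, not taken from the paper's HOL development.

    ```haskell
    -- A minimal sketch of fixed-point (FXP) quantization, to make the
    -- FP/FXP distinction concrete.

    fracBits :: Int
    fracBits = 8                      -- Q-format with 8 fractional bits

    -- Quantize a real value to the nearest multiple of 2^(-fracBits).
    toFxp :: Double -> Int
    toFxp x = round (x * 2 ^ fracBits)

    -- Interpret the stored integer back as a real value.
    fromFxp :: Int -> Double
    fromFxp n = fromIntegral n / 2 ^ fracBits

    main :: IO ()
    main = do
      let x = 0.3 :: Double
      -- round-trip error is bounded by 2^(-fracBits-1) = 1/512
      print (fromFxp (toFxp x) - x)
    ```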

    Certifying the fast Fourier transform with Coq

    No full text
    We program the Fast Fourier Transform in type theory, using the tool Coq. We prove its correctness and the correctness of the Inverse Fourier Transform. A type of trees representing vectors with interleaved elements is defined to facilitate the definition of the transform by structural recursion. We define several operations and proof tools for this data structure, leading to a simple proof of correctness of the algorithm. The inverse transform, on the other hand, is implemented on a different representation of the data, one that makes reasoning about summations easier. The link between the two data types is given by an isomorphism. This work is an illustration of the two-level approach to proof development and of the principle of adapting the data representation to the specific algorithm under study. CtCoq, a graphical user interface for Coq, helped in the development. We discuss the characteristics and usefulness of this tool.
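
    A minimal Haskell analogue of the interleaved-tree idea described above (the types and names are illustrative; the paper's Coq development differs in detail): the left and right children hold the even- and odd-indexed elements, so the Cooley-Tukey recursion becomes structural.

    ```haskell
    import Data.Complex (Complex, cis)

    -- A vector of length 2^n as a balanced tree: the left child holds the
    -- even-indexed elements, the right child the odd-indexed ones.
    data Tree a = Leaf a | Node (Tree a) (Tree a)

    -- Radix-2 decimation-in-time FFT by structural recursion on the tree;
    -- no separate well-foundedness argument is needed, which is the point
    -- of the representation.
    fft :: Tree (Complex Double) -> [Complex Double]
    fft (Leaf x)          = [x]
    fft (Node evens odds) =
      let es = fft evens                 -- DFT of the even-indexed half
          os = fft odds                  -- DFT of the odd-indexed half
          n  = 2 * length es             -- total length at this level
          tw k = cis (-2 * pi * fromIntegral k / fromIntegral n)
          ts = zipWith (\k o -> tw k * o) [0 :: Int ..] os  -- twiddled odd part
      in zipWith (+) es ts ++ zipWith (-) es ts
    ```

    For example, the four-point input [x0, x1, x2, x3] is stored as Node (Node (Leaf x0) (Leaf x2)) (Node (Leaf x1) (Leaf x3)).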

    Bases de Gröbner: desarrollo formal en Coq

    Get PDF
    [Abstract] First, the most practical notions of the Coq system needed to understand the formalization of the mathematical theory of polynomials, their reduction, and Gröbner bases are covered, restricted to what will be needed later. The specific work begins with the formalization of terms in "n" variables, together with the most common polynomial operations, in the Coq system. The lexicographic order is implemented, going in depth into the proof that this order is well founded. To formalize the ring of polynomials in several variables, the proofs for the abstract field structure are described. Polynomials and canonical polynomials are implemented, formalizing an original explicit equality on polynomials and also describing their operations. An order on polynomials is implemented and shown to be Noetherian. The concept of an ideal is implemented as well. The division algorithm for polynomials in several variables is generalized in the Coq system. Once this division, called reduction, is implemented, the relation between congruence and reduction is studied, obtaining the normal form modulo a set of polynomials. Finally, the concept of a Gröbner basis is introduced, and its equivalence with alternative characterizations is studied and proved. To establish these equivalences, a version of Newman's Lemma is given, implementing a recursion scheme over canonical polynomials, together with properties of the confluence of reduction.
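
    To orient the reader, the key equivalence in question (standard theory, stated here for context rather than quoted from the thesis): for polynomials $f, g$ with leading terms $\mathrm{lt}(f), \mathrm{lt}(g)$ and $L = \mathrm{lcm}(\mathrm{lm}(f), \mathrm{lm}(g))$, the S-polynomial is

        $$S(f,g) = \frac{L}{\mathrm{lt}(f)}\, f - \frac{L}{\mathrm{lt}(g)}\, g,$$

    and Buchberger's criterion states that a finite set $G$ is a Gröbner basis of the ideal it generates iff every $S(g_i, g_j)$ with $g_i, g_j \in G$ reduces to $0$ modulo $G$; equivalently, iff every element of the ideal has normal form $0$, i.e. reduction modulo $G$ is confluent.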

    Erasure in dependently typed programming

    Get PDF
    It is important to reduce the cost of correctness in programming. Dependent types and related techniques, such as type-driven programming, offer ways to do so. Some parts of dependently typed programs constitute evidence of their type-correctness and, once checked, are unnecessary for execution. These parts can easily become asymptotically larger than the remaining runtime-useful computation, which can cause linear-time algorithms to run in exponential time, or worse. It would be unacceptable, and would contradict our goal of reducing the cost of correctness, to make programs run slower by only describing them more precisely. Current systems cannot erase such computation satisfactorily. By modelling erasure indirectly through type universes or irrelevance, they impose the limitations of those mechanisms on erasure. Some useless computation then cannot be erased and idiomatic programs remain asymptotically sub-optimal. This dissertation explains why we need erasure, shows that it is different from other concepts like irrelevance, and proposes two ways of erasing non-computational data. One is an untyped flow-based useless-variable elimination, adapted for dependently typed languages, currently implemented in the Idris 1 compiler. The other is the main contribution of the dissertation: a dependently typed core calculus with erasure annotations, full dependent pattern matching, and an algorithm that infers erasure annotations from unannotated (or partially annotated) programs. I show that erasure in well-typed programs is sound in that it commutes with single-step reduction. Assuming the Church-Rosser property of reduction, I show that properties such as Subject Reduction hold, which extends the soundness result to multi-step reduction. I also show that the presented erasure inference is sound and complete with respect to the typing rules; that this approach can be extended with various forms of erasure polymorphism; that it works well with monadic I/O and foreign functions; and that it is effective in that it not only removes the runtime overhead caused by dependent typing in the presented examples, but can also shorten compilation times.
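
    To make the motivation concrete, a small analogue in Haskell (chosen for illustration; the dissertation targets Idris, and Haskell's type indices are erased by construction, whereas the dissertation must infer erasure for genuinely dependent data):

    ```haskell
    {-# LANGUAGE DataKinds, GADTs, KindSignatures #-}

    data Nat = Z | S Nat

    -- Length-indexed vectors: the Nat index is compile-time evidence only
    -- and never exists at run time.
    data Vec (n :: Nat) a where
      Nil  :: Vec 'Z a
      Cons :: a -> Vec n a -> Vec ('S n) a

    -- The index rules out the empty case statically, so neither a runtime
    -- length nor a runtime emptiness check is carried around.
    safeHead :: Vec ('S n) a -> a
    safeHead (Cons x _) = x
    ```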