
    From Self-Interpreters to Normalization by Evaluation

    We characterize normalization by evaluation as the composition of a self-interpreter with a self-reducer using a special representation scheme, in the sense of Mogensen (1992). We do so by systematically deriving an untyped normalization by evaluation algorithm from a standard interpreter for the λ-calculus. The derived algorithm is not novel; indeed, other published algorithms can be obtained in the same manner through appropriate adaptations of the representation scheme.
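
    A minimal OCaml sketch of untyped normalization by evaluation shows the shape of the algorithm under discussion; it is our own illustrative reconstruction, not the paper's derived algorithm:

        (* Untyped NbE: evaluate de Bruijn terms into a semantic domain,
           then read values back as beta-normal terms. *)
        type term = Var of int | Lam of term | App of term * term
        type value = VLam of (value -> value) | VNeu of neutral
        and neutral = NVar of int | NApp of neutral * value

        let rec eval env = function
          | Var n -> List.nth env n
          | Lam b -> VLam (fun v -> eval (v :: env) b)
          | App (f, a) ->
              (match eval env f with
               | VLam g -> g (eval env a)
               | VNeu n -> VNeu (NApp (n, eval env a)))

        (* Read back at binder depth k; de Bruijn levels become indices. *)
        let rec reify k = function
          | VLam f -> Lam (reify (k + 1) (f (VNeu (NVar k))))
          | VNeu n -> reify_neu k n
        and reify_neu k = function
          | NVar l -> Var (k - l - 1)
          | NApp (n, v) -> App (reify_neu k n, reify k v)

        let normalize t = reify 0 (eval [] t)

    For instance, normalize (App (Lam (Var 0), Lam (Var 0))) returns Lam (Var 0): evaluation performs the beta-reduction in the host language, and reification rebuilds the syntax.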

    Typeful Normalization by Evaluation

    We present the first typeful implementation of Normalization by Evaluation for the simply typed lambda-calculus with sums and control operators: we guarantee type preservation and eta-long, beta-normal forms (modulo commuting conversions) using only Generalized Algebraic Data Types (GADTs) in a general-purpose programming language, here OCaml, and we account for sums and control operators with Continuation-Passing Style (CPS). First, we implement the standard NbE algorithm for the implicational fragment in a typeful way that is correct by construction. We then derive its call-by-value continuation-passing counterpart, which maps a lambda-term with sums and call/cc into a CPS term in normal form, expressed in a dedicated typed syntax. Beyond showcasing the expressive power of GADTs, we emphasize that type inference gives a smooth way to re-derive the encodings of the syntax and typing of normal forms in CPS.
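
    The GADT idiom the abstract describes can be sketched in OCaml for the implicational fragment alone: object types index the host types of semantic values, so reify and reflect are type-directed by construction. This is a simplified illustration with normal forms printed as strings, not the paper's code:

        (* Typed NbE: object type 'a tp indexes the semantic domain. *)
        type _ tp =
          | Base  : string tp                       (* base values = code strings *)
          | Arrow : 'a tp * 'b tp -> ('a -> 'b) tp  (* functions = host closures *)

        let gensym =
          let n = ref 0 in
          fun () -> incr n; Printf.sprintf "x%d" !n

        (* reify: value -> code of its eta-long normal form;
           reflect: code of a neutral term -> value. *)
        let rec reify : type a. a tp -> a -> string = fun tp v ->
          match tp with
          | Base -> v
          | Arrow (a, b) ->
              let x = gensym () in
              Printf.sprintf "(fun %s -> %s)" x (reify b (v (reflect a x)))
        and reflect : type a. a tp -> string -> a = fun tp e ->
          match tp with
          | Base -> e
          | Arrow (a, b) -> fun v -> reflect b (Printf.sprintf "(%s %s)" e (reify a v))

    For example, reify (Arrow (Arrow (Base, Base), Arrow (Base, Base))) (fun f x -> f x) produces the eta-long normal form "(fun x1 -> (fun x2 -> (x1 x2)))".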

    Efficient normalization by evaluation

    Dependently typed theorem provers allow arbitrary terms in types. It is convenient to identify large classes of terms during type checking, hence many such systems provide some form of conversion rule. A standard algorithm for testing the convertibility of two types consists in normalizing them and then testing the normal forms for syntactic equality. Normalization by evaluation is a standard technique that lets existing compilers and runtimes for functional languages implement normalizers without peeking under the hood, yielding a system that is fast yet cheap in terms of implementation effort. Our focus is on the performance of untyped normalization by evaluation. We demonstrate that, with the aid of a standard optimization for higher-order programs (namely uncurrying) and by reusing the evaluator's pattern-matching facilities for datatypes, we can obtain a normalizer that evaluates non-functional values about as fast as the underlying evaluator, yet can also fully normalize functional values; in other words, it partially evaluates functions efficiently.
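
    The idea of reusing the evaluator's datatypes can be sketched in OCaml for a hypothetical mini-language with natural numbers (uncurrying is omitted here): object-level data constructors are host constructors, so the underlying evaluator's native pattern matching handles them at full speed, and only functions and neutral terms need the readback machinery:

        type value =
          | VZero
          | VSucc of value                  (* naturals reuse host constructors *)
          | VFun  of (value -> value)       (* functions remain host closures *)
          | VNeu  of string * value list    (* neutral: variable with its spine *)

        let apply f a =
          match f with
          | VFun g -> g a                   (* a native host-level call *)
          | VNeu (x, sp) -> VNeu (x, sp @ [a])
          | _ -> failwith "ill-typed application"

        (* Non-functional values print directly; closures are probed with
           a fresh neutral variable so normalization goes under binders. *)
        let rec reify fresh = function
          | VZero -> "zero"
          | VSucc v -> "(succ " ^ reify fresh v ^ ")"
          | VFun g ->
              let x = Printf.sprintf "x%d" fresh in
              "(fun " ^ x ^ " -> " ^ reify (fresh + 1) (g (VNeu (x, []))) ^ ")"
          | VNeu (x, sp) ->
              List.fold_left (fun h a -> "(" ^ h ^ " " ^ reify fresh a ^ ")") x sp

    Evaluating a closed first-order program never leaves the first two constructors, which is why such a normalizer runs data about as fast as the host evaluator.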

    Context-based Normalization of Histological Stains using Deep Convolutional Features

    While human observers are able to cope with variations in the color and appearance of histological stains, digital pathology algorithms commonly require a well-normalized setting to achieve peak performance, especially when a limited amount of labeled data is available. This work provides a fully automated, end-to-end learning-based setup for normalizing histological stains that considers the texture context of the tissue. We introduce Feature Aware Normalization, which extends the framework of batch normalization with gating elements from Long Short-Term Memory units to normalize across different spatial regions of interest. By incorporating a pretrained deep neural network as a feature extractor steering a pixelwise processing pipeline, we achieve excellent normalization results and ensure a consistent representation of color and texture. The evaluation compares color histogram deviations, structural similarity, and the color volume obtained by the different methods. (In: 3rd Workshop on Deep Learning in Medical Image Analysis, DLMIA 2017.)
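
    Reading the description, Feature Aware Normalization modulates a normalized signal by a scale and shift predicted from pretrained deep features; one plausible form, in LaTeX (our paraphrase of the architecture, not the paper's exact equations), is

        \[
          \mathrm{FAN}(x) \;=\; \gamma(z) \odot \frac{x - \mu}{\sigma} \;+\; \beta(z),
          \qquad z = F(x),
        \]

    where \mu and \sigma are normalization statistics as in batch normalization, F is the pretrained feature extractor, and \gamma(\cdot), \beta(\cdot) are learned, sigmoid-gated (LSTM-style) functions applied per spatial position.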

    Likelihood-preserving normalization in multiple equation models

    Causal analysis in multiple equation models often revolves around the evaluation of the effects of an exogenous shift in a structural equation. When taking into account the uncertainty implied by the shape of the likelihood, we argue that how normalization is implemented matters for inferential conclusions around the maximum likelihood (ML) estimates of such effects. We develop a general method that eliminates the distortion of finite-sample inferences about these ML estimates after normalization. We show that our likelihood-preserving normalization always maintains coherent economic interpretations, while an arbitrary implementation of normalization can lead to ill-determined inferential results. Keywords: time-series analysis; supply and demand; demand for money; money supply.
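
    The issue can be made concrete with a standard Gaussian simultaneous-equations setup (our illustration of the general phenomenon, not the paper's specific rule). Flipping the sign of any structural equation leaves the likelihood unchanged, so the surface has mirrored peaks and a normalization must pick one representative per sign orbit:

        \[
          A' y_t = \varepsilon_t, \quad \varepsilon_t \sim \mathcal{N}(0, I_n)
          \;\Longrightarrow\;
          L(A \mid Y) = L(A D \mid Y)
          \quad \text{for all } D = \operatorname{diag}(\pm 1, \dots, \pm 1).
        \]

    A normalization whose boundary cuts through a peak, rather than between mirrored peaks, spreads finite-sample probability mass across sign-flipped regions; this is the kind of distortion a likelihood-preserving rule is designed to avoid.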

    Practical Normalization by Evaluation for EDSLs

    Embedded domain-specific languages (eDSLs) are typically implemented in a rich host language, such as Haskell, using a combination of deep and shallow embedding techniques. While such a combination enables programmers to exploit the execution mechanism of Haskell to build and specialize eDSL programs, it blurs the distinction between the host language and the eDSL. As a consequence, extension with features such as sums and effects requires a significant amount of ingenuity from the eDSL designer. In this paper, we demonstrate that Normalization by Evaluation (NbE) provides a principled framework for building, extending, and customizing eDSLs. We present a comprehensive treatment of NbE for deeply embedded eDSLs in Haskell covering a rich set of features, including sums, arrays, exceptions, and state, while addressing practical concerns about normalization such as code expansion and the addition of domain-specific features.
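
    Although the paper works in Haskell, the core move carries over: evaluate a deep embedding into a semantic domain in which known values compute in the host, then read back a normalized AST. A toy OCaml sketch for a first-order eDSL (not the paper's library):

        (* Deep embedding of a tiny eDSL with one kind of free input. *)
        type expr =
          | Lit of int
          | Inp of string                 (* an unknown DSL input *)
          | Add of expr * expr
          | If0 of expr * expr * expr

        (* Semantic domain: static (host) values vs. residual code. *)
        type sem = Known of int | Resid of expr

        let rec eval = function
          | Lit n -> Known n
          | Inp x -> Resid (Inp x)
          | Add (a, b) ->
              (match eval a, eval b with
               | Known m, Known n -> Known (m + n)   (* computed in the host *)
               | sa, sb -> Resid (Add (reify sa, reify sb)))
          | If0 (c, t, e) ->
              (match eval c with
               | Known 0 -> eval t                   (* branch is decided *)
               | Known _ -> eval e
               | Resid c' -> Resid (If0 (c', reify (eval t), reify (eval e))))
        and reify = function
          | Known n -> Lit n
          | Resid e -> e

        let normalize e = reify (eval e)

    For example, normalize (If0 (Lit 0, Add (Lit 1, Lit 2), Inp "x")) yields Lit 3: the specialization that deep/shallow combinations achieve by hand falls out of the evaluator.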