
    Short Cut Fusion is Correct

    Fusion is the process of removing intermediate data structures from modularly constructed functional programs. Short cut fusion is a particular fusion technique which uses a single, local transformation rule to fuse compositions of list-processing functions. Short cut fusion has traditionally been treated purely syntactically, and justifications for it have appealed either to intuition or to "free theorems", even though the latter have not been known to hold in languages supporting higher-order polymorphic functions and fixpoint recursion. In this paper we use Pitts' recent demonstration that contextual equivalence in such languages is parametric to provide the first formal proof of the correctness of short cut fusion for them. In particular, we show that programs which have undergone short cut fusion are contextually equivalent to their unfused counterparts.
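
    As a concrete illustration, the following is a minimal Haskell sketch of the local rule the abstract refers to; the producer upto and the consumers sumUpto and sumUptoFused are invented for illustration, not taken from the paper.

        {-# LANGUAGE RankNTypes #-}

        -- build abstracts a list producer over the constructors (:) and [].
        build :: (forall b. (a -> b -> b) -> b -> b) -> [a]
        build g = g (:) []

        -- The short cut (foldr/build) rule rewrites
        --     foldr k z (build g)
        -- to
        --     g k z
        -- eliminating the intermediate list.

        -- Illustrative producer: the range [lo..hi] written via build.
        upto :: Int -> Int -> [Int]
        upto lo hi = build (\c n ->
            let go i = if i > hi then n else i `c` go (i + 1)
            in go lo)

        -- Modular composition that allocates an intermediate list...
        sumUpto :: Int -> Int -> Int
        sumUpto lo hi = foldr (+) 0 (upto lo hi)

        -- ...which the rule fuses into a listless loop:
        sumUptoFused :: Int -> Int -> Int
        sumUptoFused lo hi =
            let go i = if i > hi then 0 else i + go (i + 1)
            in go lo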

    Short Cut Fusion: Proved and Improved

    Short cut fusion is a particular program transformation technique which uses a single, local transformation, called the foldr/build rule, to remove certain intermediate lists from modularly constructed functional programs. Arguments that short cut fusion is correct typically appeal either to intuition or to "free theorems", even though the latter have not been known to hold for the languages supporting higher-order polymorphic functions and fixed point recursion in which short cut fusion is usually applied. In this paper we use Pitts' recent demonstration that contextual equivalence in such languages is relationally parametric to prove that programs in them which have undergone short cut fusion are contextually equivalent to their unfused counterparts. The same techniques in fact yield a much more general result. For each algebraic data type we define a generalization, augment, of build which constructs substitution instances of its associated data structures. Together with the well-known generalization, cata, of foldr to arbitrary algebraic data types, this allows us to formulate, and prove correct, a contextual equivalence-preserving cata/augment fusion rule for each such type. These rules optimize compositions of functions that uniformly consume algebraic data structures with functions that uniformly produce substitution instances of them.
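
    A minimal Haskell sketch of the augment generalization for the list case; the fusion rule is stated schematically in a comment, and append is an illustrative example rather than code from the paper.

        {-# LANGUAGE RankNTypes #-}

        -- augment generalizes build: the producer may graft an existing
        -- list onto the end, i.e. it builds a substitution instance.
        augment :: (forall b. (a -> b -> b) -> b -> b) -> [a] -> [a]
        augment g xs = g (:) xs

        -- build is the special case with nothing grafted on:
        --     build g = augment g []
        -- and the foldr/augment fusion rule reads
        --     foldr k z (augment g xs) = g k (foldr k z xs)

        -- Illustrative example: append written with augment, so that a
        -- consuming foldr can fuse across it.
        append :: [a] -> [a] -> [a]
        append xs ys = augment (\c n -> foldr c n xs) ys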

    A Generalization of Short-Cut Fusion and Its Correctness Proof

    Short-cut fusion is a program transformation technique that uses a single, local transformation, called the foldr/build rule, to remove certain intermediate lists from modularly constructed functional programs. Arguments that short-cut fusion is correct typically appeal either to intuition or to "free theorems", even though the latter have not been known to hold for the languages supporting higher-order polymorphic functions and fixed point recursion in which short-cut fusion is usually applied. In this paper we use Pitts' recent demonstration that contextual equivalence in such languages is relationally parametric to prove that programs in them which have undergone short-cut fusion are contextually equivalent to their unfused counterparts. For each algebraic data type we then define a generalization of build which constructs substitution instances of its associated data structures, and use Pitts' techniques to prove the correctness of a contextual equivalence-preserving fusion rule which generalizes short-cut fusion. These rules optimize compositions of functions that uniformly consume algebraic data structures with functions that uniformly produce substitution instances of those data structures.
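
    To make the generalization to other algebraic data types concrete, here is a hedged Haskell sketch for one such type, binary trees; the names Tree, cataT, buildT and augmentT are invented for illustration and are not the paper's definitions.

        {-# LANGUAGE RankNTypes #-}

        data Tree a = Leaf | Node (Tree a) a (Tree a)

        -- cata: the fold of Tree, consuming it uniformly.
        cataT :: (b -> a -> b -> b) -> b -> Tree a -> b
        cataT _    leaf Leaf         = leaf
        cataT node leaf (Node l x r) =
          node (cataT node leaf l) x (cataT node leaf r)

        -- buildT: uniform production, abstracted over the constructors.
        buildT :: (forall b. (b -> a -> b -> b) -> b -> b) -> Tree a
        buildT g = g Node Leaf

        -- The generalized producer grafts an existing tree in place of
        -- the leaves, constructing a substitution instance:
        augmentT :: (forall b. (b -> a -> b -> b) -> b -> b) -> Tree a -> Tree a
        augmentT g t = g Node t

        -- The corresponding fusion rules, schematically:
        --     cataT node leaf (buildT g)     = g node leaf
        --     cataT node leaf (augmentT g t) = g node (cataT node leaf t)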

    Monadic fold, Monadic build, Monadic Short Cut Fusion

    Short cut fusion improves the efficiency of modularly constructed programs by eliminating intermediate data structures produced by one program component and immediately consumed by another. We define a combinator which expresses uniform production of data structures in monadic contexts, and is the natural counterpart to the well-known monadic fold which consumes them. Like the monadic fold, our new combinator quantifies over monadic algebras rather than standard ones. Together with the monadic fold, it gives rise to a new short cut fusion rule for eliminating intermediate data structures in monadic contexts. This new rule differs significantly from previous short cut fusion rules, all of which are based on combinators which quantify over standard, rather than monadic, algebras. We give examples illustrating the benefits of quantifying over monadic algebras, prove our new fusion rule correct, and show how it can improve programs. We also consider its coalgebraic dual.
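
    One plausible shape of such a pair of combinators, sketched in Haskell for the list case; the names foldrM' and buildM, and the exact statement of the rule, are assumptions made for illustration rather than the paper's definitions.

        {-# LANGUAGE RankNTypes #-}

        -- A monadic fold for lists: the algebra (a -> b -> m b, m b)
        -- is a monadic algebra rather than a standard one.
        foldrM' :: Monad m => (a -> b -> m b) -> m b -> [a] -> m b
        foldrM' _ z []     = z
        foldrM' k z (x:xs) = foldrM' k z xs >>= k x

        -- Its production counterpart: a list built in a monadic context,
        -- quantifying over monadic algebras.
        buildM :: Monad m => (forall b. (a -> b -> m b) -> m b -> m b) -> m [a]
        buildM g = g (\x xs -> return (x : xs)) (return [])

        -- A monadic short cut fusion rule of the kind described,
        -- schematically (valid only under suitable conditions on g;
        -- see the paper for the precise statement):
        --     buildM g >>= foldrM' k z   =   g k z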

    Leptoproduction of Heavy Quarks II -- A Unified QCD Formulation of Charged and Neutral Current Processes from Fixed-target to Collider Energies

    A unified QCD formulation of leptoproduction of massive quarks in charged current and neutral current processes is described. This involves adopting consistent factorization and renormalization schemes which encompass both vector-boson-gluon-fusion (flavor creation) and vector-boson-massive-quark-scattering (flavor excitation) production mechanisms. It provides a framework which is valid from the threshold for producing the massive quark (where gluon fusion is dominant) to the very high energy regime where the typical energy scale mu is much larger than the quark mass m_Q (where quark scattering should be prevalent). This approach effectively resums all large logarithms of the type (alpha_s(mu) log(mu^2/m_Q^2))^n which limit the validity of existing fixed-order calculations to the region mu ~ O(m_Q). We show that the (massive) quark-scattering contribution (after subtraction of overlaps) is important in most parts of the (x, Q) plane except near the threshold region. We demonstrate that the factorization scale dependence of the structure functions calculated in this approach is substantially weaker than that of the fixed-order calculations, as one would expect from a more consistent formulation. Comment: LaTeX format, 29 pages, 11 figures. Revised to make auto-TeX-able.
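
    Rendered in display form, the tower of logarithms the unified scheme resums is, schematically (the coefficients c_n are illustrative notation, not the paper's):

        F(x, Q^2) \;\sim\; \sum_{n} c_n(x)
          \left[ \alpha_s(\mu) \, \ln\frac{\mu^2}{m_Q^2} \right]^n ,
        \qquad \mu \gg m_Q .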

    Spatial and Wavenumber Resolution of Doppler Reflectometry

    Doppler reflectometry spatial and wavenumber resolution is analyzed within the framework of the linear Born approximation in a slab plasma model. An explicit expression for the signal backscattering spectrum is obtained in terms of the wavenumber and frequency spectra of the turbulence, which is assumed to be radially statistically inhomogeneous. A scattering efficiency for both back and forward scattering (in the radial direction) is introduced and shown to be inversely proportional to the square of the radial wavenumber of the probing wave at the fluctuation location, thus making the spatial resolution of the diagnostic sensitive to the density profile. It is shown that in the case of forward scattering additional localization can be provided by the antenna diagram. It is demonstrated that in the case of backscattering the spatial resolution can be better if the turbulence spectrum at high radial wavenumbers is suppressed. An improvement of Doppler reflectometry data localization by focusing the probing beam onto the cut-off is proposed and described. The possibility of interpreting Doppler reflectometry data on the basis of the obtained expressions is shown. Comment: http://stacks.iop.org/0741-3335/46/114
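
    The proportionality claimed for the scattering efficiency can be written schematically as follows (the notation is assumed here, not taken from the paper), with k_r(x) the radial wavenumber of the probing wave at the fluctuation location x:

        \eta_{\mathrm{scatt}}(x) \;\propto\; \frac{1}{k_r^2(x)} .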

    Type-Inference Based Short Cut Deforestation (nearly) without Inlining

    Deforestation optimises a functional program by transforming it into another one that does not create certain intermediate data structures. In [ICFP'99] we presented a type-inference based deforestation algorithm which performs extensive inlining. However, across module boundaries only limited inlining is practically feasible. Furthermore, inlining is a non-trivial transformation which is therefore best implemented as a separate optimisation pass. To perform short cut deforestation (nearly) without inlining, Gill suggested splitting definitions into workers and wrappers and inlining only the small wrappers, which transfer the information needed for deforestation. We show that Gill's use of the function build limits deforestation and note that his reasons for using build do not apply to our approach. Hence we develop a more general worker/wrapper scheme without build. We give a type-inference based algorithm which splits definitions into workers and wrappers. Finally, we show that we can deforest more expressions with the worker/wrapper scheme than with the algorithm based on inlining.
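
    A minimal Haskell sketch of a build-free worker/wrapper split of the kind the abstract describes; fromTo and its split are invented for illustration, not taken from the paper.

        -- Original, directly recursive list producer.
        fromTo :: Int -> Int -> [Int]
        fromTo lo hi = if lo > hi then [] else lo : fromTo (lo + 1) hi

        -- Worker: the recursion, abstracted over the list constructors.
        -- Note that no rank-2 build combinator is involved.
        fromToW :: (Int -> b -> b) -> b -> Int -> Int -> b
        fromToW c n lo hi =
            if lo > hi then n else lo `c` fromToW c n (lo + 1) hi

        -- Wrapper: small enough to inline across module boundaries; it
        -- re-instantiates the worker at the real constructors.
        fromTo' :: Int -> Int -> [Int]
        fromTo' = fromToW (:) []

        -- After inlining only the wrapper, a consumer such as
        --     foldr (+) 0 (fromTo' 1 10)
        -- can be deforested to
        --     fromToW (+) 0 1 10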

    Edible Rocks

    This lesson has been designed as a comfortable introduction to describing meteorites. It helps students become better observers by making a connection between the familiar (candy bars) and the unfamiliar (meteorites). Edible "rocks" are used in a scientific context, showing students the importance of observation, teamwork and communication skills. In everyday terms, students draw and describe the food. They pair their observations with short descriptions that are in geologic "Field Note" style. As the teacher and class review, appropriate geologic terminology may be substituted by the teacher and subsequently embraced by even very young students. Educational levels: Intermediate elementary, Middle school.