
    Scala-Virtualized: Linguistic Reuse for Deep Embeddings

    Scala-Virtualized extends the Scala language to better support hosting embedded DSLs. Scala is an expressive language that provides a flexible syntax, type-level computation using implicits, and other features that facilitate the development of embedded DSLs. However, many of these features work well only for shallow embeddings, i.e. DSLs which are implemented as plain libraries. Shallow embeddings automatically profit from features of the host language through linguistic reuse: any DSL expression is just a regular Scala expression. But in many cases, directly executing DSL programs within the host language is not enough and deep embeddings are needed, which reify DSL programs into a data structure representation that can be analyzed, optimized, or further translated. For deep embeddings, linguistic reuse is no longer automatic. Scala-Virtualized defines many of the language’s built-in constructs as method calls, which enables DSLs to redefine the built-in semantics using familiar language mechanisms like overloading and overriding. This in turn enables an easier progression from shallow to deep embeddings, as core language constructs such as conditionals or pattern matching can be redefined to build a reified representation of the operation itself. While this facility brings shallow, syntactic reuse to deep embeddings, we also present examples of what we call deep linguistic reuse: combining shallow and deep components in a single DSL in such a way that certain features are fully implemented in the shallow embedding part and do not need to be reified at the deep embedding level.
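
    The shallow-to-deep progression the abstract describes can be pictured with a minimal plain-Scala sketch. Instead of evaluating a conditional, a "virtualized" conditional is a method call that builds an IR node a later phase can analyze; the Exp, Const, IfThenElse and ifThenElse names below are illustrative assumptions, not the Scala-Virtualized API.

        sealed trait Exp[T]
        case class Const[T](value: T) extends Exp[T]
        case class IfThenElse[T](cond: Exp[Boolean], thenBranch: Exp[T], elseBranch: Exp[T]) extends Exp[T]

        object DeepDSL {
          // In a virtualized setting, `if (c) a else b` written in DSL code would
          // desugar to a call like this one, so the DSL author decides what a
          // conditional means -- here it is reified rather than evaluated.
          def ifThenElse[T](cond: Exp[Boolean], thenBranch: => Exp[T], elseBranch: => Exp[T]): Exp[T] =
            IfThenElse(cond, thenBranch, elseBranch)
        }

        object ShallowToDeepDemo {
          def main(args: Array[String]): Unit = {
            import DeepDSL._
            // Builds a data-structure representation of the conditional.
            val program: Exp[Int] = ifThenElse(Const(true), Const(1), Const(2))
            println(program) // IfThenElse(Const(true),Const(1),Const(2))
          }
        }

    In a shallow embedding the same ifThenElse could simply evaluate the condition, which is what makes the progression from shallow to deep incremental.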

    Late-bound code generation

    Each time a function or method is invoked during the execution of a program, a stream of instructions is issued to some underlying hardware platform. But exactly what underlying hardware, and which instructions, is usually left implicit. However, in certain situations it becomes important to control these decisions. For example, particular problems can only be solved in real-time when scheduled on specialised accelerators, such as graphics coprocessors or computing clusters. We introduce a novel operator for hygienically reifying the behaviour of a runtime function instance as a syntactic fragment, in a language which may in general differ from the source function definition. Translation and optimisation are performed by recursively invoked, dynamically dispatched code generators. Side-effecting operations are permitted, and their ordering is preserved. We compare our operator with other techniques for pragmatic control, observing that the use of our operator supports lifting arbitrary mutable objects, and neither requires rewriting sections of the source program in a multi-level language, nor interferes with the interface to individual software components. Due to its lack of interference at the abstraction level at which software is composed, we believe that our approach poses a significantly lower barrier to practical adoption than current methods. The practical efficacy of our operator is demonstrated by using it to offload the user interface rendering of a smartphone application to an FPGA coprocessor, including both statically and procedurally defined user interface components. The generated pipeline is an application-specific, statically scheduled processor-per-primitive rendering pipeline, suitable for place-and-route style optimisation. To demonstrate the compatibility of our operator with existing languages, we show how it may be defined within the Python programming language. We introduce a transformation for weakening mutable to immutable named bindings, termed let-weakening, to solve the problem of propagating information pertaining to named variables between modular code-generating units.
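
    The "recursively invoked, dynamically dispatched code generators" can be pictured with a toy Scala sketch. This is not the paper's operator and does not reify live runtime objects; the node names and the emitted target syntax are made up for illustration. Each IR node emits a fragment of a hypothetical target language, and the immutable let-binding hints at the shape that the paper's let-weakening transformation produces for named variables.

        sealed trait Node { def emit: String }                // dynamic dispatch point
        case class Lit(n: Int)           extends Node { def emit = n.toString }
        case class Ref(name: String)     extends Node { def emit = name }
        case class Add(l: Node, r: Node) extends Node {
          def emit = s"(${l.emit} + ${r.emit})"               // recursive invocation
        }
        // An immutable named binding in the emitted fragment.
        case class Let(name: String, rhs: Node, body: Node) extends Node {
          def emit = s"let ${name} = ${rhs.emit} in ${body.emit}"
        }

        object CodegenDemo {
          def main(args: Array[String]): Unit = {
            val fragment = Let("x", Add(Lit(1), Lit(2)), Add(Ref("x"), Lit(4)))
            println(fragment.emit) // let x = (1 + 2) in (x + 4)
          }
        }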

    A type- and scope-safe universe of syntaxes with binding: their semantics and proofs

    Almost every programming language's syntax includes a notion of binder and corresponding bound occurrences, along with the accompanying notions of alpha-equivalence, capture-avoiding substitution, typing contexts, runtime environments, and so on. In the past, implementing and reasoning about programming languages required careful handling to maintain the correct behaviour of bound variables. Modern programming languages include features that enable constraints like scope safety to be expressed in types. Nevertheless, the programmer is still forced to write the same boilerplate over and over again for each new implementation of a scope-safe operation (e.g., renaming, substitution, desugaring, printing, etc.), and then again for correctness proofs. We present an expressive universe of syntaxes with binding and demonstrate how to (1) implement scope-safe traversals once and for all by generic programming; and (2) derive properties of these traversals by generic proving. Our universe description, generic traversals and proofs, and our examples have all been formalised in Agda and are available online at https://github.com/gallais/generic-syntax
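
    The "write the traversal once, instantiate it many times" idea can be hinted at in plain Scala, without the paper's Agda-level guarantee that scoping errors are type errors. The Term constructors and helper names below are illustrative; a single generic traversal is instantiated as both renaming and substitution.

        sealed trait Term
        case class Var(i: Int)           extends Term   // de Bruijn index
        case class Lam(body: Term)       extends Term
        case class App(f: Term, a: Term) extends Term

        object Traversal {
          // The single generic traversal: `onVar` decides what each variable becomes,
          // given the number of binders crossed so far.
          def traverse(t: Term, depth: Int)(onVar: (Int, Int) => Term): Term = t match {
            case Var(i)    => onVar(i, depth)
            case Lam(b)    => Lam(traverse(b, depth + 1)(onVar))
            case App(f, a) => App(traverse(f, depth)(onVar), traverse(a, depth)(onVar))
          }

          // Renaming (shift the free variables by d) is one instance of the traversal...
          def shift(t: Term, d: Int): Term =
            traverse(t, 0)((i, depth) => if (i >= depth) Var(i + d) else Var(i))

          // ...and substitution of `s` for the outermost bound variable is another.
          def subst0(t: Term, s: Term): Term =
            traverse(t, 0) { (i, depth) =>
              if (i == depth) shift(s, depth)
              else if (i > depth) Var(i - 1)
              else Var(i)
            }
        }

        object BindingDemo {
          def main(args: Array[String]): Unit = {
            println(Traversal.shift(Lam(Var(1)), 1))                    // Lam(Var(2)): only the free variable moves
            println(Traversal.subst0(App(Var(0), Var(1)), Lam(Var(0)))) // App(Lam(Var(0)),Var(0))
          }
        }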

    Event Loops as First-Class Values: A Case Study in Pedagogic Language Design

    The World model is an existing functional input-output mechanism for event-driven programming. It is used in numerous popular textbooks and curricular settings. The World model conflates two different tasks -- the definition of an event processor and its execution -- into one. This conflation imposes a significant (even unacceptable) burden on student users in several educational settings where we have tried to use it, e.g., for teaching physics. While it was tempting to pile on features to address these issues, we instead used the Scheme language design dictum of removing the weaknesses that made them seem necessary. By separating the two tasks above, we arrived at a slightly different primitive, the reactor, as our basis. This only defines the event processor, and a variety of execution operators dictate how it runs. The new design enables programmatic control over event-driven programs. This simplifies reflecting on program behavior, and eliminates many unnecessary curricular dependencies imposed by the old design. This work has been implemented in the Pyret programming language. The separation of concerns has enabled new curricula, such as the Bootstrap:Physics curriculum, to take flight. Thousands of students use this new mechanism every year. We believe that reducing impedance mismatches improves their educational experience.
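
    A rough Scala sketch of the separation the abstract argues for (the Reactor fields and operator names are illustrative, not Pyret's actual API): the reactor value only describes the event processor, and distinct execution operators decide how it runs.

        // The event processor is just a value: initial state plus handlers.
        final case class Reactor[S](init: S, onTick: S => S, stopWhen: S => Boolean)

        object Exec {
          // One execution operator: run to completion and return the whole state trace.
          def trace[S](r: Reactor[S]): List[S] = {
            @annotation.tailrec
            def loop(s: S, acc: List[S]): List[S] =
              if (r.stopWhen(s)) (s :: acc).reverse else loop(r.onTick(s), s :: acc)
            loop(r.init, Nil)
          }
          // Another: advance by exactly one event, for programmatic inspection.
          def step[S](r: Reactor[S], s: S): S = if (r.stopWhen(s)) s else r.onTick(s)
        }

        object ReactorDemo {
          def main(args: Array[String]): Unit = {
            val countdown = Reactor[Int](5, _ - 1, _ == 0)
            println(Exec.trace(countdown))   // List(5, 4, 3, 2, 1, 0)
            println(Exec.step(countdown, 5)) // 4
          }
        }

    Because the reactor is a first-class value, a curriculum can test and step event-driven programs without ever opening an interactive window.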

    Frex: dependently-typed algebraic simplification

    We present an extensible, mathematically structured algebraic simplification library design. We structure the library using universal algebraic concepts: a free algebra -- fral -- and a free extension -- frex -- of an algebra by a set of variables. The library's dependently-typed API guarantees that simplification modules, even user-defined ones, are terminating, sound, and complete with respect to a well-specified class of equations. Completeness offers intangible benefits in practice -- our main contribution is the novel design. Cleanly separating the interface and implementation of simplification modules provides two new modularity axes. First, simplification modules share thousands of lines of infrastructure code dealing with term representation, pretty-printing, certification, and macros/reflection. Second, new simplification modules can reuse existing ones. We demonstrate this design by developing simplification modules for monoid varieties: ordinary, commutative, and involutive. We implemented this design in the new Idris2 dependently-typed programming language, and in Agda.
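
    To make the ordinary-monoid case concrete, here is an uncertified Scala sketch of the underlying idea (the real library is dependently typed in Idris2/Agda and ships proofs; these type and function names are illustrative). In the free monoid over a set of variables, a term's normal form is simply the list of variables it mentions, so deciding an equation reduces to comparing normal forms.

        // Syntax of monoid expressions over variables.
        sealed trait MonExp
        case object Neutral                  extends MonExp  // the unit
        case class V(name: String)           extends MonExp  // a variable
        case class Op(l: MonExp, r: MonExp)  extends MonExp  // the binary operation

        object MonoidSimp {
          // Normal forms in the free monoid over variables are variable lists:
          // associativity and the unit laws are absorbed by list concatenation.
          def norm(e: MonExp): List[String] = e match {
            case Neutral  => Nil
            case V(x)     => List(x)
            case Op(l, r) => norm(l) ++ norm(r)
          }
          // An equation holds in every monoid iff both sides normalise identically.
          def sameInAllMonoids(a: MonExp, b: MonExp): Boolean = norm(a) == norm(b)
        }

        object MonoidDemo {
          def main(args: Array[String]): Unit = {
            val lhs = Op(V("x"), Op(Neutral, V("y")))
            val rhs = Op(Op(V("x"), V("y")), Neutral)
            println(MonoidSimp.sameInAllMonoids(lhs, rhs)) // true
          }
        }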

    The Reflex Sandbox: an experimentation environment for an aspect-oriented Kernel

    Reflex is a versatile kernel for aspect-oriented programming in Java. It provides the basic structural and behavioural abstractions needed to implement a variety of aspect-oriented techniques. This thesis studies two fundamental topics. The first is the formal development, in Haskell, of the core constructs of the Reflex model of partial behavioural reflection. This development covers the design of a language, called Kernel, which is a reflective extension of a simple object-oriented language; the operational semantics of the Kernel language is given by means of an abstract execution machine. The second topic is the validation that the partial behavioural reflection model is expressive enough to give semantics to a subset of the AspectJ language. To this end we developed the Reflex Sandbox: an experimentation environment for the Reflex model, written in Haskell. Both the formal development of the partial behavioural reflection model and the validation of AspectJ support are studied in the context of the Reflex Sandbox. The validation comprises the definition of an aspect-oriented language that characterises the AspectJ approach to aspect-oriented programming, the definition of its abstract execution machine, and a compiler that translates programs written in this language into the Kernel language. This compilation process provides the foundations for understanding how such a transformation can be carried out. The compilation process was also implemented in Java, this time transforming AspectJ programs into Reflex programs. Preliminary measurements are also presented comparing the performance of a program compiled and run with Reflex against a program compiled and run with the AspectJ compiler.
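
    The following toy Scala sketch only gestures at what partial behavioural reflection means; it is not Reflex's API, and the MetaObject and helper names are illustrative. Selected operations are reified and routed through a metaobject, which can observe or alter them, while unselected operations execute directly.

        trait MetaObject { def handleOperation(name: String, proceed: () => Any): Any }

        // A metaobject that observes calls without changing their result.
        class Tracer extends MetaObject {
          def handleOperation(name: String, proceed: () => Any): Any = {
            println(s"-> $name"); val result = proceed(); println(s"<- $name"); result
          }
        }

        class Base(meta: Option[MetaObject]) {
          // Only operations routed through this helper (the "selected" ones) are
          // reified and handed to the metaobject; everything else runs as usual.
          protected def reflective[A](name: String)(body: => A): A = meta match {
            case Some(m) => m.handleOperation(name, () => body).asInstanceOf[A]
            case None    => body
          }
          def work(x: Int): Int = reflective("work") { x * 2 }
        }

        object ReflexDemo {
          def main(args: Array[String]): Unit = {
            println(new Base(Some(new Tracer)).work(21)) // traced call, prints 42
            println(new Base(None).work(21))             // direct call, prints 42
          }
        }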

    Accelerating Verified-Compiler Development with a Verified Rewriting Engine

    Compilers are a prime target for formal verification, since compiler bugs invalidate higher-level correctness guarantees, but compiler changes may become more labor-intensive to implement if they must come with proof patches. One appealing approach is to present compilers as sets of algebraic rewrite rules, which a generic engine can apply efficiently. Now each rewrite rule can be proved separately, with no need to revisit past proofs for other parts of the compiler. We present the first realization of this idea, in the form of a framework for the Coq proof assistant. Our new Coq command takes normal proved theorems and combines them automatically into fast compilers with proofs. We applied our framework to improve the Fiat Cryptography toolchain for generating cryptographic arithmetic, producing an extracted command-line compiler that is about 1000× faster while actually featuring simpler compiler-specific proofs. (13th International Conference on Interactive Theorem Proving, ITP 2022.)
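
    The "compiler as independently proved rewrite rules plus a generic engine" idea can be illustrated with an unverified Scala sketch. The real framework is a Coq command and each rule carries a machine-checked proof; the expression type and rules below are made up for illustration.

        sealed trait E
        case class Num(n: Int)     extends E
        case class Add(l: E, r: E) extends E
        case class Mul(l: E, r: E) extends E

        object Rewriter {
          type Rule = PartialFunction[E, E]
          // Each rule corresponds to one algebraic fact that could be proved on its own.
          val rules: List[Rule] = List(
            { case Add(Num(0), x)      => x },           // 0 + x = x
            { case Mul(Num(1), x)      => x },           // 1 * x = x
            { case Add(Num(a), Num(b)) => Num(a + b) }   // constant folding
          )
          // The generic engine: rewrite subterms first, then retry rules at the root.
          def simplify(e: E): E = {
            val e1 = e match {
              case Add(l, r) => Add(simplify(l), simplify(r))
              case Mul(l, r) => Mul(simplify(l), simplify(r))
              case other     => other
            }
            rules.collectFirst { case rule if rule.isDefinedAt(e1) => simplify(rule(e1)) }
                 .getOrElse(e1)
          }
        }

        object RewriteDemo {
          def main(args: Array[String]): Unit = {
            val prog = Mul(Num(1), Add(Num(0), Add(Num(2), Num(3))))
            println(Rewriter.simplify(prog)) // Num(5)
          }
        }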

    Stories at work: restorying narratives of new teachers' identity learning in writing studies.

    Rhetoric and composition has a long, robust history of studying how we train new writing teachers in our graduate/writing programs; yet we lack in-depth inquiries that foreground how new writing teachers learn. This dissertation traces five graduate students learning how to be and become writing teachers, using narrative as an object and means of analysis to study the tacitly internalized process of newcomer professional identity learning. In this project, I enact narrative as a feminist, interdisciplinary methodology to restory new writing teacher research narratives away from implicit deficit or explicit resistance and toward a more generative focus on newcomers’ motivated learning and complex experiences mediated by understandings of teaching, learning, and education that precede, exceed, and infuse the program training and academic literacy histories that our research has historically privileged. Drawing on research in writing studies, education, sociology, and psychology, this dissertation conducts a narrative inquiry into new writing teachers’ identity learning by analyzing stories of teaching and learning elicited from five new writing teachers during a year-long semi-structured, text-based interview study. Using the interplay of thematic and structural analysis of participants’ 248 stories and artifact analysis of participants’ teaching texts, I practice narrative inquiry as an explicitly feminist methodology to destabilize and interrogate what we think we know about new writing teachers’ identities and understandings of learning (as in Chapter Three), experiences and teaching troubles (as in Chapter Four), and motivated desires for the future (as in Chapter Five). I also rely on interdisciplinary theories of learning and identity to understand new teachers as complex people mediated and motivated over time in ways that academic writing/composition theories alone have not adequately illuminated. Ultimately, I argue that new teacher research in writing studies should employ more complex methodologies for studying new writing teachers’ identities as learned and storied over time; and that listening rhetorically to newcomers’ stories and for learning and meaning-making is one way to interrupt unproductive assumptions about newcomer deficit or resistance and to restory our research, administrative, and teaching practices to authorize and encourage more agentive positions from which newcomers (and we all) can learn to act

    Clifford Algebra: A Case for Geometric and Ontological Unification

    Robert Batterman’s ontological insights (2002, 2004, 2005) are apt: Nature abhors singularities. “So should we,” responds the physicist. However, Batterman’s epistemic assessments concerning the matter prove to be less clear, for in the same vein he writes that singularities play an essential role in certain classes of physical theories referring to certain types of critical phenomena. I devise a procedure (“methodological fundamentalism”) which exhibits how singularities, at least in principle, may be avoided within the same classes of formalisms discussed by Batterman. I show that we need not accept some divergence between explanation and reduction (Batterman 2002), or between epistemological and ontological fundamentalism (Batterman 2004, 2005). Though I remain sympathetic to the ‘principle of charity’ (Frisch 2005), which appears to favor a pluralist outlook, I nevertheless call into question some of the forms such pluralist implications take in Robert Batterman’s conclusions. It is difficult to reconcile some of the pluralist assessments that he and some of his contemporaries advocate with what appears to be a countervailing trend in a burgeoning research tradition known as Clifford (or geometric) algebra. In my critical chapters (2 and 3) I use some of the demonstrated formal unity of Clifford algebra to argue that Batterman (2002) conflates a physical theory’s ontology with its purely mathematical content. Carefully distinguishing the two and employing Clifford algebraic methods reveals a symmetry between reduction and explanation that Batterman overlooks. I refine this point by indicating that geometric algebraic methods are an active area of research in computational fluid dynamics and, as applied in modeling the behavior of droplet formation, appear to instantiate a “methodologically fundamental” approach. I argue in my introductory and concluding chapters that the model of inter-theoretic reduction and explanation offered by Fritz Rohrlich (1988, 1994) provides the best framework for reconciling the burgeoning pluralism in philosophical studies of physics with the presumed claims of formal unification demonstrated by physicists’ choices of mathematical formalisms such as Clifford algebra. I show how Batterman’s insights can be reconstructed in Rohrlich’s framework, preserving Batterman’s important philosophical work minus what I consider to be his incorrect conclusions.

    What is globalization? The definitional issue - again

    Knowledge of globalization is substantially a function of how the concept is defined. After tracing the history of ‘global’ vocabulary, this paper suggests several principles that should inform the way globality (the condition) and globalization (the trend) are defined. On this basis four common conceptions of the term are rejected in favour of a fifth that identifies globalization as the spread of transplanetary – and in recent times more particularly supraterritorial – connections between people. Half a dozen qualifications are incorporated into this definition to distinguish it from globalist exaggerations.