
    Superdevelopments for Weak Reduction

    We study superdevelopments in the weak lambda calculus of Çağman and Hindley, a confluent variant of the standard weak lambda calculus in which reduction below lambdas is forbidden. In contrast to developments, a superdevelopment from a term M allows not only residuals of redexes in M to be reduced, but also some newly created ones. In the lambda calculus there are three ways in which new redexes may be created; in the weak lambda calculus a new form of redex creation is possible. We present labeled and simultaneous reduction formulations of superdevelopments for the weak lambda calculus and prove them equivalent.
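
The restriction the abstract describes, no reduction below lambdas, can be illustrated with a small interpreter. The sketch below is a generic illustration using de Bruijn indices, not the paper's labeled or simultaneous formulations: `weak_normalize` contracts redexes anywhere except under a binder, so a redex inside an abstraction body survives.

```python
# Weak beta-reduction with de Bruijn indices: redexes are contracted
# anywhere EXCEPT below a lambda. Terms are ("var", n), ("lam", body),
# or ("app", fun, arg). Illustration only; the paper's calculus and its
# superdevelopments are richer than this.

def shift(t, d, c=0):
    """Shift free variables (indices >= c) of t by d."""
    if t[0] == "var":
        return ("var", t[1] + d) if t[1] >= c else t
    if t[0] == "lam":
        return ("lam", shift(t[1], d, c + 1))
    return ("app", shift(t[1], d, c), shift(t[2], d, c))

def subst(t, s, j=0):
    """Capture-avoiding substitution t[j := s], lowering indices above j."""
    if t[0] == "var":
        n = t[1]
        if n == j:
            return shift(s, j)
        return ("var", n - 1) if n > j else t
    if t[0] == "lam":
        return ("lam", subst(t[1], s, j + 1))
    return ("app", subst(t[1], s, j), subst(t[2], s, j))

def weak_step(t):
    """One leftmost weak reduction step, or None if t is weak-normal."""
    if t[0] == "app":
        f, a = t[1], t[2]
        if f[0] == "lam":              # beta-redex at the root
            return subst(f[1], a)
        r = weak_step(f)
        if r is not None:
            return ("app", r, a)
        r = weak_step(a)
        if r is not None:
            return ("app", f, r)
    return None                        # never reduces under "lam"

def weak_normalize(t):
    while (r := weak_step(t)) is not None:
        t = r
    return t

I = ("lam", ("var", 0))                    # identity
assert weak_normalize(("app", I, I)) == I  # root redex fires
frozen = ("lam", ("app", I, ("var", 0)))   # redex under a lambda...
assert weak_normalize(frozen) == frozen    # ...is NOT contracted
```

The second assertion is exactly the weak restriction: the redex under the binder is frozen, and contracting the outer application of such a frozen abstraction is one way new redexes appear.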

    Distilling Abstract Machines (Long Version)

    It is well known that many environment-based abstract machines can be seen as strategies in lambda calculi with explicit substitutions (ES). Recently, graphical syntaxes and linear logic led to the linear substitution calculus (LSC), a new approach to ES that is halfway between big-step calculi and traditional calculi with ES. This paper studies the relationship between the LSC and environment-based abstract machines. While traditional calculi with ES simulate abstract machines, the LSC rather distills them: some transitions are simulated while others vanish, as they map to a notion of structural congruence. The distillation process unveils that abstract machines in fact implement weak linear head reduction, a notion of evaluation having a central role in the theory of linear logic. We show that such a pattern applies uniformly in call-by-name, call-by-value, and call-by-need, covering many machines in the literature. We start by distilling the KAM, the CEK, and the ZINC, and then provide simplified versions of the SECD, the lazy KAM, and Sestoft's machine. Along the way we also introduce some new machines with global environments. Moreover, we show that distillation preserves the time complexity of the executions, i.e., the LSC is a complexity-preserving abstraction of abstract machines. Comment: 63 pages.
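
The KAM mentioned in the abstract is small enough to sketch. The following is a standard textbook presentation, shown only for context (the paper's contribution is the distillation into the LSC, not the machine itself): de Bruijn indices, closures, and a stack of pending arguments, evaluating closed terms to weak head normal form.

```python
# A minimal Krivine Abstract Machine (KAM): call-by-name evaluation of
# CLOSED lambda-terms to weak head normal form, using de Bruijn indices.
# Environments and stacks hold closures, i.e. (term, environment) pairs.

def kam(t, env=(), stack=()):
    """Run the machine from state (code t, environment env, stack)."""
    while True:
        if t[0] == "app":                 # push the argument as a closure
            stack = ((t[2], env),) + stack
            t = t[1]
        elif t[0] == "lam" and stack:     # pop one argument into the environment
            c, stack = stack[0], stack[1:]
            env = (c,) + env
            t = t[1]
        elif t[0] == "var":               # variable lookup (closed terms only)
            t, env = env[t[1]]
        else:                             # abstraction, empty stack: final state
            return t, env, stack

I = ("lam", ("var", 0))
term, env, stack = kam(("app", I, I))     # (\x.x) (\x.x)
assert term == I and stack == ()
```

Note the three transitions: the first two are the ones a calculus with ES simulates as substitution steps, while the variable lookup is the kind of transition that, per the abstract, "vanishes" into structural congruence under distillation.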

    A Strong Distillery

    Abstract machines for the strong evaluation of lambda-terms (that is, under abstractions) are a mostly neglected topic, despite their use in the implementation of proof assistants and higher-order logic programming languages. This paper introduces a machine for the simplest form of strong evaluation, leftmost-outermost (call-by-name) evaluation to normal form, proving it correct and complete, and bounding its overhead. The machine, dubbed the Strong Milner Abstract Machine, is a variant of the KAM that computes normal forms and uses just one global environment. Its properties are studied via a special form of decoding, called a distillation, into the Linear Substitution Calculus, neatly reformulating the machine as a standard micro-step strategy for explicit substitutions, namely linear leftmost-outermost reduction, i.e., the extension to normal form of linear head reduction. Additionally, the overhead of the machine is shown to be linear both in the number of steps and in the size of the initial term, validating its design. The study highlights two distinguishing features of strong machines, namely backtracking phases and their interactions with abstractions and environments. Comment: Accepted at APLAS.
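
Strong evaluation, i.e., reduction under abstractions, can be contrasted with weak evaluation in a short sketch. The normalizer below is a plain recursive implementation of leftmost-outermost reduction with de Bruijn indices, not the Strong MAM itself, and it has none of the machine's environments, backtracking phases, or complexity guarantees.

```python
# Leftmost-outermost (normal-order) normalization: unlike weak machines,
# it also reduces below abstractions, reaching the full normal form.

def shift(t, d, c=0):
    """Shift free variables (indices >= c) of t by d."""
    if t[0] == "var":
        return ("var", t[1] + d) if t[1] >= c else t
    if t[0] == "lam":
        return ("lam", shift(t[1], d, c + 1))
    return ("app", shift(t[1], d, c), shift(t[2], d, c))

def subst(t, s, j=0):
    """Capture-avoiding substitution t[j := s], lowering indices above j."""
    if t[0] == "var":
        n = t[1]
        if n == j:
            return shift(s, j)
        return ("var", n - 1) if n > j else t
    if t[0] == "lam":
        return ("lam", subst(t[1], s, j + 1))
    return ("app", subst(t[1], s, j), subst(t[2], s, j))

def whnf(t):
    """Reduce to weak head normal form."""
    while t[0] == "app":
        f = whnf(t[1])
        if f[0] == "lam":
            t = subst(f[1], t[2])
        else:
            return ("app", f, t[2])
    return t

def normalize(t):
    """Full normal form via leftmost-outermost reduction (may diverge)."""
    if t[0] == "lam":
        return ("lam", normalize(t[1]))          # go under the binder
    if t[0] == "app":
        f = whnf(t[1])
        if f[0] == "lam":
            return normalize(subst(f[1], t[2]))
        return ("app", normalize(f), normalize(t[2]))
    return t

I = ("lam", ("var", 0))
two = ("lam", ("lam", ("app", ("var", 1), ("app", ("var", 1), ("var", 0)))))
assert normalize(("app", two, I)) == I                   # Church 2 applied to I
assert normalize(("lam", ("app", I, ("var", 0)))) == I   # reduces under lambda
```

The last assertion is where strong and weak evaluation diverge: a weak evaluator would leave the redex under the binder untouched, while a strong machine must enter the abstraction and later backtrack out of it.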

    Reductions in Higher-Order Rewriting and Their Equivalence

    Proof terms are syntactic expressions that represent computations in term rewriting. They were introduced by Meseguer and exploited by van Oostrom and de Vrijer to study equivalence of reductions in (left-linear) first-order term rewriting systems. We study the problem of extending the notion of proof term to higher-order rewriting, which generalizes the first-order setting by allowing terms with binders and higher-order substitution. In previous works that devise proof terms for higher-order rewriting, such as Bruggink's, it has been noted that the challenge lies in reconciling composition of proof terms and higher-order substitution (β-equivalence). This led Bruggink to reject "nested" composition, other than at the outermost level. In this paper, we propose a notion of higher-order proof term, which we dub "rewrites", that supports nested composition. We then define two notions of equivalence on rewrites, namely permutation equivalence and projection equivalence, and show that they coincide.
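
In the first-order setting the abstract starts from, the idea of a proof term as a syntactic record of a computation can be made concrete. The toy rewriter below uses hypothetical Peano-addition rules, and a flattened list of (position, rule) pairs stands in for Meseguer-style proof terms; none of the higher-order or nested-composition subtleties the paper addresses appear here.

```python
# Toy first-order rewriting with a logged "proof term": each step records
# the position it fired at and the rule it used; sequential composition of
# computations is then just concatenation of the step lists.

Z = ("Z",)
def S(n): return ("S", n)
def ADD(a, b): return ("add", a, b)

def step(t, pos=()):
    """One leftmost-outermost step: (new_term, (position, rule)) or None."""
    if t[0] == "add":
        a, b = t[1], t[2]
        if a == Z:
            return b, (pos, "add-Z")                # add(Z, y) -> y
        if a[0] == "S":
            return S(ADD(a[1], b)), (pos, "add-S")  # add(S(x), y) -> S(add(x, y))
    for i, sub in enumerate(t[1:], start=1):        # otherwise descend
        r = step(sub, pos + (i,))
        if r is not None:
            new, p = r
            return t[:i] + (new,) + t[i + 1:], p
    return None

def rewrite(t):
    """Normalize t, returning the normal form and the logged step sequence."""
    proof = []
    while (r := step(t)) is not None:
        t, p = r
        proof.append(p)
    return t, proof

final, proof = rewrite(ADD(S(Z), S(Z)))             # 1 + 1
assert final == S(S(Z))
assert proof == [((), "add-S"), ((1,), "add-Z")]
```

The higher-order difficulty the paper tackles is precisely that this flat-list picture breaks down once terms carry binders: composition must then interact with higher-order substitution, which is what "rewrites" are designed to reconcile.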

    Determining Lack of Marketability Discounts: Employing an Equity Collar

    A discount for the lack of marketability is the implicit cost of quickly monetizing a non-marketable asset at its current value. These discounts are used in many venues to determine the fair market value of a non-marketable asset such as a privately-held business. Much has been written on the quantification of the discount for the lack of marketability, which is briefly summarized in this article. Marketability refers to monetizing the non-marketable asset at its cash-equivalent current value. Current practice often uses the cost of a put option as a proxy for the discount. A put option ensures that the investor will receive no less than the current value of the underlying asset. However, the use of a put also allows the investor to maintain the asset's upside potential. Therefore, the cost of a put overstates the discount for the lack of marketability. We show that the cost of monetizing a non-marketable asset at its current value through a loan, secured by an at-the-money equity collar, more effectively captures the true cost of marketability. When puts and calls cannot be employed to secure the current value of the underlying asset, a portfolio can be constructed consisting of the non-marketable asset and a stock index on which puts and calls can be written. The effectiveness of the portfolio in creating a risk-free outcome depends upon the correlation and volatility of the stock index and the non-marketable asset. We demonstrate that, relative to current practice, the use of an equity collar with a loan greatly reduces the implied discount for the lack of marketability.
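
The gap between the put proxy and the collar can be made concrete with Black-Scholes prices. The sketch below uses hypothetical parameters (S = 100, r = 3%, sigma = 35%, T = 2 years) and models the at-the-money collar as a long put plus a short call at the same strike; it is an illustration of the argument, not the paper's model, which adds the loan and a possibly imperfect index hedge.

```python
# Compare the put-option proxy for the DLOM with the net cost of an
# at-the-money collar (long put + short call, both struck at S).
from math import log, sqrt, exp, erf

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call_put(S, K, r, sigma, T):
    """European call and put prices under Black-Scholes (no dividends)."""
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    call = S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)
    put = K * exp(-r * T) * norm_cdf(-d2) - S * norm_cdf(-d1)
    return call, put

S, r, sigma, T = 100.0, 0.03, 0.35, 2.0       # hypothetical inputs
call, put = bs_call_put(S, S, r, sigma, T)

put_proxy = put / S               # put premium as a fraction of value:
                                  # overstates the DLOM (keeps the upside)
collar_cost = (put - call) / S    # collar net cost; by put-call parity this
                                  # equals exp(-r*T) - 1, a credit when r > 0
assert collar_cost < put_proxy    # the collar implies a much smaller discount
```

Selling the call is what funds the protection: the investor gives up the upside the put proxy wrongly retains, which is exactly why the put premium overstates the cost of locking in the current value.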

    Two Decreasing Measures for Simply Typed λ-Terms

    This paper defines two decreasing measures for terms of the simply typed λ-calculus, called the W-measure and the T^{m}-measure. A decreasing measure is a function that maps each typable λ-term to an element of a well-founded ordering, in such a way that contracting any β-redex decreases the value of the function, entailing strong normalization. Both measures are defined constructively, relying on an auxiliary calculus, a non-erasing variant of the λ-calculus. In this system, dubbed the λ^{m}-calculus, each β-step creates a "wrapper" containing a copy of the argument that cannot be erased and cannot interact with the context in any other way. Both measures rely crucially on the observation, known to Turing and Prawitz, that contracting a redex cannot create redexes of higher degree, where the degree of a redex is defined as the height of the type of its λ-abstraction. The W-measure maps each λ-term to a natural number, and it is obtained by evaluating the term in the λ^{m}-calculus and counting the number of remaining wrappers. The T^{m}-measure maps each λ-term to a structure of nested multisets, where the nesting depth is proportional to the maximum redex degree.
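
The notion of redex degree both measures rely on, the height of the type of the redex's λ-abstraction, is easy to compute. The sketch below fixes one common convention (a base type has height 0); the paper's exact definition of height may differ by an offset.

```python
# Heights of simple types and degrees of beta-redexes.
# Types are the base type O or arrows ("->", A, B).

O = "o"                                  # a base type
def arrow(a, b): return ("->", a, b)

def height(ty):
    """Height of a simple type: 0 at the base, 1 + max of the two sides."""
    if ty == O:
        return 0
    return 1 + max(height(ty[1]), height(ty[2]))

def redex_degree(param_ty, result_ty):
    """Degree of a redex (\\x:A. b) a, i.e. height of the type A -> B."""
    return height(arrow(param_ty, result_ty))

o_o = arrow(O, O)                        # type of the identity on o
assert height(o_o) == 1
# The redex ((\f:(o->o). f) (\x:o. x)) has an abstraction of type
# (o->o) -> (o->o), hence degree 2:
assert redex_degree(o_o, o_o) == 2
# Turing/Prawitz observation: contracting it can only create redexes of
# strictly smaller degree, which is what makes the measures decrease.
```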

    Proofs and Refutations for Intuitionistic and Second-Order Logic

    The λ^{PRK}-calculus is a typed λ-calculus that exploits the duality between the notions of proof and refutation to provide a computational interpretation for classical propositional logic. In this work, we extend λ^{PRK} to encompass classical second-order logic, by incorporating parametric polymorphism and existential types. The system is shown to enjoy good computational properties, such as type preservation, confluence, and strong normalization, which is established by means of a reducibility argument. We identify a syntactic restriction on proofs that characterizes exactly the intuitionistic fragment of second-order λ^{PRK}, and we study canonicity results.

    Control Premiums and the Value of the Closely-Held Firm

    This paper demonstrates that control premiums are warranted in the valuation of closely-held firms when perquisites exist. The value of control is a function of the ownership structure and the size of perquisite cash flows. The conventional logic of assigning control premiums based upon transactions in the public market is shown to be flawed. A statistic for calculating control premiums based on ownership structure and the size of perquisite flows is developed. The paper closes with a short discussion of how minority discounts and control premiums are related.
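
As a purely hypothetical numerical illustration (not the statistic developed in the paper): if perquisite cash flows accrue only to the controlling block while ordinary cash flows are shared pro rata, capitalizing both at a single rate already yields a premium that depends on the ownership fraction and the size of the perquisites.

```python
# Hypothetical illustration of why control carries a premium when
# perquisites exist. All numbers and the single-rate capitalization
# are assumptions for this sketch, not the paper's model.

F, P, k = 10.0, 1.0, 0.10        # firm cash flow, perquisites ($M/yr), cap rate
alpha = 0.6                      # controlling block's ownership fraction

V_shared = (F - P) / k           # value of the pro-rata cash flows
V_perks = P / k                  # capitalized value of the perquisite flows

controller_value = alpha * V_shared + V_perks   # block share + ALL perquisites
pro_rata_value = alpha * (V_shared + V_perks)   # naive pro-rata slice
control_premium = controller_value / pro_rata_value - 1

assert control_premium > 0       # a premium exists whenever P > 0 and alpha < 1
# With alpha = 1 the premium vanishes (a sole owner gains nothing extra from
# control), consistent with the premium being a function of ownership
# structure and the size of perquisite flows.
```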