23 research outputs found

    Bounded time computation on metric spaces and Banach spaces

    We extend the framework by Kawamura and Cook for investigating computational complexity for operators occurring in analysis. This model is based on second-order complexity theory for functions on the Baire space, which is lifted to metric spaces by means of representations. Time is measured in terms of the length of the input encodings and the required output precision. We propose the notions of a complete representation and of a regular representation. We show that complete representations ensure that any computable function has a time bound. Regular representations generalize Kawamura and Cook's more restrictive notion of a second-order representation, while still guaranteeing fast computability of the length of the encodings. Applying these notions, we investigate the relationship between purely metric properties of a metric space and the existence of a representation such that the metric is computable within bounded time. We show that a bound on the running time of the metric can be straightforwardly translated into size bounds of compact subsets of the metric space. Conversely, for compact spaces and for Banach spaces we construct a family of admissible, complete, regular representations that allow for fast computation of the metric and provide short encodings. Here it is necessary to trade the time bound off against the length of encodings.
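
    To make the time measure concrete: in the Kawamura-Cook setting an oracle machine computes on string functions, and the size of an oracle is itself a function of the input length. A standard statement of the bound (our notation; the paper's representations refine this picture) is

        \[
          |\varphi|(n) \;=\; \max_{|u| \le n} |\varphi(u)| \qquad \text{for } \varphi\colon \{0,1\}^* \to \{0,1\}^*,
        \]

    and a machine M runs in bounded time if there is a second-order polynomial P such that

        \[
          \mathrm{time}_M(\varphi, n) \;\le\; P(|\varphi|, n) \qquad \text{for all oracles } \varphi \text{ and precisions } n.
        \]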

    Game semantics approach to higher-order complexity

    Game semantics was initially defined and used to characterize PCF functionals. We use this approach to propose a definition of complexity for such higher-order functions, as well as a class of polynomial time computable higher-order functions.
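
    To see why complexity for higher-order functions needs more than input length, consider a type-2 functional whose cost depends on how it interrogates its function argument. A minimal Haskell sketch (the example and names are ours; the paper's game-semantic measure counts moves in a play rather than evaluation steps):

        -- A type-2 functional: its cost depends on the number of calls
        -- to the argument f and on the sizes of f's answers, not just
        -- on the size of n.
        maxUpTo :: (Int -> Int) -> Int -> Int
        maxUpTo f n = maximum [f i | i <- [0 .. n]]

    Any sensible notion of polynomial time for maxUpTo must be relative to the cost of its argument, which is exactly what a game-based dialogue between function and argument makes explicit.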

    Events in computation

    SIGLE. Available from the British Library Document Supply Centre (BLDSC), DSC:D36018/81, United Kingdom.

    The recursion hierarchy for PCF is strict

    We consider the sublanguages of Plotkin's PCF obtained by imposing some bound k on the levels of types for which fixed point operators are admitted. We show that these languages form a strict hierarchy, in the sense that a fixed point operator for a type of level k can never be defined (up to observational equivalence) using fixed point operators for lower types. This answers a question posed by Berger. Our proof makes substantial use of the theory of nested sequential procedures (also called PCF Böhm trees) as expounded in the recent book of Longley and Normann.
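
    To fix notation: PCF's fixed point operator at type sigma has type (sigma -> sigma) -> sigma, and the level of an arrow type sigma -> tau is max(level(sigma) + 1, level(tau)). A small Haskell sketch of instances at levels 0 and 1 (illustrative only; Haskell admits fix at every type, whereas the sublanguages considered here cap the admissible levels at k):

        -- General fixed point combinator; the k-th sublanguage of PCF
        -- only admits instances where the type a has level at most k.
        fix :: (a -> a) -> a
        fix f = f (fix f)

        -- Level 0: a fixpoint at the base type.
        fixNat :: (Integer -> Integer) -> Integer
        fixNat = fix

        -- Level 1: a fixpoint at Integer -> Integer, enough to define
        -- ordinary recursive functions such as factorial.
        factorial :: Integer -> Integer
        factorial = fix (\rec n -> if n == 0 then 1 else n * rec (n - 1))

    The theorem says this hierarchy of instances is strict: no amount of level-k recursion can simulate a level-(k+1) fixpoint up to observational equivalence.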

    Weighted models for higher-order computation

    We study a class of quantitative models for higher-order computation: Lafont categories with (infinite) biproducts. Each of these has a complete “internal semiring” and can be enriched over its modules. We describe a semantics of nondeterministic PCF weighted over this semiring in which fixed points are obtained from the bifree algebra over its exponential structure. By characterizing them concretely as infinite sums of approximants indexed over nested finite multisets, we prove computational adequacy. We can construct examples of our semantics by weighting existing models such as categories of games over a complete semiring. This transition from qualitative to quantitative semantics is characterized as a “change of base” of enriched categories arising from a monoidal functor from coherence spaces to modules over a complete semiring. For example, the game semantics of Idealized Algol is coherence space enriched and thus gives rise to a weighted model, which is fully abstract.
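
    One elementary instance of weighting nondeterminism over a semiring, useful for intuition, is a list-of-weighted-results semantics. A minimal Haskell sketch under our own naming (the paper works with Lafont categories and biproducts rather than this monad-style presentation):

        -- A computation returns each possible result together with a
        -- weight drawn from a semiring r (here approximated by Num).
        newtype Weighted r a = Weighted { runWeighted :: [(a, r)] }

        -- Nondeterministic choice is the formal sum of the branches.
        choice :: Weighted r a -> Weighted r a -> Weighted r a
        choice (Weighted xs) (Weighted ys) = Weighted (xs ++ ys)

        -- Sequencing multiplies weights along each path, so with the
        -- semiring of natural numbers this counts computation paths.
        bindW :: Num r => Weighted r a -> (a -> Weighted r b) -> Weighted r b
        bindW (Weighted xs) k =
          Weighted [ (b, r * s) | (a, r) <- xs, (b, s) <- runWeighted (k a) ]

    Choosing other semirings recovers other quantitative readings: probabilities, costs, or the booleans of may-nondeterminism.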

    Relational Graph Models at Work

    We study the relational graph models that constitute a natural subclass of relational models of lambda-calculus. We prove that among the lambda-theories induced by such models there exists a minimal one, and that the corresponding relational graph model is very natural and easy to construct. We then study relational graph models that are fully abstract, in the sense that they capture some observational equivalence between lambda-terms. We focus on the two main observational equivalences in the lambda-calculus: the theory H+, generated by taking the beta-normal forms as observables, and H*, generated by taking the head normal forms as observables. On the one hand we introduce a notion of lambda-König model and prove that a relational graph model is fully abstract for H+ if and only if it is extensional and lambda-König. On the other hand we show that the dual notion of hyperimmune model, together with extensionality, captures the full abstraction for H*.
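
    For reference, the two equivalences can be spelled out as follows (standard definitions, consistent with the observables named above):

        \[
          M =_{\mathcal{H}^+} N \;\iff\; \forall C.\ \big(C[M] \text{ has a } \beta\text{-normal form} \iff C[N] \text{ has a } \beta\text{-normal form}\big),
        \]
        \[
          M =_{\mathcal{H}^*} N \;\iff\; \forall C.\ \big(C[M] \text{ has a head normal form} \iff C[N] \text{ has a head normal form}\big).
        \]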

    36th International Symposium on Theoretical Aspects of Computer Science: STACS 2019, March 13-16, 2019, Berlin, Germany

    Three Essays on Poverty Analysis

    This dissertation is a collection of three essays on poverty analysis.

    The first essay (Partial Identification of Poverty Measures with Contaminated and Corrupted Data) applies a partial identification approach to poverty measurement when data errors are non-classical, in the sense that the error is not assumed to be statistically independent of the outcome of interest and the error distribution has a mass point at zero. The essay shows that it is possible to find non-parametric bounds for the class of additively separable poverty measures. A methodology for drawing statistical inference on partially identified parameters is extended and applied to the setting of poverty measurement, and then used to estimate the poverty treatment effects of an anti-poverty program in the presence of contaminated data.

    The second essay (On the Design of an Optimal Transfer Schedule with Time Inconsistent Preferences) contributes to a recent literature connecting public policy and behavioral economics. It incorporates time inconsistency into the problem of designing an optimal transfer schedule. It is shown that if program beneficiaries are time inconsistent and receive all of the resources in a single payment, then the equilibrium allocation is always inefficient. In the spirit of the second welfare theorem, I also show that any efficient allocation can be obtained in equilibrium when the policymaker has full information. This assumption is then relaxed by introducing uncertainty and asymmetric information into the model. The optimal solution reflects the dilemma a policymaker faces when playing the roles of commitment enforcer and insurance provider simultaneously.

    The third essay (Does Conditionality Generate Heterogeneity and Regressivity in Program Impacts? The Progresa Experience) studies, both empirically and theoretically, the consequences of a conditional cash transfer scheme for the distribution of program impacts. Intuitively, if the conditioned-on good is normal, then better-off households tend to receive a larger positive impact. I formalize this insight by means of a simple model of child labor, using the Nash bargaining approach as the solution concept. A series of tests for heterogeneity in program impacts is developed and applied to Progresa, an anti-poverty program in Mexico. The program exhibits substantial heterogeneity in treatment effects. Consistent with the model, and under the assumption of rank preservation, program impacts are distributionally regressive, although positive, within the treated population.

    This research was supported by the Consejo de Ciencia y Tecnologia del Estado de Aguascalientes (CONCYTEA), the Consejo Nacional de Ciencia y Tecnologia (CONACYT), and the Ford-MacArthur-Hewlett Foundation Graduate Fellowship in the Social Sciences.
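
    As a worked illustration of the additively separable class in the first essay, consider the Foster-Greer-Thorbecke (FGT) family, the standard example of such a measure (the choice of illustration is ours, not the essay's):

        \[
          P_\alpha(F; z) \;=\; \int_0^{z} \Big(\frac{z - y}{z}\Big)^{\alpha}\, dF(y), \qquad \alpha \ge 0,
        \]

    where z is the poverty line and F the income distribution. In the contaminated-sampling model the observed distribution is the mixture G = (1 - p)F + pH, with error probability p and arbitrary H. Since P_\alpha is linear in the distribution and its integrand lies in [0, 1], sharp bounds follow directly:

        \[
          \max\Big(0, \frac{P_\alpha(G) - p}{1 - p}\Big) \;\le\; P_\alpha(F) \;\le\; \min\Big(1, \frac{P_\alpha(G)}{1 - p}\Big).
        \]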