
    Deciding subset relationship of co-inductively defined set constants

    Static analysis of different non-strict functional programming languages makes use of set constants like Top, Inf, and Bot, denoting all expressions, all lists without a last Nil as tail, and all non-terminating programs, respectively. We use a set language that permits union, constructors and recursive definition of set constants with a greatest fixpoint semantics. This paper proves decidability, in particular EXPTIME-completeness, of the subset relationship of co-inductively defined sets by using algorithms and results from tree automata. This shows decidability of the test for set inclusion, which is required by certain strictness analysis algorithms in lazy functional programming languages.
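
    The greatest-fixpoint reading of such definitions can be illustrated with a coinductive subset check: assume the pair of constants under comparison already holds, then require every alternative on the left to be covered, constructor-wise, on the right. The sketch below is a toy restricted to list constructors, and its per-alternative matching is weaker than the paper's tree-automata inclusion test (which is what yields EXPTIME-completeness); the constant encoding is an assumption for illustration.

```python
# Toy set language in the spirit of the abstract, restricted to list
# constructors.  The encoding and the simulation-style check below are
# illustrative assumptions, not the paper's tree-automata algorithm.

# Each set constant maps to a union of alternatives; an alternative is a
# constructor applied to further set constants.
DEFS = {
    "Bot": [],                                       # no values: non-termination
    "Inf": [("Cons", ["Top", "Inf"])],               # lists without a final Nil
    "Top": [("Nil", []), ("Cons", ["Top", "Top"])],  # all (list) values
}

def subset_of(defs, a, b, assumed=frozenset()):
    """Greatest-fixpoint subset test: coinductively assume (a, b) holds,
    then match every alternative of `a` against one of `b` with the same
    constructor, argument-wise."""
    if (a, b) in assumed:
        return True                                  # coinductive hypothesis
    assumed = assumed | {(a, b)}
    return all(
        any(c == d and all(subset_of(defs, x, y, assumed)
                           for x, y in zip(xs, ys))
            for d, ys in defs.get(b, []))
        for c, xs in defs.get(a, [])
    )

print(subset_of(DEFS, "Bot", "Inf"))  # True: the empty set is below anything
print(subset_of(DEFS, "Inf", "Top"))  # True: infinite lists are lists
print(subset_of(DEFS, "Top", "Inf"))  # False: Nil is not in Inf
```

    Per-alternative matching suffices here only because each right-hand side has at most one alternative per constructor; deciding inclusion for general unions is where the tree-automata machinery and the EXPTIME bound come in.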

    Enhanced 2-categories and limits for lax morphisms

    We study limits in 2-categories whose objects are categories with extra structure and whose morphisms are functors preserving the structure only up to a coherent comparison map, which may or may not be required to be invertible. This is done using the framework of 2-monads. In order to characterize the limits which exist in this context, we also need to consider the functors which do strictly preserve the extra structure. We show how such a 2-category of weak morphisms which is "enhanced", by specifying which of these weak morphisms are actually strict, can be thought of as a category enriched over a particular base cartesian closed category F. We give a complete characterization, in terms of F-enriched category theory, of the limits which exist in such 2-categories of categories with extra structure.

    Partial Horn logic and cartesian categories

    A logic is developed in which function symbols are allowed to represent partial functions. It has the usual rules of logic (in the form of a sequent calculus) except that the substitution rule has to be modified. It is developed here in its minimal form, with equality and conjunction, as "partial Horn logic". Various kinds of logical theory are equivalent: partial Horn theories, "quasi-equational" theories (partial Horn theories without predicate symbols), cartesian theories and essentially algebraic theories. The logic is sound and complete with respect to models in Set, and sound with respect to models in any cartesian (finite limit) category. The simplicity of the quasi-equational form allows an easy predicative constructive proof of the free partial model theorem for cartesian theories: that if a theory morphism is given from one cartesian theory to another, then the forgetful (reduct) functor from one model category to the other has a left adjoint. Various examples of quasi-equational theories are studied, including those of cartesian categories and of other classes of categories. For each quasi-equational theory, a second theory is constructed whose models are cartesian categories equipped with models of the first. Its initial model, the "classifying category" for the original theory, has properties similar to those of the syntactic category, but more precise with respect to strict cartesian functors.
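
    The flavor of partiality involved can be illustrated with a small sketch (my own encoding, not the paper's sequent calculus): read terms as possibly undefined values, compare them by Kleene equality (both sides defined and equal), and check one quasi-equational axiom of the theory of categories, where composition is defined exactly when domain and codomain match.

```python
# Hypothetical finite partial model of the theory of categories:
# two objects A, B; identities; one arrow f : A -> B.
OBJ = {"A", "B"}
ARR = {"idA": ("A", "A"), "idB": ("B", "B"), "f": ("A", "B")}

def comp(g, f):
    """Partial composition: defined only when cod(f) == dom(g)."""
    if ARR[f][1] != ARR[g][0]:
        return None                  # undefined: the guard fails
    if g.startswith("id"):
        return f                     # id . f = f
    if f.startswith("id"):
        return g                     # g . id = g
    return None                      # no other composites in this toy model

def kleene_eq(s, t):
    """Kleene equality: both sides defined and equal."""
    return s is not None and t is not None and s == t

# Axiom instance: for every arrow h, id_cod(h) . h = h (both sides defined).
for h, (_, cod) in ARR.items():
    assert kleene_eq(comp("id" + cod, h), h)

# Composition is genuinely partial: f . f is undefined since cod(f) != dom(f).
assert comp("f", "f") is None
```

    The point of the guarded ("quasi-equational") form is visible here: the equation is only asserted where its definedness precondition holds, which is what the modified substitution rule tracks.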

    A complete proof of the safety of Nöcker's strictness analysis

    This paper proves correctness of Nöcker's method of strictness analysis, implemented in the Clean compiler, which is an effective method of strictness analysis in lazy functional languages based on their operational semantics. We improve upon the work Clark, Hankin and Hunt did on the correctness of the abstract reduction rules. Our method fully considers the cycle detection rules, which are the main strength of Nöcker's strictness analysis. Our algorithm SAL is a reformulation of Nöcker's strictness analysis algorithm in a higher-order call-by-need lambda-calculus with case, constructors, letrec, and seq, extended by set constants like Top or Inf, denoting sets of expressions. It is also possible to define new set constants by recursive equations with a greatest fixpoint semantics. The operational semantics is a small-step semantics. Equality of expressions is defined by a contextual semantics that observes termination of expressions. Basically, SAL is a non-termination checker. The proof of its correctness, and hence of Nöcker's strictness analysis, is based mainly on an exact analysis of the lengths of normal order reduction sequences; the main measure is the number of 'essential' reductions in a normal order reduction sequence. Our tools and results provide new insights into call-by-need lambda-calculi, the role of sharing in functional programming languages, and strictness analysis in general. The correctness result provides a foundation for Nöcker's strictness analysis in Clean, and also for its use in Haskell.
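
    The cycle-detection idea can be illustrated with a toy non-termination checker: to see that a function like length forces the whole spine of its argument, abstractly reduce it on Inf; the reduction revisits the same abstract state, and that cycle is read as guaranteed non-termination. The encoding below is an assumption for illustration, not the SAL calculus itself.

```python
# Abstract argument values, mirroring the set constants of the abstract:
# "Bot" (non-terminating), "Inf" (lists with no final Nil), "Top" (anything).
def length_diverges(arg, seen=frozenset()):
    """Return True if `length arg` provably fails to terminate."""
    if arg in seen:
        return True                  # cycle detected: non-termination
    if arg == "Bot":
        return True                  # forcing Bot never terminates
    if arg == "Inf":
        # Inf unfolds only to Cons(Top, Inf): the Nil branch of the case
        # is dead, so length recurses on Inf again.
        return length_diverges("Inf", seen | {arg})
    return False                     # Top contains Nil, so no guarantee

print(length_diverges("Bot"))   # True  -> length is strict in its argument
print(length_diverges("Inf"))   # True  -> length forces the whole spine
print(length_diverges("Top"))   # False -> no divergence guarantee
```

    A real analysis must additionally justify that such abstract cycles soundly predict divergence of the concrete call-by-need reduction, which is exactly what the paper's length-of-reduction-sequence argument establishes.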

    On the safety of Nöcker's strictness analysis

    This paper proves correctness of Nöcker's method of strictness analysis, implemented for Clean, which is an effective method of strictness analysis in lazy functional languages based on their operational semantics. We improve upon the work of Clark, Hankin and Hunt, which addresses correctness of the abstract reduction rules. Our method also addresses the cycle detection rules, which are the main strength of Nöcker's strictness analysis. We reformulate Nöcker's strictness analysis algorithm in a higher-order lambda-calculus with case, constructors, letrec, and a nondeterministic choice operator used as a union operator. Furthermore, the calculus is expressive enough to represent abstract constants like Top or Inf. The operational semantics is a small-step semantics, and equality of expressions is defined by a contextual semantics that observes termination of expressions. The correctness of several reductions is proved using a context lemma and complete sets of forking and commuting diagrams. The proof is based mainly on an exact analysis of the lengths of normal order reductions. However, there remains a small gap: currently, the proof of correctness of strictness analysis requires the conjecture that our behavioral preorder is contained in the contextual preorder. The proof is valid without referring to the conjecture if no abstract constants are used in the analysis.

    Surviving success: policy reform and the future of industrial pollution in China

    China's recent industrial growth, a remarkable success story, has been clouded by hundreds of thousands of premature deaths and incidents of serious respiratory illness caused by exposure to industrial air pollution. Seriously contaminated by industrial discharges, many of China's waterways are largely unfit for direct human use. This damage could be substantially reduced at modest cost. Industrial reform combined with stricter environmental regulation has reduced organic water pollution in many areas and has curbed the growth of air pollution. But much higher levels of emissions controls (of particulates and sulfur dioxide) are warranted in China's polluted cities. For the cost-benefit analysis that led to this conclusion, the authors developed three scenarios projecting pollution damage under varying assumptions about future policies. Their findings are: Even if regulation is not tightened further, continued economic reform should have a powerful effect on pollution intensity. Organic water pollution will stabilize in many areas and actually decline in some. Air pollution will continue growing in most areas but at a much slower pace than industrial output. The cost of inaction would be high--most of China's waterways will remain heavily polluted, and many thousands of people will die or suffer serious respiratory damage. Continuing current trends in tightened regulation for water pollution will lead to sharp improvements; adopting an economically feasible policy of much stricter regulation will restore the health of many waterways. The stakes are even higher for air pollution because regulatory enforcement has weakened in many areas in the past five years. Reversing that trend will save many lives at extremely modest cost. China's National Environmental Protection Agency (NEPA) has recommended a tenfold increase in the air pollution levy; adopting NEPA's very conservative recommendation would produce a major turnaround in most cities.
For representative Chinese cities, a fiftyfold increase in the levy is probably warranted economically. To be cost-effective, heavy sources of particulate and sulfur dioxide emissions should be targeted for abatement. Reducing emissions from large private plants is so cheap that only significant abatement makes sense -- at least 70 percent abatement of sulfur dioxide and even greater abatement of particulates in large urban industrial facilities.

    On variational eigenvalue approximation of semidefinite operators

    Eigenvalue problems for semidefinite operators with infinite-dimensional kernels appear, for instance, in electromagnetics. Variational discretizations with edge elements have long been analyzed in terms of a discrete compactness property. As an alternative, we show here how the abstract theory can be developed in terms of a geometric property called the vanishing gap condition. This condition is shown to be equivalent to eigenvalue convergence and intermediate between two different discrete variants of Friedrichs estimates. Next we turn to a more practical means of checking these properties. We introduce a notion of compatible operator and show how the previous conditions are equivalent to the existence of such operators with various convergence properties. In particular, the vanishing gap condition is shown to be equivalent to the existence of compatible operators satisfying an Aubin-Nitsche estimate. Finally, we give examples demonstrating that the implications not shown to be equivalences indeed are not equivalences.
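
    For orientation, gap conditions of this kind are usually phrased via the standard gap between closed subspaces E and F of a Hilbert space; the precise quantity used in the paper may differ:

```latex
\delta(E, F) = \sup_{\substack{u \in E \\ \|u\| = 1}} \inf_{v \in F} \|u - v\|,
\qquad
\widehat{\delta}(E, F) = \max\{\delta(E, F),\ \delta(F, E)\}.
```

    A "vanishing gap" condition then asks that the gap between suitable discrete subspaces and their continuous counterparts tend to zero as the mesh is refined.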