    Short-run distributional effects of public education in Greece

    The present paper examines the short-run distributional impact of public education in Greece using the micro-data of the 2004/5 Household Budget Survey. The aggregate distributional impact of public education is found to be progressive, although the incidence varies according to the level of education under examination. In-kind transfers of public education services in the fields of primary and secondary education lead to a considerable decline in relative inequality, whereas transfers in the field of tertiary education appear to have a small distributional impact whose size and sign depend on the treatment of tertiary education students living away from the parental home. When absolute inequality indices are used instead of relative ones, primary education transfers retain their progressivity, while secondary education transfers appear almost neutral and tertiary education transfers become quite regressive. The main policy implications of the findings are outlined in the concluding section.
    Keywords: public education, redistribution

    Non-Linear Catching-up and Long-Run Convergence in the Agricultural Productivity of US States

    This note investigates convergence of agricultural total factor productivity (TFP) for 48 contiguous states in the US. This is carried out using a recently developed methodology which allows for a clear delineation between catching-up and long-run convergence, as well as for the presence of non-linearity in TFP differentials. According to the empirical results, the state TFP dynamics are predominantly long-run converging.
    Keywords: Productivity

    The distribution of full income in Greece

    Non-cash incomes from either private or public sources can have substantial effects on the distribution of economic welfare. However, standard approaches to inequality measurement either neglect them or take into account only selected non-monetary items. Using data for Greece in the mid-2000s, we show that it is possible to incorporate a comprehensive list of non-monetary components into the analysis of income inequality. The results indicate that inequality declines sharply when we move from the distribution of disposable monetary income to the distribution of full income, which includes both cash and non-cash incomes. Both private and public non-cash incomes are far more equally distributed than monetary income, but the inequality-reducing effect of publicly provided in-kind services is stronger. The structure of inequality changes when non-cash incomes are included in the concept of resources, but the effects are not dramatic. Non-cash incomes appear to accrue more heavily to younger and older individuals, thus reducing differences across age groups.
    Keywords: income distribution, imputed rent, in-kind public transfers

    On the Semantic Approaches to Boolean Grammars

    Boolean grammars extend context-free grammars by allowing conjunction and negation in rule bodies. This new formalism appears to be quite expressive while remaining efficient from a parsing point of view. Therefore, it seems reasonable to hope that Boolean grammars can lead to more expressive tools that facilitate the compilation process of modern programming languages. One important aspect of the theory of Boolean grammars is their semantics. More specifically, the existence of negation makes it difficult to define a simple derivation-style semantics (such as, for example, in the case of context-free grammars). A number of different semantic approaches have already been proposed in the literature. The purpose of this paper is to present the basic ideas behind each method and to identify certain interesting problems that can be the object of further study in this area.
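    The conjunction operator can be illustrated with the standard textbook example: the non-context-free language {aⁿbⁿcⁿ} is defined by the conjunctive rule S → AB & DC, whose two conjuncts are ordinary context-free bodies. The following Python sketch is an illustration of the intersection reading of conjunction, not code from the paper; it checks membership by testing both conjuncts:

```python
# S -> AB & DC, with A -> aA | eps, B -> bBc | eps,
#               D -> aDb | eps, C -> cC | eps.
# AB generates a^i b^n c^n; DC generates a^m b^m c^j;
# their intersection is exactly { a^n b^n c^n }.
import re

def matches_AB(w: str) -> bool:
    # AB: any run of a's, then equally many b's and c's.
    m = re.fullmatch(r"(a*)(b*)(c*)", w)
    return m is not None and len(m.group(2)) == len(m.group(3))

def matches_DC(w: str) -> bool:
    # DC: equally many a's and b's, then any run of c's.
    m = re.fullmatch(r"(a*)(b*)(c*)", w)
    return m is not None and len(m.group(1)) == len(m.group(2))

def matches_S(w: str) -> bool:
    # Conjunction in the rule body means the word must be
    # derivable from *both* conjuncts.
    return matches_AB(w) and matches_DC(w)
```

    Negation is handled analogously as complement, which is precisely what makes a derivation-style semantics hard to define.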

    Code Quality Evaluation Methodology Using The ISO/IEC 9126 Standard

    This work proposes a methodology for source code quality and static behaviour evaluation of a software system, based on the ISO/IEC 9126 standard. It uses elements automatically derived from source code, enhanced with expert knowledge in the form of quality characteristic rankings, allowing software engineers to assign weights to source code attributes. It is flexible in terms of the set of metrics and source code attributes employed, and even in terms of the ISO/IEC 9126 characteristics to be assessed. We applied the methodology to two case studies, involving five open-source systems and one proprietary system. Results demonstrated that the methodology can capture software quality trends and express expert perceptions concerning system quality in a quantitative and systematic manner.
    Comment: 20 pages, 14 figures
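    The weighting step described above can be sketched as a simple weighted aggregation of normalized metric scores into ISO/IEC 9126 characteristic scores. The metric names, weights, and normalization below are illustrative assumptions, not the paper's calibrated model:

```python
# Normalized metric scores in [0, 1], where 1 is best (hypothetical values
# that would come from static analysis of the source code).
metrics = {
    "cyclomatic_complexity": 0.7,
    "comment_density": 0.5,
    "coupling_between_objects": 0.8,
}

# Expert-assigned weights linking each metric to a 9126 characteristic
# (hypothetical numbers standing in for the expert rankings).
weights = {
    "Maintainability": {"cyclomatic_complexity": 0.5,
                        "comment_density": 0.3,
                        "coupling_between_objects": 0.2},
    "Reliability":     {"cyclomatic_complexity": 0.6,
                        "coupling_between_objects": 0.4},
}

def characteristic_score(char: str) -> float:
    """Weighted average of the metrics attached to one characteristic."""
    w = weights[char]
    total = sum(w.values())
    return sum(metrics[m] * wm for m, wm in w.items()) / total
```

    Because both the metric set and the weight table are plain data, swapping in different metrics or different 9126 characteristics leaves the aggregation code unchanged, which is the flexibility the abstract refers to.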

    Interactive Consistency in practical, mostly-asynchronous systems

    Interactive consistency is the problem in which n nodes, of which up to t may be Byzantine, each with its own private value, run an algorithm that allows all non-faulty nodes to infer the values of every other node. This problem is relevant to critical applications that rely on combining the opinions of multiple peers to provide a service. Examples include monitoring a content source to prevent equivocation or to track variability in the content provided, and resolving divergent state among the nodes of a distributed system. Previous works assume a fully synchronous system, where one can make strong assumptions such as negligible message delivery delays and/or detection of absent messages. However, practical, real-world systems are mostly asynchronous, i.e., they exhibit only some periods of synchrony during which message delivery is timely, and thus require a different approach. In this paper, we present a thorough study of practical interactive consistency. We leverage the vast prior work on broadcast and Byzantine consensus algorithms to design, implement and evaluate a set of algorithms, with varying timing assumptions and message complexity, that can be used to achieve interactive consistency in real-world distributed systems. We provide a complete, open-source implementation of each proposed interactive consistency algorithm by building a multi-layered stack of protocols that includes several broadcast protocols, as well as a binary and a multi-valued consensus protocol. Most of these protocols had never been implemented and evaluated in a real system before. We analyze the performance of our suite of algorithms experimentally by running both single instances and multiple parallel instances of each alternative.
    Comment: 13 pages, 10 figures
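    To make the problem statement concrete, here is a toy, fully synchronous simulation with n = 4 nodes and t = 1 Byzantine node: one direct broadcast round, one relay round, and a per-slot majority vote with a default on ties. This illustrates the definition only; it is not one of the paper's algorithms, which target partially synchronous systems:

```python
# Round 1: every node broadcasts its private value (the Byzantine node
# may equivocate, sending a different value to each receiver).
# Round 2: every node relays what it heard from the others.
# Each honest node then fills each slot of its vector by majority vote,
# falling back to a default symbol "?" when no strict majority exists.
from collections import Counter

N = 4
BYZANTINE = 0                   # which node lies (assumption for the demo)
values = ["x", "A", "B", "C"]   # node i's private value (node 0's is moot)

def direct(sender, receiver):
    # Round 1 message; the Byzantine node tells each receiver something else.
    if sender == BYZANTINE:
        return f"lie-to-{receiver}"
    return values[sender]

def relay(relayer, receiver, about):
    # Round 2 message; honest nodes relay faithfully, the Byzantine one lies.
    if relayer == BYZANTINE:
        return "junk"
    return direct(about, relayer)

def majority(reports, default="?"):
    val, cnt = Counter(reports).most_common(1)[0]
    return val if cnt > len(reports) // 2 else default

def decide(node):
    # Honest node `node` infers a vector containing every node's value.
    vector = {node: values[node]}
    for j in range(N):
        if j == node:
            continue
        reports = [direct(j, node)]
        reports += [relay(r, node, j) for r in range(N) if r not in (node, j)]
        vector[j] = majority(reports)
    return vector

honest = [i for i in range(N) if i != BYZANTINE]
decisions = [decide(i) for i in honest]
# All honest nodes end with identical vectors, correct in the honest slots.
```

    Against this adversary every honest node decides the same vector; in general, tolerating t Byzantine nodes in the synchronous model requires n ≥ 3t + 1 nodes and t + 1 rounds, and the partial-synchrony setting studied in the paper is harder still.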

    Mechanisms, Risk Factors, and Management of Acquired Long QT Syndrome: A Comprehensive Review

    Long QT syndrome is characterized by prolongation of the corrected QT (QTc) interval on the surface electrocardiogram and is associated with precipitation of torsade de pointes (TdP), a polymorphic ventricular tachycardia that may cause sudden death. Acquired long QT syndrome describes pathologic, excessive prolongation of the QT interval upon exposure to an environmental stressor, with reversion back to normal following removal of the stressor. The most common environmental stressor in acquired long QT syndrome is drug therapy. Acquired long QT syndrome is an important issue for clinicians and a significant public health problem, given the large number of drugs that carry this potentially fatal adverse effect, the large number of patients exposed to these drugs, and our inability to predict the risk for a given individual. In this paper, we focus on the mechanisms underlying QT prolongation and the risk factors for torsade de pointes, and we describe the short- and long-term treatment of acquired long QT syndrome.