    MalStone: Towards A Benchmark for Analytics on Large Data Clouds

    Developing data mining algorithms that are suitable for cloud computing platforms is currently an active area of research, as is developing cloud computing platforms appropriate for data mining. Currently, the most common benchmarks for cloud computing are Terasort and its relatives. Although the Terasort benchmark is quite useful, it was not designed for data mining per se. In this paper, we introduce a benchmark called MalStone that is specifically designed to measure the performance of cloud computing middleware that supports the type of data-intensive computing common when building data mining models. We also introduce MalGen, a utility for generating data on clouds that can be used with MalStone.
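
    As a concrete illustration of the kind of statistic a MalStone-style benchmark exercises, the sketch below computes, for each site in a visit log, the fraction of visiting entities that were ever flagged as compromised. This is a minimal single-machine sketch under assumed semantics: the record layout, the function name malstone_like_ratios and the toy records standing in for MalGen output are illustrative, not taken from the paper.

        from collections import defaultdict

        # Toy visit-log records standing in for MalGen output (hypothetical):
        # (site_id, entity_id, timestamp, compromise_flag)
        records = [
            ("site-a", "e1", 100, 0),
            ("site-a", "e2", 110, 0),
            ("site-b", "e1", 120, 0),
            ("site-a", "e3", 130, 1),  # entity e3 flagged as compromised
            ("site-b", "e3", 140, 1),
        ]

        def malstone_like_ratios(records):
            """Per-site fraction of visiting entities ever flagged as
            compromised -- assumed semantics for a MalStone-style statistic."""
            compromised = {e for _, e, _, flag in records if flag}
            visitors = defaultdict(set)
            for site, entity, _, _ in records:
                visitors[site].add(entity)
            return {site: len(ents & compromised) / len(ents)
                    for site, ents in visitors.items()}

        print(malstone_like_ratios(records))  # {'site-a': 0.333..., 'site-b': 0.5}

    At cloud scale this per-site grouping is what stresses the middleware; the benchmark's point is that, unlike Terasort, the computation joins records across time per entity.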

    Principals in Programming Languages: A Syntactic Proof Technique

    Programs are often structured around the idea that different pieces of code comprise distinct principals, each with a view of its environment. Typical examples include the modules of a large program, a host and its clients, or a collection of interactive agents. In this paper, we formalize this notion of principal in the programming language itself. The result is a language in which intuitive statements such as "the client must call open to obtain a file handle" can be phrased and proven formally. We add principals to variants of the simply-typed λ-calculus and show how we can track the code corresponding to each principal throughout evaluation. This multiagent calculus yields syntactic proofs of some type abstraction properties that traditionally require semantic arguments.
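
    The quoted property can be made concrete even outside the paper's calculus. The sketch below is a loose Python analogue, not the paper's formal system: a "host" principal exposes open_file and read, and a client principal can only obtain a usable handle by calling open_file. All names here are hypothetical, and Python enforces the abstraction only by convention, whereas the paper's typed λ-calculus makes it a provable property.

        class _FileHandle:
            """Opaque handle type; the leading underscore marks it as private
            to the host principal, so clients are not meant to build one."""
            def __init__(self, name):
                self._name = name

        def open_file(name: str) -> _FileHandle:
            # The only sanctioned way for a client principal to get a handle.
            return _FileHandle(name)

        def read(handle: _FileHandle) -> str:
            if not isinstance(handle, _FileHandle):
                raise TypeError("read requires a handle produced by open_file")
            return f"contents of {handle._name}"

        # Client code: must call open_file first, as the abstract demands.
        h = open_file("data.txt")
        print(read(h))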

    SensorWeb Evolution Using the Earth Observing One (EO-1) Satellite as a Test Platform

    The Earth Observing One (EO-1) satellite was launched in November 2000 as a one-year technology demonstration mission for a variety of space technologies. After the first year, in addition to collecting science data from its instruments, the EO-1 mission has been used as a testbed for a variety of technologies that provide automation capabilities, and as a pathfinder for the creation of SensorWebs. A SensorWeb is the integration of a variety of space, airborne and ground sensors into a loosely coupled, collaborative sensor system that automatically provides useful data products. Typically, a SensorWeb comprises heterogeneous sensors tied together with a messaging architecture and web services. This paper provides an overview of the various technologies that were tested and eventually folded into normal operations; as they were folded in, the nature of operations was transformed. The SensorWeb software enables easy connectivity for collaboration with sensors, with the side benefit of improved EO-1 operational efficiency. This paper presents the various phases of EO-1 operation over the past 12 years and reports the operational efficiency gains demonstrated by several metrics.
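
    To make the "loosely coupled sensors plus messaging" idea concrete, here is a minimal publish/subscribe sketch. It is purely illustrative: the MessageBus class, the topic names and the handler are hypothetical and far simpler than the web-service middleware the EO-1 SensorWeb actually used.

        from collections import defaultdict

        class MessageBus:
            """Toy synchronous message bus tying heterogeneous sensors to
            consumers without direct coupling between them."""
            def __init__(self):
                self._subscribers = defaultdict(list)

            def subscribe(self, topic, handler):
                self._subscribers[topic].append(handler)

            def publish(self, topic, payload):
                for handler in self._subscribers[topic]:
                    handler(payload)

        bus = MessageBus()

        # A ground-side consumer turns raw observations into a data product.
        bus.subscribe("observation/thermal",
                      lambda obs: print(f"data product from {obs['sensor']}: "
                                        f"hotspot at {obs['coords']}"))

        # A space-based sensor publishes an observation; it knows only the bus.
        bus.publish("observation/thermal",
                    {"sensor": "EO-1/ALI", "coords": (19.4, -155.3)})

    The point of the design is that sensors and consumers share only topic names and a payload convention, so new instruments can join the web without changes to existing participants.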

    Mining data from 1000 genomes to identify the causal variant in regions under positive selection

    The human genome contains hundreds of regions in which the patterns of genetic variation indicate recent positive natural selection, yet for most of these the underlying gene and the advantageous mutation remain unknown. We recently reported the development of a method, Composite of Multiple Signals (CMS), that combines tests for multiple signals of natural selection and increases resolution by up to 100-fold.
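
    The composite idea can be sketched as a product of per-test likelihood ratios: each selection test contributes evidence, and multiplying the ratios concentrates signal on the causal variant. The snippet below is a schematic of that idea only; the Gaussian score distributions, the parameter values and the function name cms_like_score are illustrative assumptions, not the published CMS specification.

        import math

        def gaussian(x, mu, sigma):
            return (math.exp(-((x - mu) ** 2) / (2 * sigma ** 2))
                    / (sigma * math.sqrt(2 * math.pi)))

        def cms_like_score(scores, params):
            """Product over tests of P(score | selected) / P(score | neutral):
            the composite-likelihood idea behind CMS (assumed form)."""
            ratio = 1.0
            for test, s in scores.items():
                mu_sel, sd_sel, mu_neu, sd_neu = params[test]
                ratio *= gaussian(s, mu_sel, sd_sel) / gaussian(s, mu_neu, sd_neu)
            return ratio

        # Hypothetical score distributions for two tests (e.g. iHS, XP-EHH).
        params = {"iHS": (2.0, 1.0, 0.0, 1.0), "XP-EHH": (1.5, 1.0, 0.0, 1.0)}
        snp_scores = {"iHS": 2.2, "XP-EHH": 1.8}
        print(cms_like_score(snp_scores, params))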

    Information Architecture for Large-Scale Websites

    Political Cleavages within Industry: Firm-Level Lobbying for Trade Liberalization

    [Work in Progress] Existing political economy models rely on inter-industry differences such as factor endowment or factor specificity to explain the politics of trade policy-making. However, this paper finds that a large proportion of the variation in U.S. applied tariff rates in fact arises within industries. I offer a theory of trade liberalization that explains how product differentiation in economic markets leads to firm-level lobbying in political markets. I argue that while high product differentiation eliminates the collective action problem that exporting firms confront, political objections to product-specific liberalization decline because of lower substitutability and the possibility of serving foreign markets under norms of reciprocity. To test this argument, I construct a new dataset on lobbying by all publicly traded manufacturing firms after parsing all 838,588 lobbying reports filed under the Lobbying Disclosure Act of 1995. I find that productive exporting firms are more likely to lobby to reduce tariffs, especially when their products are sufficiently differentiated. I also find that highly differentiated products have lower tariff rates. These results challenge the common focus on industry-level lobbying for protection.

    Half a Century of Wilson & Jungner: Reflections on the Governance of Population Screening

    Background: In their landmark report on the "Principles and Practice of Screening for Disease" (1968), Wilson and Jungner noted that the practice of screening is just as important for securing beneficial outcomes and avoiding harms as the formulation of principles. Many jurisdictions have since established various kinds of "screening governance organizations" to provide oversight of screening practice. Yet to date there has been relatively little reflection on the nature and organization of screening governance itself, or on how different governance arrangements affect the way screening is implemented and perceived and the balance of benefits and harms it delivers. Methods: We report on an international expert policy workshop convened by Sturdy, Miller and Hogarth. Results: While effective governance is essential to promote beneficial screening practices and avoid attendant harms, screening governance organizations face enduring challenges. These challenges are social and ethical as much as technical. Evidence-based adjudication of the benefits and harms of population screening must take account of factors that inform the production and interpretation of evidence, including the divergent professional, financial and personal commitments of stakeholders. Similarly, when planning and overseeing organized screening programs, screening governance organizations must persuade or compel multiple stakeholders to work together to a common end. Screening governance organizations in different jurisdictions vary widely in how they are constituted, how they relate to other interested organizations and actors, and what powers and authority they wield. Yet we know little about how these differences affect the way screening is implemented, and with what consequences. Conclusions: Systematic research into how screening governance is organized in different jurisdictions would facilitate policy learning to address enduring challenges. Even without such research, informal exchange and sharing of experiences between screening governance organizations can deliver invaluable insights into the social as well as the technical aspects of governance.

    Insurance-induced moral hazard: A dynamic model of within-year medical care decision making under uncertainty

    Insurance-induced moral hazard may lead individuals to overconsume medical care. Many studies estimate this overconsumption using models that aggregate medical care decisions up to the annual level. Using employer-employee matched data from the Medical Expenditure Panel Survey (MEPS), I estimate the effect of moral hazard on medical care expenditure using a dynamic model of within-year medical care consumption that allows for endogenous health transitions, variation in medical care prices, and individual uncertainty within a health insurance year. I then calculate moral hazard effects under a second set of conditions that are consistent with the assumptions of most annual decision-making models. The within-year decision-making model produces a moral hazard effect that is 24% larger than the alternative model's. I also provide evidence of heterogeneous moral hazard effects, particularly between insured and uninsured individuals, and discuss related policy implications. The paper concludes with a counterfactual policy simulation that implements the individual mandate provision of the 2010 Patient Protection and Affordable Care Act. I find that full implementation of the individual mandate decreases the percentage of uninsured individuals in the population analyzed from 11.8% to 6.0% and increases average medical care expenditure by 77% among the newly insured. JEL Classification: C61, D81, G22, I12, I1.
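
    A minimal version of a within-year decision problem can be written as a finite-horizon dynamic program: each month the agent decides whether to treat an illness, paying the full price before the deductible is met and only a coinsurance rate afterwards. Everything below (prices, probabilities, the utility loss, the twelve-period horizon) is an illustrative assumption, not the paper's estimated model.

        from functools import lru_cache

        T = 12             # months in the insurance year
        PRICE = 100.0      # full price of one care episode
        COINS = 0.2        # coinsurance rate once the deductible is met
        DEDUCTIBLE = 300.0
        SICK_LOSS = 150.0  # flow utility loss from an untreated illness
        P_SICK = 0.3       # chance of falling ill in any month

        @lru_cache(maxsize=None)
        def V(t, spent, sick):
            """Value in month t with cumulative spending `spent` toward the
            deductible and current illness status `sick`."""
            if t == T:
                return 0.0
            def expected(next_spent, loss):
                return -loss + (P_SICK * V(t + 1, next_spent, True)
                                + (1 - P_SICK) * V(t + 1, next_spent, False))
            if not sick:
                return expected(spent, 0.0)
            oop = PRICE if spent < DEDUCTIBLE else COINS * PRICE
            treat = -oop + expected(spent + PRICE, 0.0)
            wait = expected(spent, SICK_LOSS)
            return max(treat, wait)

        print(V(0, 0.0, False))

    Once cumulative spending crosses the deductible, the out-of-pocket price falls and treatment becomes more attractive; this within-year price dynamic is exactly what an annual-aggregate model averages away.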

    Critical thinking for 21st-century education: A cyber-tooth curriculum?

    It is often assumed that the advent of digital technologies requires fundamental change to the curriculum and to the teaching and learning approaches used in schools around the world to educate this generation of "digital natives" or the "net generation". This article analyses the concepts of 21st-century skills and critical thinking to understand how these aspects of learning might contribute to a 21st-century education. The author argues that, although both critical thinking and 21st-century skills are indeed necessary in a curriculum for a 21st-century education, they are not sufficient, even in combination. The role of knowledge and an understanding of differing cultural perspectives and values indicate that education should also fit local contexts in a global world and meet the specific needs of students in diverse cultures. It should also fit the particular technical and historical demands of the 21st century in relation to digital skills.