
    The Distributional Effects of Devolution in the U.S. Welfare Reform

    The passage of the PRWORA in 1996 devolved responsibility for the design of welfare programs from the federal government to state governments in the U.S. Some of the strategies implemented to achieve the main goals of the reform (promoting higher levels of labor participation and decreasing levels of welfare dependency) may have had the effect of reducing the protection received by the most vulnerable households and increasing differences in benefit levels across states. We estimate these effects using TANF data covering the two decades after the PRWORA's enactment. We measure the contribution of each state to inequality in adequacy rates. We provide an interpretation of the decomposition of the change in inequality in adequacy rates in terms of progressivity and re-ranking components, and we analyze the convergence in TANF adequacy rates. We also estimate the conditional convergence of adequacy ratios with respect to changes in labor participation, poverty rates, and caseloads. We find that differences in adequacy rates increased and that a downward divergence path took place following the devolution of welfare reform in the U.S.
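    The "downward divergence" finding combines two measurements over state adequacy rates. A minimal sketch, using invented illustrative numbers (not the paper's TANF data) and the coefficient of variation as a stand-in for its inequality measures:

```python
# Hedged sketch with hypothetical data, not the paper's TANF figures:
# "downward divergence" = the mean adequacy rate falls while cross-state
# dispersion rises (sigma-divergence).
import statistics

# Hypothetical adequacy rates for four states at two points in time.
adequacy_1996 = {"A": 0.60, "B": 0.55, "C": 0.50, "D": 0.45}
adequacy_2016 = {"A": 0.55, "B": 0.45, "C": 0.35, "D": 0.25}

def coefficient_of_variation(rates):
    """Dispersion relative to the mean; a simple scale-free inequality proxy."""
    values = list(rates.values())
    return statistics.pstdev(values) / statistics.mean(values)

mean_fell = (statistics.mean(adequacy_2016.values())
             < statistics.mean(adequacy_1996.values()))
dispersion_rose = (coefficient_of_variation(adequacy_2016)
                   > coefficient_of_variation(adequacy_1996))

# Downward divergence: lower average protection and larger cross-state gaps.
assert mean_fell and dispersion_rose
```

The paper's actual decomposition into progressivity and re-ranking components is richer than this single dispersion statistic; the sketch only fixes the direction of the two movements.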

    Integrating testing techniques through process programming

    Integration of multiple testing techniques is required to demonstrate high quality of software. Technique integration has three basic goals: incremental testing capabilities, extensive error detection, and cost-effective application. We are experimenting with the use of process programming as a mechanism for integrating testing techniques. Having set out to integrate DATA FLOW testing and RELAY, we proposed synergistic use of these techniques to achieve all three goals. We developed a testing process program much as we would develop a software product, from requirements through design to implementation and evaluation. We found process programming to be effective for explicitly integrating the techniques and achieving the desired synergism. Used in this way, process programming also mitigates many of the other problems that plague testing in the software development process.

    Branch-coverage testability transformation for unstructured programs

    Test data generation by hand is a tedious, expensive, and error-prone activity, yet testing is a vital part of the development process. Several techniques have been proposed to automate the generation of test data, but all of them are hindered by the presence of unstructured control flow. This paper addresses the problem using testability transformation. Testability transformation does not preserve the traditional meaning of the program; rather, it preserves test-adequate sets of input data. This requires new equivalence relations, which, in turn, entail novel proof obligations. The paper illustrates this using the branch coverage adequacy criterion and develops a branch adequacy equivalence relation and a testability transformation for restructuring. It then presents a proof that the transformation preserves branch adequacy.
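    The core idea, that a transformation need only preserve which inputs cover which branches, not the program's full semantics, can be illustrated with a toy pair of programs. This is a hedged sketch (both functions and the branch labels are invented for illustration; this is not the paper's transformation):

```python
# Hedged illustration: an "unstructured" program (a back-jump simulated
# with a loop) and a restructured version. A branch-coverage testability
# transformation only needs the same inputs to cover the same branches.

def unstructured(x):
    branches = set()
    while True:                      # simulated goto-style back-jump
        if x < 0:                    # branch b1
            branches.add("b1-true")
            x = -x
            continue                 # jump back (unstructured control flow)
        branches.add("b1-false")
        break
    if x > 10:                       # branch b2
        branches.add("b2-true")
    else:
        branches.add("b2-false")
    return branches

def restructured(x):
    branches = set()
    if x < 0:                        # branch b1, back-jump removed
        branches.add("b1-true")
        branches.add("b1-false")     # the loop always falls through once x >= 0
        x = -x
    else:
        branches.add("b1-false")
    if x > 10:                       # branch b2 unchanged
        branches.add("b2-true")
    else:
        branches.add("b2-false")
    return branches

# Branch-adequacy equivalence: every test input covers the same branch set
# in both versions, so a branch-adequate test set for one is adequate for
# the other.
for test_input in (-15, -3, 5, 20):
    assert unstructured(test_input) == restructured(test_input)
```

The paper's contribution is the formal equivalence relation and the proof obligation behind this intuition; the sketch only checks the property on a few inputs.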

    An empirical investigation into branch coverage for C programs using CUTE and AUSTIN

    Automated test data generation has remained a topic of considerable interest for several decades because it lies at the heart of attempts to automate the process of software testing. This paper reports the results of an empirical study using a dynamic symbolic-execution tool, CUTE, and a search-based tool, AUSTIN, on five non-trivial open-source applications. The aim is to provide practitioners with an assessment of what can be achieved by existing techniques with little or no specialist knowledge, and to provide researchers with baseline data against which to measure subsequent work. To achieve this, each tool is applied 'as is', with neither additional tuning nor supporting harnesses and with no adjustments applied to the subject programs under test. The mere fact that these tools can be applied 'out of the box' in this manner reflects the growing maturity of automated test data generation. However, as might be expected, the study reveals opportunities for improvement and suggests ways to hybridize these two approaches, which have hitherto been developed entirely independently. (C) 2010 Elsevier Inc. All rights reserved.

    A Symbolic Execution Algorithm for Constraint-Based Testing of Database Programs

    Full text link
    In so-called constraint-based testing, symbolic execution is a common technique used as part of the process to generate test data for imperative programs. Databases are ubiquitous in software, and testing of programs that manipulate databases is thus essential to enhance the reliability of software. This work proposes and experimentally evaluates a symbolic execution algorithm for constraint-based testing of database programs. First, we describe SimpleDB, a formal language that offers a minimal and well-defined syntax and semantics to model common interaction scenarios between programs and databases. Second, we detail the proposed algorithm for symbolic execution of SimpleDB models. This algorithm considers a SimpleDB program as a sequence of operations over a set of relational variables, modeling both the database tables and the program variables. By integrating this relational model of the program with classical static symbolic execution, the algorithm can generate a set of path constraints for any finite path to test in the control-flow graph of the program. Solutions of these constraints are test inputs for the program, including an initial content for the database. When the program is executed with respect to these inputs, it is guaranteed to follow the path with respect to which the constraints were generated. Finally, the algorithm is evaluated experimentally using representative SimpleDB models. Comment: 12 pages - preliminary work
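    The classical-symbolic-execution half of the pipeline can be sketched on a tiny program without the relational layer. This is a hedged illustration, not the paper's SimpleDB algorithm: the program, path, and brute-force "solver" are all invented, and a real tool would discharge the constraints to an SMT solver.

```python
# Hedged sketch: collect path constraints for one control-flow path of a
# toy program, then find a concrete input satisfying them. The solution is
# a test input guaranteed to drive execution down that path.

# Program under test:
#   def prog(x):
#       y = x + 3
#       if y > 10:            # path P takes the true branch...
#           if x % 2 == 0:    # ...then the true branch again
#               return "target"
#       return "other"

# Path constraints for path P, as predicates over the symbolic input x,
# with y = x + 3 substituted by the symbolic executor:
path_constraints = [
    lambda x: (x + 3) > 10,   # first branch condition
    lambda x: x % 2 == 0,     # second branch condition
]

def solve(constraints, domain=range(-50, 51)):
    """Naive solver: enumerate a small domain. A real constraint-based
    testing tool would use an SMT solver instead."""
    for x in domain:
        if all(c(x) for c in constraints):
            return x
    return None

test_input = solve(path_constraints)
assert test_input is not None

def prog(x):
    y = x + 3
    if y > 10:
        if x % 2 == 0:
            return "target"
    return "other"

# Executing on the generated input follows path P by construction.
assert prog(test_input) == "target"
```

The paper's extension is to make the database itself symbolic: the relational variables stand for table contents, so the solved constraints yield an initial database state as well as program inputs.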

    The development of a program analysis environment for Ada

    A unit-level Ada software module testing system, called Query Utility Environment for Software Testing of Ada (QUEST/Ada), is described. The project calls for the design and development of a prototype system. QUEST/Ada design began with a definition of the overall system structure and a description of component dependencies. The project team was divided into three groups to resolve the preliminary designs of the parser/scanner, the test data generator, and the test coverage analyzer. The Phase 1 report is a working document from which the system documentation will evolve. It provides history, a guide to report sections, a literature review, the definition of the system structure and high-level interfaces, descriptions of the prototype scope, the three major components, and the plan for the remainder of the project. The appendices include specifications, statistics, two papers derived from the current research, a preliminary users' manual, and the proposal and work plan for Phase 2.

    Adequacy of Economic Resources in Retirement and Returns-to-Scale in Consumption

    Most assessments of the adequacy of retirement resources are expressed as a comparison of pre-retirement income to immediate post-retirement income. Yet, among couples, a substantial fraction of retirement years is eventually spent by the surviving spouse living alone. To the extent that singles need less than couples to maintain the same standard of living, assessments of the adequacy of economic resources that make no adjustment for widowing will systematically misstate economic preparation. We estimate returns-to-scale parameters in spending by older households, using data from the Consumption and Activities Mail Survey, and apply these to assessments of the adequacy of retirement resources.
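    The returns-to-scale mechanism can be made concrete with a standard equivalence-scale formula. This is a hedged sketch: the functional form n**theta is a common convention, and the parameter value below is an assumption for illustration, not an estimate from the paper.

```python
# Hedged illustration of returns-to-scale in consumption: with scale
# parameter theta (theta=0.7 here is an ASSUMED value, not the paper's
# estimate), a household of n members needs n**theta times a single
# person's spending to reach the same standard of living.

def equivalent_single_spending(household_spending, n_members, theta=0.7):
    """Spending a single person would need for the same living standard."""
    return household_spending / (n_members ** theta)

# A couple spending $40,000/year:
single_equivalent = equivalent_single_spending(40_000, 2, theta=0.7)

# With theta < 1 there are economies of scale, so the surviving spouse
# needs MORE than half of the couple's spending to maintain the same
# standard of living. This is why ignoring widowing misstates adequacy.
assert single_equivalent > 40_000 / 2
```

Comparing the widow's actual resources to this equivalent rather than to half the couple's spending is the kind of adjustment the abstract argues for.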

    Pension systems and reform conceptual framework

    The World Bank's conceptual framework to assess pension systems and reform options evaluates initial conditions and the capacity to improve the enabling environment, and then focuses on how best to work within these to achieve the core objectives of pension systems: protection against the risk of poverty in old age and smoothing consumption from one's working life into retirement. The Bank applies a multi-pillar approach to pension system modalities to address the needs of target populations, including: (i) a non-contributory 'zero pillar' extending some level of old-age income security to all of the elderly; (ii) an appropriately sized mandatory 'first pillar' with the objective of replacing some portion of lifetime pre-retirement income through contributions linked to earnings; (iii) a funded mandatory defined-contribution 'second pillar' that typically provides privately managed individual savings accounts; (iv) a funded voluntary 'third pillar'; and (v) a non-financial 'fourth pillar.' The primary evaluation criteria are the ability of the system to maintain adequacy, affordability, sustainability, equity, predictability, and robustness. The secondary evaluation criteria are the system's capacity to minimize labor market distortions, contribute to savings mobilization, and contribute to financial market development. Because pension benefits are claims against future economic output, it is essential that, over time, pension systems contribute to growth and output to support the promised benefits. Going forward, the Bank is focusing on strengthening its support in: (a) establishing a clearer results framework to assess pension systems and reforms; (b) enhancing knowledge management, including research and learning; and (c) improving implementation capacity. Keywords: Emerging Markets; Debt Markets; Environmental Economics & Policies; Pensions & Retirement Systems; Banks & Banking Reform
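    The five-pillar taxonomy above is essentially a small classification scheme. As a hedged sketch, it can be written down as a data structure (the dataclass and field names are an illustration of this reading, not a World Bank artifact; descriptions are paraphrased from the abstract):

```python
# Hedged sketch: the multi-pillar taxonomy as a simple data structure.
# Field values follow the abstract's descriptions; the encoding itself
# is an assumption for illustration.
from dataclasses import dataclass

@dataclass
class Pillar:
    number: int
    mandatory: bool
    funded: bool
    objective: str

PILLARS = [
    Pillar(0, mandatory=False, funded=False,
           objective="non-contributory basic old-age income security for all"),
    Pillar(1, mandatory=True, funded=False,
           objective="earnings-linked replacement of pre-retirement income"),
    Pillar(2, mandatory=True, funded=True,
           objective="privately managed defined-contribution accounts"),
    Pillar(3, mandatory=False, funded=True,
           objective="voluntary retirement savings"),
    Pillar(4, mandatory=False, funded=False,
           objective="non-financial support (e.g. family, housing, services)"),
]

# Per the abstract, exactly the first and second pillars are mandatory.
assert [p.number for p in PILLARS if p.mandatory] == [1, 2]
```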