
    Empirical Validation of Agent Based Models: A Critical Survey

    This paper addresses the problem of finding the appropriate method for conducting empirical validation in agent-based (AB) models, which is often regarded as the Achilles’ heel of the AB approach to economic modelling. The paper has two objectives: first, to identify key issues facing AB economists engaged in empirical validation; second, to critically appraise the extent to which alternative approaches deal with these issues. We identify a first set of issues that are common to both AB and neoclassical modellers and a second set that is specific to AB modellers. This second set is captured in a novel taxonomy, which takes into consideration the nature of the object under study, the goal of the analysis, the nature of the modelling assumptions, and the methodology of the analysis. Having identified the nature and causes of heterogeneity in empirical validation, we examine three important approaches to validation that have been developed in AB economics: indirect calibration, the Werker-Brenner approach, and the history-friendly approach. We also discuss a set of open questions within empirical validation, including the trade-off between empirical support and tractability of findings, over-parameterisation, unconditional objects, counterfactuals, and the non-neutrality of data.

    Keywords: empirical validation, agent-based models, calibration, history-friendly modelling

    Study on the feasibility of a tool to measure the macroeconomic impact of structural reforms

    The main aim of this study is to assess the feasibility of empirical tools for studying the impact of structural reforms on macroeconomic performance in the member countries of the European Union (EU). This report presents the results of the project "Study on the feasibility of a tool to measure the macroeconomic impact of structural reforms" (ECFIN-E/2005/001) and amalgamates the findings of the two previous interim reports and the main conclusions of the workshop held in Brussels on May 11th. The main goal of the project is to determine the most reliable and robust methods for investigating the impacts of economy-wide structural reforms as well as reforms in individual markets or sectors, to suggest how best to implement them, and to propose possible improvements to the institutional dataset. In addition, a roadmap has been created that covers the main steps of the model-development process, and solutions feasible even in the short term are discussed. The most relevant conclusion to be drawn from the study is that the most appropriate tool that can be developed in the short term is the integration of a DSGE model (preferably QUEST, given its in-house availability) with different satellite models, to be developed.

    Keywords: structural reforms, product markets, labour markets, financial markets. Authors: Dreger, Artís, Moreno, Ramos, Suriñach

    A posteriori disclosure risk measure for tabular data based on conditional entropy

    Statistical database protection, also known as Statistical Disclosure Control (SDC), is a part of information security which tries to prevent published statistical information (tables, individual records) from disclosing the contribution of specific respondents. This paper deals with the assessment of the disclosure risk associated with the release of tabular data. So-called sensitivity rules are currently used to measure the disclosure risk for tables. These rules operate on an a priori basis: the data are examined and the rules are used to decide whether the data can be released as they stand or should rather be protected. In this paper, we propose to complement a priori risk assessment with a posteriori risk assessment in order to achieve a higher level of security, that is, we propose to take the protected information into account when measuring the disclosure risk. The proposed a posteriori disclosure risk measure is compatible with a broad class of disclosure protection methods and can be extended to compute disclosure risk for a set of linked tables. In the case of linked-table protection via cell suppression, the proposed measure allows detection of secondary suppression patterns which offer more protection than others.
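    The abstract's core idea, that protection can be measured by the uncertainty an attacker retains after seeing the published table, can be sketched with conditional entropy. The function below is an illustrative assumption, not the paper's actual measure: it treats a suppressed cell as uniformly distributed over its feasible set, so its entropy in bits grows with the size of that set.

```python
import math

def uniform_conditional_entropy(feasible_values):
    """Entropy (bits) of a suppressed cell assumed uniform over its feasible set.

    Higher entropy means an attacker is less certain about the true value,
    i.e. the suppression pattern offers more protection.
    """
    n = len(set(feasible_values))
    if n == 0:
        raise ValueError("feasible set must be non-empty")
    return math.log2(n)

# Two candidate suppression patterns for the same cell: the pattern that
# leaves the larger feasible set leaves the attacker with more residual
# uncertainty, so it would be preferred a posteriori.
pattern_a = uniform_conditional_entropy(range(10, 20))  # 10 feasible values
pattern_b = uniform_conditional_entropy(range(10, 12))  # 2 feasible values
assert pattern_a > pattern_b
```

    Under this uniform assumption, comparing entropies across suppression patterns reduces to comparing feasible-set sizes, which is what makes the measure cheap to evaluate after protection.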

    Cost-Benefit Analysis and Regulatory Reform: An Assessment of the Science and Art

    The continuing efforts in the 104th Congress to legislate requirements for cost-benefit analysis (CBA) and the revised Office of Management and Budget guidelines for the conduct of such assessments during a regulatory rulemaking process highlight the need for a comprehensive examination of the role that CBA can play in agency decision-making. This paper summarizes the state of knowledge regarding CBA and offers suggestions for improvement in its use, especially in the context of environmental regulations.

    Assessing the poverty impacts of remittances with alternative counterfactual income estimates

    We estimate the impacts of remittances on poverty with survey data from Tonga, a poor Pacific island country highly dependent on international migrants’ remittances. The sensitivity of the poverty impacts to the estimation method is tested using two methods to estimate migrants’ counterfactual incomes: bootstrap prediction with self-selection testing, and propensity score matching. We find consistency between the two methods, both showing a substantial reduction in the incidence and depth of poverty with migration and remittances. With further robustness checks, there is strong evidence that the poorest households benefit from migrants’ remittances, and that increased migration opportunities can contribute to poverty alleviation.

    Weights in multidimensional indices of well-being: an overview

    Multidimensional indices are becoming increasingly important instruments to assess the well-being of societies. They move beyond the focus on a single indicator, yet remain easy to present and communicate. A crucial step in the construction of a multidimensional index of well-being is the selection of the relative weights for the different dimensions. The aim of this paper is to study the role of these weights and to critically survey eight different approaches to setting them. We categorize the approaches into three classes: data-driven, normative, and hybrid weighting, and compare their respective advantages and drawbacks.

    Keywords: composite indicator, multidimensional well-being index, weights
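    To make the role of the weights concrete, here is a minimal sketch of a weighted composite index. The dimension scores and both weighting schemes are hypothetical illustrations, not values from the paper; the point is only that the index value (and hence country rankings) moves with the normative weighting choice.

```python
def composite_index(scores, weights):
    """Weighted average of already-normalised dimension scores."""
    if abs(sum(weights) - 1.0) > 1e-9:
        raise ValueError("relative weights must sum to one")
    return sum(s * w for s, w in zip(scores, weights))

# Hypothetical country with three normalised dimensions
# (e.g. health, education, income), each scored in [0, 1].
scores = [0.8, 0.6, 0.7]
equal_index = composite_index(scores, [1/3, 1/3, 1/3])     # normative: equal weights
expert_index = composite_index(scores, [0.5, 0.25, 0.25])  # normative: expert judgement
# Different weighting schemes give different index values for the same data.
```

    Data-driven schemes would instead derive the weight vector from the data themselves (for instance from principal components), which is exactly the kind of trade-off the survey compares.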

    Thirty years of optimization-based SDC methods for tabular data

    In 1966 Bacharach published in Management Science a work on matrix rounding problems in two-way tables of economic statistics, formulated as a network optimization problem. This is likely the first application of optimization/operations research to statistical disclosure control (SDC) in tabular data. Years later, in 1982, Cox and Ernst used the same approach in a work in INFOR for a similar problem: controlled rounding. And thirty years ago, in 1992, a paper by Kelly, Golden and Assad appeared in Networks about the solution of the cell suppression problem, also using network optimization. Cell suppression was used for years as the main SDC technique for tabular data, and it was an active field of research which resulted in several lines of work and many publications. The above are some of the seminal works on the use of optimization methods for SDC when releasing tabular data. This paper discusses some of the research done in this field since then, with a focus on the approaches that were of practical use. It also discusses their pros and cons compared to recent techniques that are not based on optimization methods.

    Ratings and rankings: Voodoo or Science?

    Composite indicators aggregate a set of variables using weights which are understood to reflect the variables’ importance in the index. In this paper we propose to measure the importance of a given variable within existing composite indicators via Karl Pearson’s ‘correlation ratio’; we call this measure ‘main effect’. Because socio-economic variables are heteroskedastic and correlated, (relative) nominal weights are hardly ever found to match (relative) main effects; we propose to summarize their discrepancy with a divergence measure. We further discuss to what extent the mapping from nominal weights to main effects can be inverted. This analysis is applied to five composite indicators, including the Human Development Index and two popular league tables of university performance. It is found that in many cases the declared importance of single indicators and their main effect are very different, and that the data correlation structure often prevents developers from obtaining the stated importance, even when modifying the nominal weights in the set of nonnegative numbers with unit sum.
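    The ‘main effect’ here is Pearson’s correlation ratio, Var(E[y | x]) / Var(y), where y is the index and x a single input variable. The estimator below is a minimal standard-library sketch (the binning scheme and the example data are illustrative assumptions, not the authors’ estimator): it bins x, takes the conditional mean of y in each bin, and relates the variance of those means to the overall variance of y.

```python
import statistics

def main_effect(x, y, bins=10):
    """Estimate the correlation ratio Var(E[y | x]) / Var(y) by binning x.

    Returns a value near 1 when y is driven almost entirely by x,
    and near 0 when the conditional mean of y does not depend on x.
    """
    lo, hi = min(x), max(x)
    width = (hi - lo) / bins or 1.0           # avoid zero width for constant x
    groups = {}
    for xi, yi in zip(x, y):
        b = min(int((xi - lo) / width), bins - 1)
        groups.setdefault(b, []).append(yi)
    n = len(y)
    grand_mean = statistics.mean(y)
    # Between-bin variance of the conditional means, weighted by bin size.
    between = sum(len(g) * (statistics.mean(g) - grand_mean) ** 2
                  for g in groups.values()) / n
    return between / statistics.pvariance(y)
```

    A nominal weight states the importance a developer intends; the main effect measures the importance the data actually deliver, which is why the two can diverge under correlation and heteroskedasticity.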

    Does Swaziland have a proletariat?
