Representative time use data and new harmonised calibration of the American Heritage Time Use Data (AHTUD) 1965-1999
Representative and reliable individual time use data, in connection with a proper set of socio-economic background variables, are essential for the empirical foundation and evaluation of existing and new theories in general, and for time use analyses in particular. Within the international project Assessing Time Use Survey Datasets, several potentially useful individual US time use heritage datasets have been identified for use in developing a historical series of non-market accounts. In order to evaluate the series of American Heritage Time Use Data (AHTUD) (1965, 1975, 1985, 1992-94, 1998-99), this paper analyses the representativeness of these data under the given weights and provides a new harmonised calibration of the AHTUD for sound time use analyses. Our calibration procedure, implemented in the ADJUST program package, is theoretically founded on information theory, is consistent with simultaneous weighting including hierarchical data, ensures the desired positive weights, and is well suited and available for any time use data calibration of interest. We present the calibration approach and provide new harmonised weights for all AHTUD surveys based on a substantively driven calibration framework. To illustrate the various application possibilities of a calibration, we finally disentangle demographic versus time use behavioural changes and developments by re-calibrating all five AHTUD surveys using 1965 population totals as a benchmark.
Keywords: representative time use data, calibration (adjustment re-weighting) of microdata, information theory, minimum information loss principle, American Heritage Time Use Data (AHTUD), ADJUST program package
Representative Time Use Data and Calibration of the American Time Use Studies 1965-1999
Valid and reliable individual time use data, in connection with an appropriate set of socio-economic background variables, are essential elements of an empirical foundation and evaluation of existing time use theories and of the search for new empirically based hypotheses about individual behavior. Within the Yale project Assessing American Heritage Time Use Studies (1965, 1975, 1985, 1992-94 and 1998/99), supported by the Glaser Foundation, it is necessary, when working with these time use studies, to be sure that the data are comparably representative. As will become evident, all of these files show a serious bias in demographic characteristics, characteristics which are important for substantive time use research analyses. Our study and new calibration solution circumvent these biases by delivering a comprehensive demographic adjustment for all incorporated U.S. time use surveys. The adjustment is theoretically founded (here on information theory and the minimum information loss principle, implemented in the ADJUST program package), is consistent through simultaneous weighting including hierarchical data, meets the substantive requirements of time use research analyses, and uses the same, and thus comparable, demographic adjustment characteristics for all U.S. time use files, which supports substantive analyses and allows one to disentangle demographic versus time use behavioral changes and developments.
Keywords: time use, calibration (adjustment re-weighting) of microdata, information theory, minimum information loss principle, American Heritage Time Use Studies, ADJUST program package
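The minimum information loss calibration described in these two abstracts can be sketched, in a deliberately simplified form, as multiplicative raking: weights are scaled so that weighted category totals match known population margins, which minimises the information-theoretic distance from the initial weights. The example below is a toy illustration only; the actual ADJUST package additionally handles hierarchical household-person weighting and other constraints that this sketch omits, and the data and margins are invented.

```python
import numpy as np

def mil_calibrate(X, d, totals, iters=200, tol=1e-10):
    """Minimum-information-loss calibration via multiplicative raking (toy).

    Finds weights w close to the design weights d (in the Kullback-Leibler
    sense, sum w_i*log(w_i/d_i)) subject to X.T @ w = totals, by scaling
    each constraint group in turn.  X is an (n, k) 0/1 indicator matrix.
    Positivity of the weights is preserved automatically.
    """
    w = d.astype(float).copy()
    for _ in range(iters):
        for j in range(X.shape[1]):
            mask = X[:, j] > 0
            current = w[mask].sum()
            if current > 0:
                w[mask] *= totals[j] / current
        if np.max(np.abs(X.T @ w - totals)) < tol:
            break
    return w

# Invented toy survey: 6 respondents, indicators for (female, under-40).
X = np.array([[1, 1], [1, 0], [0, 1], [0, 0], [1, 1], [0, 0]])
d = np.ones(6)                    # initial design weights
totals = np.array([3.6, 2.4])     # assumed known population margins
w = mil_calibrate(X, d, totals)
```

The iteration converges because each rescaling is the closed-form minimiser of the information loss for one constraint at a time; cycling over constraints yields the joint solution.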
Computational Mechanism Design: A Call to Arms
Game theory has developed powerful tools for analyzing decision making in systems with multiple autonomous actors. These tools, when tailored to computational settings, provide a foundation for building multiagent software systems. This tailoring gives rise to the field of computational mechanism design, which applies economic principles to the design of computer systems.
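A canonical illustration of applying economic principles to computation (a textbook example, not drawn from this abstract) is the Vickrey second-price sealed-bid auction, under which bidding one's true value is a dominant strategy:

```python
def vickrey_auction(bids):
    """Single-item second-price sealed-bid auction.

    The highest bidder wins but pays only the second-highest bid,
    which makes truthful bidding a dominant strategy for every agent.
    bids: dict mapping bidder id -> bid amount.
    """
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner, _ = ranked[0]
    price = ranked[1][1] if len(ranked) > 1 else 0.0
    return winner, price

winner, price = vickrey_auction({"a": 10.0, "b": 7.0, "c": 4.0})
# winner "a" pays the second-highest bid, 7.0
```

Mechanisms like this are attractive in multiagent software precisely because agents need no model of each other's strategies to behave well.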
Distributional impacts of carbon taxation and revenue recycling: a behavioural microsimulation. ESRI WP626, June 2019
Carbon taxation is a regressive policy, which contributes to public opposition towards it. We employ the Exact Affine Stone Index demand system to examine the extent to which carbon taxation in Ireland reduces emissions, as well as its distributional impacts. The Engel curves for various commodity groupings are found to be non-linear, which makes our chosen demand system more suitable than other methods in the extant literature. We find that a carbon tax increase can decrease emissions, but is indeed regressive. Recycling the revenues to households mitigates these regressive effects. A targeted allocation that directs the revenues towards less affluent households reduces inequality more than a flat allocation that divides the revenues equally amongst all households; both methods, however, are capable of mitigating the regressive effects of the tax increase.
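The flat-versus-targeted recycling comparison can be illustrated with simple arithmetic on a hypothetical four-household economy (the incomes, tax amounts, and Gini-based inequality measure below are illustrative assumptions, not results from the ESRI model):

```python
def gini(x):
    """Gini coefficient of a list of non-negative incomes."""
    xs = sorted(x)
    n = len(xs)
    cum = sum((i + 1) * v for i, v in enumerate(xs))
    return (2 * cum) / (n * sum(xs)) - (n + 1) / n

incomes = [200.0, 400.0, 600.0, 800.0]   # hypothetical weekly incomes
tax = [10.0, 12.0, 14.0, 16.0]           # regressive: larger share of low incomes
after_tax = [y - t for y, t in zip(incomes, tax)]

revenue = sum(tax)
# flat recycling: equal lump sum to every household
flat = [y + revenue / len(incomes) for y in after_tax]
# targeted recycling: all revenue split between the two poorest households
targeted = [after_tax[0] + revenue / 2, after_tax[1] + revenue / 2,
            after_tax[2], after_tax[3]]
```

On these numbers the tax alone raises the Gini coefficient, flat recycling brings it back down, and targeted recycling lowers it further, mirroring the qualitative ordering the abstract reports.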
Notes on Cloud computing principles
This letter provides a review of the fundamental distributed systems and economic principles behind Cloud computing. These principles are frequently deployed in their respective fields, but their inter-dependencies are often neglected. Given that Cloud computing is, first and foremost, a new business model, a new model for selling computational resources, the understanding of these concepts is facilitated by treating them in unison. Here, we review some of the most important concepts and how they relate to each other.
Power Load Management as a Computational Market
Power load management enables energy utilities to reduce peak loads and thereby save money. Due to the large number of different loads, power load management is a complicated optimization problem. We present a new decentralized approach to this problem by modeling direct load management as a computational market. Our simulation results demonstrate that our approach is very efficient, with a superlinear rate of convergence to equilibrium and excellent scalability, requiring few iterations even when the number of agents is on the order of one thousand. A framework for the analysis of this and similar problems is given, which shows how nonlinear optimization and numerical mathematics can be exploited to characterize, compare, and tailor problem-solving strategies in market-oriented programming.
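The market-based idea can be sketched with a minimal tatonnement loop (an illustrative toy, not the paper's algorithm): each load agent demands power as a decreasing function of price, and an auctioneer raises the price while aggregate demand exceeds supply and lowers it otherwise, converging to the market-clearing price. All coefficients below are invented.

```python
def clear_market(a_coeffs, b_coeffs, supply, price=1.0, step=0.05, iters=2000):
    """Tatonnement price adjustment for a toy power-load market.

    Agent i demands d_i(p) = max(0, (a_i - p) / b_i), the maximiser of
    the quadratic utility a_i*d - 0.5*b_i*d**2 - p*d.  The price moves
    in proportion to excess demand until the market clears.
    """
    demand = 0.0
    for _ in range(iters):
        demand = sum(max(0.0, (a - price) / b)
                     for a, b in zip(a_coeffs, b_coeffs))
        excess = demand - supply
        if abs(excess) < 1e-9:
            break
        price += step * excess
    return price, demand

price, demand = clear_market(a_coeffs=[10.0, 8.0, 6.0],
                             b_coeffs=[1.0, 2.0, 1.0],
                             supply=9.0)
```

With these coefficients aggregate demand is 20 - 2.5p while all agents are active, so the clearing price is 4.4; the update rule is a contraction and the loop converges geometrically, echoing the fast convergence the abstract reports for the (different, decentralized) method of the paper.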
Decentralized Markets versus Central Control: A Comparative Study
Multi-Agent Systems (MAS) promise to offer solutions to problems where established, older paradigms fall short. In order to validate such claims, which are repeatedly made in software agent publications, empirical in-depth studies of the advantages and weaknesses of multi-agent solutions versus conventional ones in practical applications are needed. Climate control in large buildings is one application area where multi-agent systems, and market-oriented programming in particular, have been reported to be very successful, although central control solutions are still the standard practice. We have therefore constructed and implemented a variety of market designs for this problem, as well as different standard control engineering solutions. This article gives a detailed analysis and comparison, so as to learn about the differences between standard and agent-based approaches and to yield new insights about the benefits and limitations of computational markets. An important outcome is that "local information plus market communication produces global control".
Microsimulation - A Survey of Methods and Applications for Analyzing Economic and Social Policy
The essential dimensions of microsimulation as an instrument for analyzing and forecasting the individual impacts of alternative economic and social policy measures are surveyed in this study. The basic principles of microsimulation, a tool for practical policy advising as well as for research and teaching, are pointed out, and the static and dynamic (cross-section and life-cycle) approaches are compared with one another. Present and past developments of microsimulation models and their areas of application are reviewed, focusing on the US, Europe and Australia. Building on the general requirements and components of microsimulation models, a model's actual working mechanism is discussed via a concrete example: the concept and realization of MICSIM, a PC microsimulation model based on a relational database system and an offspring of the Sfb 3 Static Microsimulation Model. Common issues of microsimulation modeling are considered: the micro/macro link, behavioural response, and the important question of evaluating microsimulation results. The concluding remarks accentuate the increasing use of microcomputers for microsimulation models, also for teaching purposes.
Keywords: microsimulation, microanalytic simulation models, microanalysis, economic and social policy analysis
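The static microsimulation approach surveyed here can be sketched in a few lines (a toy illustration with invented records and tax rules, not the MICSIM system): a policy rule is applied to each micro record and the results are grossed up to the population with survey weights.

```python
def simulate(records, tax_rule):
    """Apply a tax rule to each micro record and gross up with weights.

    records: list of dicts with 'income' and 'weight' (grossing-up factor).
    tax_rule: function mapping income -> tax liability.
    Returns total revenue at the population level.
    """
    return sum(r["weight"] * tax_rule(r["income"]) for r in records)

# hypothetical survey micro-records
records = [
    {"income": 20000.0, "weight": 1000.0},
    {"income": 50000.0, "weight": 800.0},
    {"income": 90000.0, "weight": 200.0},
]

def baseline(y):
    return 0.20 * y                       # flat 20% tax

def reform(y):
    # two-bracket reform: 15% up to 40k, 30% above
    return 0.15 * min(y, 40000.0) + 0.30 * max(0.0, y - 40000.0)

revenue_change = simulate(records, reform) - simulate(records, baseline)
```

Comparing the two runs record by record is exactly what lets a static microsimulation report winners and losers of a reform, not just the aggregate revenue change.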
Welfarism in economic domains
In economies with public goods, and agents with quasi-linear preferences, we give a characterization of the welfare egalitarian correspondence in terms of three axioms: Pareto optimality, symmetry, and solidarity. This last property requires that an increase in the willingness to pay for the public goods of some of the agents should not decrease the welfare of any of them.
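In the standard quasi-linear setting the characterized solution can be sketched as follows (a textbook formalization under assumed notation, with utilities u_i = v_i(y) + x_i for public good level y, money transfers x_i, and a cost function c(y) not named in the abstract): efficiency fixes the public good level, and transfers then equalize welfare.

```latex
% Efficiency picks the surplus-maximising public good level:
y^{*} \in \arg\max_{y} \Bigl[\, \sum_{i=1}^{n} v_i(y) - c(y) \Bigr]
% Welfare egalitarianism then splits the surplus equally via transfers:
u_i = v_i(y^{*}) + x_i
    = \frac{1}{n} \Bigl[\, \sum_{j=1}^{n} v_j(y^{*}) - c(y^{*}) \Bigr]
    \qquad \text{for all } i .
```

Solidarity is visible in this form: raising some v_j raises the maximised surplus, and since every agent receives a 1/n share, no agent's welfare falls.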