
    Measurement in Economics and Social Science

    The paper discusses measurement, primarily in economics, from both analytical and historical perspectives. The historical section traces the commitment to ordinalism on the part of economic theorists from the doctrinal disputes between classical economics and marginalism, through the struggle of orthodox economics against socialism, down to the Cold War alliance between mathematical social science and anti-communist ideology. In economics the commitment to ordinalism led to the separation of theory from the quantitative measures that are computed in practice: price and quantity indexes, consumer surplus and real national product. The commitment to ordinality entered political science, via Arrow's 'impossibility theorem', effectively merging it with economics and ensuring its sterility. How can a field that has as its central result the impossibility of democracy contribute to the design of democratic institutions? The analytical part of the paper deals with the quantitative measures mentioned above. I begin with the conceptual clarification that what these measures try to achieve is a restoration of the money metric that is lost when prices are variable. I conclude that there is only one measure that can be embedded in a satisfactory economic theory, free from unreasonable restrictions: the Törnqvist index, as an approximation to its theoretical counterpart, the Divisia index. The statistical agencies have at various times produced different measures for real national product and its components, as well as related concepts. I argue that all of these are flawed and that a single deflator should be used for the aggregate and the components. Ideally this should be a chained Törnqvist price index defined on aggregate consumption. The social sciences are split. The economic approach is abstract, focused on the assumption of rational and informed behavior, and tends to the political right. The sociological approach is empirical, stresses the non-rational aspects of human behavior and tends to the political left. I argue that the split is due to the fact that the empirical and theoretical traditions were never joined in the social sciences as they were in the natural sciences. I also argue that measurement can potentially help in healing this split.
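    For reference, the standard textbook definitions (not reproduced in the abstract itself): the Törnqvist price index between periods 0 and 1, built from prices p_i, quantities q_i and expenditure shares s_{i,t}, is

        \ln P_T(0,1) \;=\; \sum_i \tfrac{1}{2}\,\bigl(s_{i,0}+s_{i,1}\bigr)\,\ln\frac{p_{i,1}}{p_{i,0}},
        \qquad
        s_{i,t} \;=\; \frac{p_{i,t}\,q_{i,t}}{\sum_j p_{j,t}\,q_{j,t}},

    and it is the discrete-time approximation to the Divisia index defined by \mathrm{d}\ln P_D = \sum_i s_i(t)\,\mathrm{d}\ln p_i(t). A chained index, as advocated above, multiplies the bilateral indexes over successive adjacent periods.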

    Geometric factors in the Bohr–Rosenfeld analysis of the measurability of the electromagnetic field

    The geometric factors in the field commutators and spring constants of the measurement devices in the famous analysis of the measurability of the electromagnetic field by Bohr and Rosenfeld are calculated using a Fourier–Bessel method for the evaluation of folding integrals, which enables one to obtain the general geometric factors as a Fourier–Bessel series. When the space region over which the factors are defined is spherical, the Fourier–Bessel series terms are given by elementary functions, and, using the standard Fourier-integral method of calculating folding integrals, the geometric factors can be evaluated in terms of manageable closed-form expressions. Comment: 21 pages, REVTeX.
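    As background, the standard convolution (folding) theorem underlying the Fourier-integral method mentioned above (textbook material, not a result of this paper):

        I(\mathbf{r}) \;=\; \int \mathrm{d}^3 r'\, f(\mathbf{r}')\, g(\mathbf{r}-\mathbf{r}')
        \qquad\Longleftrightarrow\qquad
        \tilde{I}(\mathbf{k}) \;=\; \tilde{f}(\mathbf{k})\,\tilde{g}(\mathbf{k}),

    so once the functions entering a folding integral are expanded in a convenient basis (here a Fourier–Bessel series), the integral reduces to products of transforms and a single inverse transform.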

    Uncertainty quantification of coal seam gas production prediction using Polynomial Chaos

    A surrogate model approximates a computationally expensive solver. Polynomial Chaos is a method to construct surrogate models by summing combinations of carefully chosen polynomials. The polynomials are chosen to respect the probability distributions of the uncertain input variables (parameters); this allows for both uncertainty quantification and global sensitivity analysis. In this paper we apply these techniques to a commercial solver for the estimation of peak gas rate and cumulative gas extraction from a coal seam gas well. The polynomial expansion is shown to honour the underlying geophysics with low error when compared to the much more complex and computationally slower commercial solver. We make use of advanced numerical integration techniques to achieve this accuracy using relatively small amounts of training data.
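    The following is a minimal one-dimensional sketch of the polynomial chaos construction described above, assuming a single standard-normal input and a hypothetical expensive_solver() standing in for the commercial reservoir simulator; it is illustrative only, not the paper's implementation.

        import numpy as np
        from numpy.polynomial import hermite_e as He   # probabilists' Hermite polynomials
        from math import factorial, sqrt, pi

        def expensive_solver(x):
            # Placeholder for the costly simulation, e.g. peak gas rate as a function of x.
            return np.exp(0.3 * x) + 0.1 * x ** 2

        order = 6                                    # truncation order of the expansion
        nodes, weights = He.hermegauss(order + 1)    # Gauss-Hermite rule, weight exp(-x^2/2)
        f_vals = expensive_solver(nodes)             # small number of training runs

        # Project onto He_n: c_n = E[f(X) He_n(X)] / n!, with the expectation done by quadrature.
        coeffs = np.array([
            np.sum(weights * f_vals * He.hermeval(nodes, np.eye(order + 1)[n]))
            / (sqrt(2 * pi) * factorial(n))
            for n in range(order + 1)
        ])

        def surrogate(x):
            # Cheap polynomial replacement for the solver.
            return He.hermeval(x, coeffs)

        # Uncertainty quantification comes directly from the coefficients:
        mean = coeffs[0]
        variance = sum(coeffs[n] ** 2 * factorial(n) for n in range(1, order + 1))
        print(mean, variance)

    In higher dimensions the same projection is done against products of univariate polynomials matched to each input distribution, and the decomposition of the variance over the coefficients yields the global (Sobol') sensitivity indices.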

    Parallelization of adaptive MC Integrators

    Monte Carlo (MC) methods for numerical integration seem to be embarrassingly parallel at first sight. However, when adaptive schemes are applied in order to enhance convergence, the seemingly most natural way of replicating the whole job on each processor can ruin the adaptive behaviour. Using the popular VEGAS algorithm as an example, an economical method of semi-micro parallelization with variable grain size is presented and contrasted with a straightforward approach of macro-parallelization. A portable implementation of this semi-micro parallelization is used in the xloops project and is made publicly available. Comment: 10 pages, LaTeX2e, 1 pstricks figure included and 2 eps figures inserted via epsfig. To appear in Comput. Phys. Commun.
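    A toy illustration of the idea (not the xloops/VEGAS code; function and variable names are hypothetical): only the expensive integrand evaluations of each iteration are split across worker processes, while a single adaptive grid is refined on the master, so the adaptation is neither duplicated nor diluted as it would be if the whole job were replicated per processor.

        import numpy as np
        from concurrent.futures import ProcessPoolExecutor

        def integrand(x):
            # Sharply peaked test function on [0, 1].
            return np.exp(-50.0 * (x - 0.5) ** 2)

        def evaluate_chunk(xs):
            # Work done by one worker: just the (expensive) function evaluations.
            return integrand(xs)

        def adaptive_integrate(n_iter=5, n_samples=40_000, n_bins=50, n_workers=4):
            edges = np.linspace(0.0, 1.0, n_bins + 1)   # single shared adaptive grid
            probs = np.full(n_bins, 1.0 / n_bins)       # current bin-sampling probabilities
            estimates = []
            with ProcessPoolExecutor(max_workers=n_workers) as pool:
                for _ in range(n_iter):
                    # Importance-sample a bin, then a point uniformly within that bin.
                    bins = np.random.choice(n_bins, size=n_samples, p=probs)
                    xs = np.random.uniform(edges[bins], edges[bins + 1])
                    # Semi-micro step: split only the function evaluations across workers.
                    chunks = np.array_split(xs, n_workers)
                    fs = np.concatenate(list(pool.map(evaluate_chunk, chunks)))
                    widths = edges[bins + 1] - edges[bins]
                    estimates.append(np.mean(fs * widths / probs[bins]))   # IS estimator
                    # Adapt the grid on the master using the results from all workers.
                    mass = np.bincount(bins, weights=np.abs(fs), minlength=n_bins)
                    probs = (mass + 1e-12) / (mass.sum() + 1e-12 * n_bins)
            return np.mean(estimates)

        if __name__ == "__main__":
            print(adaptive_integrate())

    The grain size is controlled by how the samples of a single iteration are chunked across the workers.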

    Using Simulation-based Inference with Panel Data in Health Economics

    Panel datasets provide a rich source of information for health economists, offering the scope to control for individual heterogeneity and to model the dynamics of individual behaviour. However, the qualitative or categorical measures of outcome often used in health economics create special problems for estimating econometric models. Allowing a flexible specification of the autocorrelation induced by individual heterogeneity leads to models involving higher-order integrals that cannot be handled by conventional numerical methods. The dramatic growth in computing power over recent years has been accompanied by the development of simulation-based estimators that solve this problem. This review uses binary choice models to show what can be done with conventional methods and how the range of models can be expanded by using simulation methods. Practical applications of the methods are illustrated using data on health from the British Household Panel Survey (BHPS).
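    A minimal sketch of the kind of simulation-based estimator discussed here: a random-effects probit for a binary health outcome, with the individual effect integrated out by averaging over simulation draws (maximum simulated likelihood). The data, regressor and settings below are synthetic and hypothetical, not the BHPS application.

        import numpy as np
        from scipy.stats import norm
        from scipy.optimize import minimize

        rng = np.random.default_rng(0)
        N, T, R = 500, 5, 100                      # individuals, waves, simulation draws

        # Synthetic panel: one regressor x, unobserved heterogeneity alpha, binary outcome y.
        x = rng.normal(size=(N, T))
        alpha = 0.8 * rng.normal(size=(N, 1))
        y = (0.5 * x + alpha + rng.normal(size=(N, T)) > 0).astype(float)

        draws = rng.normal(size=(N, 1, R))         # fixed draws for the simulator

        def neg_sim_loglik(theta):
            beta, log_sigma = theta
            sigma = np.exp(log_sigma)
            index = beta * x[:, :, None] + sigma * draws        # shape (N, T, R)
            p_it = norm.cdf((2 * y[:, :, None] - 1) * index)    # P(observed choice) per draw
            sim_lik = p_it.prod(axis=1).mean(axis=1)            # average over the R draws
            return -np.sum(np.log(sim_lik + 1e-300))

        result = minimize(neg_sim_loglik, x0=np.array([0.0, 0.0]), method="Nelder-Mead")
        beta_hat, sigma_hat = result.x[0], np.exp(result.x[1])
        print(beta_hat, sigma_hat)

    More flexible specifications, for example autocorrelated errors across waves, replace the product of independent probits with a multivariate normal probability, typically simulated with the GHK algorithm or quasi-random (Halton) draws.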

    Techniques for computing two-loop QCD corrections to b-->c transitions

    We have recently presented the complete O(alpha_s^2) corrections to the semileptonic decay width of the b quark at maximal recoil. Here we discuss various technical aspects of that calculation and further applications of similar methods. In particular, we describe an expansion which facilitates the phase space integrations and the treatment of the mixed real-virtual corrections, for which a Taylor expansion does not work and the so-called eikonal expansion must be employed. Several terms of the expansion are given for the O(alpha_s^2) QCD corrections to the differential semileptonic decay width of the b quark at maximal recoil. We also demonstrate how the light quark loop corrections to the top quark decay rate can be obtained using the same methods. We briefly discuss the application of these techniques to the calculation of the O(alpha_s^2) correction to zero recoil sum rules for heavy flavor transitions. Comment: 22 pages, REVTeX.

    On hyperlogarithms and Feynman integrals with divergences and many scales

    It was observed that hyperlogarithms provide a tool to carry out Feynman integrals. So far, this method has been applied successfully to finite single-scale processes, but it can be employed in more general situations. We give examples of integrations of three- and four-point integrals in Schwinger parameters with non-trivial kinematic dependence, involving setups with off-shell external momenta and internal propagators of different masses. The full set of Feynman graphs amenable to parametric integration is not yet understood, and we discuss some counterexamples to the crucial property of linear reducibility. Furthermore we clarify how divergent integrals can be approached in dimensional regularization with this algorithm. Comment: 26 pages, 11 figures, 2 tables, explicit results in ancillary file "results" and on http://www.math.hu-berlin.de/~panzer/ (version as in JHEP; link corrected).
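    For orientation, the standard definition of hyperlogarithms as iterated integrals (textbook material, not specific to this paper):

        L_{\sigma_1 \sigma_2 \cdots \sigma_n}(z) \;=\; \int_0^z \frac{\mathrm{d}t}{t-\sigma_1}\, L_{\sigma_2 \cdots \sigma_n}(t),
        \qquad L(z) \equiv 1 \ \text{for the empty word},
        \qquad L_{\underbrace{0\cdots 0}_{n}}(z) \;=\; \frac{\ln^n z}{n!},

    with the letters \sigma_i taken from a fixed set of singular points. Parametric integration carries out the Schwinger-parameter integrals one variable at a time while staying inside this class of functions, which is guaranteed when the graph has the property of linear reducibility mentioned above.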