    Likelihood decision functions

    In both classical and Bayesian approaches, statistical inference is unified and generalized by the corresponding decision theory. This is not the case for the likelihood approach to statistical inference, in spite of the manifest success of the likelihood methods in statistics. The goal of the present work is to fill this gap, by extending the likelihood approach in order to cover decision making as well. The resulting decision functions, called likelihood decision functions, generalize the usual likelihood methods (such as ML estimators and LR tests), in the sense that these methods appear as the likelihood decision functions in particular decision problems. In general, the likelihood decision functions maintain some key properties of the usual likelihood methods, such as equivariance and asymptotic optimality. By unifying and generalizing the likelihood approach to statistical inference, the present work offers a new perspective on statistical methodology and on the connections among likelihood methods.
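    As a minimal, hedged illustration of the two standard likelihood methods that the abstract says are recovered as special cases, the Python sketch below casts ML estimation and an LR test as decisions driven purely by the likelihood function; the Gaussian model, the toy data, the parameter grid, and the 5% asymptotic cutoff are illustrative assumptions, not the paper's formal construction.

```python
# Illustrative sketch only: ML estimation and an LR test written as decisions
# that depend on the data solely through the likelihood function.
import numpy as np
from scipy import stats

data = np.array([1.2, 0.8, 1.9, 1.4, 0.6])           # toy sample (assumed)
loglik = lambda mu: stats.norm.logpdf(data, loc=mu, scale=1.0).sum()

# ML estimation: choose the parameter value with the highest likelihood.
grid = np.linspace(-3, 3, 601)
mu_ml = grid[np.argmax([loglik(m) for m in grid])]

# LR test of H0: mu = 0 against an unrestricted alternative; the decision
# ("reject" or "retain") is a function of the likelihood ratio alone.
lr_stat = 2 * (loglik(mu_ml) - loglik(0.0))
reject = lr_stat > stats.chi2.ppf(0.95, df=1)         # asymptotic 5% cutoff

print(f"ML estimate: {mu_ml:.2f}, LR statistic: {lr_stat:.2f}, reject H0: {reject}")
```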

    On maxitive integration

    The Shilkret integral is maxitive (i.e., the integral of a pointwise supremum of functions is the supremum of their integrals), but defined only for nonnegative functions. In the present paper, some properties of this integral (such as subadditivity and a law of iterated expectations) are studied, in comparison with the additive and Choquet integrals. Furthermore, the definition of a maxitive integral for all real functions is discussed. In particular, a convex, maxitive integral is introduced and some of its properties are derived.
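    For reference, the standard form of the Shilkret integral and the maxitivity property mentioned in the abstract can be written as follows; this is the usual textbook definition, not a quotation from the paper.

```latex
% Standard form of the Shilkret integral of a nonnegative function f with
% respect to a maxitive measure \mu (textbook definition, not quoted from
% the paper):
\[
  \int^{\mathrm{Sh}} f \,\mathrm{d}\mu
  \;=\;
  \sup_{t > 0} \; t \, \mu\bigl(\{\, x : f(x) \ge t \,\}\bigr) ,
\]
% and the maxitivity property stated in the abstract:
\[
  \int^{\mathrm{Sh}} \Bigl(\sup_{i \in I} f_i\Bigr) \mathrm{d}\mu
  \;=\;
  \sup_{i \in I} \int^{\mathrm{Sh}} f_i \,\mathrm{d}\mu .
\]
```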

    On maxitive integration

    A functional is said to be maxitive if it commutes with the (pointwise) supremum operation. Such functionals find application in particular in decision theory and related fields. In the present paper, maxitive functionals are characterized as integrals with respect to maxitive measures (also known as possibility measures or idempotent measures). These maxitive integrals are then compared with the usual additive and nonadditive integrals on the basis of some important properties, such as convexity, subadditivity, and the law of iterated expectations.
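    On a finite space the characterization described above becomes very concrete: a possibility measure is determined by a possibility distribution, and the maxitive (Shilkret-type) integral of a nonnegative function reduces to a weighted maximum. The following Python sketch checks maxitivity numerically; the finite space, the possibility distribution, and the random test functions are illustrative assumptions, not material from the paper.

```python
# Finite-space sketch: a possibility measure is given by a possibility
# distribution pi (with maximum 1), and the maxitive integral of a
# nonnegative vector f reduces to max_x f(x) * pi(x).
import numpy as np

rng = np.random.default_rng(0)
pi = np.array([1.0, 0.7, 0.4, 0.2])                   # possibility distribution (assumed)

def maxitive_integral(f, pi):
    """Shilkret-type integral of a nonnegative vector f w.r.t. pi."""
    return np.max(f * pi)

# Maxitivity: the integral commutes with the pointwise supremum.
f = rng.uniform(size=4)
g = rng.uniform(size=4)
lhs = maxitive_integral(np.maximum(f, g), pi)
rhs = max(maxitive_integral(f, pi), maxitive_integral(g, pi))
assert np.isclose(lhs, rhs)

# By contrast, an additive integral with weights pi is generally not maxitive.
additive = lambda f: np.sum(f * pi)
print(additive(np.maximum(f, g)), max(additive(f), additive(g)))   # generally differ
```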

    Transverse fracture properties of green wood and the anatomy of six temperate tree species

    The aim of this study was to investigate the effect of wood anatomy and density on the mechanics of fracture when wood is split in the radial-longitudinal (RL) and tangential-longitudinal (TL) fracture systems. The specific fracture energies (Gf, in J m^-2) of the trunk wood of six tree species were studied in the green state using double-edge notched tensile tests. The fracture surfaces were examined in both systems using Environmental Scanning Electron Microscopy (ESEM). Wood density and ray characteristics were also measured. The results showed that Gf in RL was greater than in TL for five of the six species. In particular, the greatest degree of anisotropy was observed in Quercus robur L., and the lowest in Larix decidua Mill. ESEM micrographs of fractured specimens suggested reasons for the anisotropy and differences across tree species. In the RL system, fractures broke across rays, the walls of which unwound like tracheids in longitudinal-tangential (LT) and longitudinal-radial (LR) failure, producing a rough fracture surface which would absorb energy, whereas in the TL system, fractures often ran alongside rays.

    Robust regression with imprecise data

    We consider the problem of regression analysis with imprecise data. By imprecise data we mean imprecise observations of precise quantities in the form of sets of values. In this paper, we explore a recently introduced likelihood-based approach to regression with such data. The approach is very general, since it covers all kinds of imprecise data (i.e. not only intervals) and it is not restricted to linear regression. Its result consists of a set of functions, reflecting the entire uncertainty of the regression problem. Here we study in particular a robust special case of the likelihood-based imprecise regression, which can be interpreted as a generalization of the method of least median of squares. Moreover, we apply it to data from a social survey, and compare it with other approaches to regression with imprecise data. It turns out that the likelihood-based approach is the most generally applicable one and is the only approach accounting for multiple sources of uncertainty at the same time.
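    For orientation, the classical least median of squares method that the abstract refers to (for precise data) can be sketched as follows; the grid search, the simulated data, and the outlier fraction are illustrative assumptions, and this is not the paper's procedure for imprecise data.

```python
# Sketch of classical least median of squares (LMS) for precise data, the
# baseline that the abstract says is generalized.  Grid search keeps the
# illustration short; it is not the paper's algorithm, and the data are made up.
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(0, 10, size=30)
y = 2.0 * x + 1.0 + rng.normal(scale=0.5, size=30)
y[:5] += 15.0                                         # a few gross outliers

def lms_objective(a, b):
    """Median of squared residuals for the line y = a + b*x."""
    return np.median((y - (a + b * x)) ** 2)

# Coarse grid over (intercept, slope); LMS picks the pair whose *median*
# squared residual is smallest, so the outliers have little influence.
A, B = np.meshgrid(np.linspace(-5, 5, 101), np.linspace(-1, 4, 101))
scores = np.vectorize(lms_objective)(A, B)
ia, ib = np.unravel_index(np.argmin(scores), scores.shape)
print("LMS fit: intercept ~", round(A[ia, ib], 2), "slope ~", round(B[ia, ib], 2))
```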

    On the implementation of LIR: the case of simple linear regression with interval data

    This paper considers the problem of simple linear regression with interval-censored data. That is, n pairs of intervals are observed instead of the n pairs of precise values for the two variables (dependent and independent). Each of these intervals is closed but possibly unbounded, and contains the corresponding (unobserved) value of the dependent or independent variable. The goal of the regression is to describe the relationship between (the precise values of) these two variables by means of a linear function. Likelihood-based Imprecise Regression (LIR) is a recently introduced, very general approach to regression for imprecisely observed quantities. The result of a LIR analysis is in general set-valued: it consists of all regression functions that cannot be excluded on the basis of likelihood inference. These regression functions are said to be undominated. Since the interval data can be unbounded, a robust regression method is necessary. Hence, we consider the robust LIR method based on the minimization of the residuals' quantiles. For this method, we prove that the set of all the intercept-slope pairs corresponding to the undominated regression functions is the union of finitely many polygons. We give an exact algorithm for determining this set (i.e., for determining the set-valued result of the robust LIR analysis), and show that it has worst-case time complexity O(n^3 log n). We have implemented this exact algorithm as part of the R package linLIR.
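    One ingredient of the robust LIR analysis can be sketched concretely: for interval (possibly unbounded) observations and a candidate line, the absolute residuals are themselves intervals, and their lower and upper endpoints are the quantities whose quantiles enter the dominance comparison. The Python sketch below computes such interval residuals; the data, the candidate line, and the quantile index are illustrative assumptions, and the paper's exact undominated-set algorithm is not reproduced here.

```python
# Rough sketch of interval-valued residuals for a candidate line y = a + b*x
# and rectangular observations [xl, xu] x [yl, yu].  The choice of quantile
# indices and the dominance comparison used to obtain the undominated set
# depend on the likelihood cutoff and are not reproduced; data are made up.
import numpy as np

def interval_residual(box, a, b):
    """Return (lower, upper) bounds of |y - (a + b*x)| over the box."""
    xl, xu, yl, yu = box
    fl, fu = sorted((a + b * xl, a + b * xu))         # range of the line over [xl, xu]
    lower = max(0.0, yl - fu, fl - yu)                # 0 if the bands overlap, else the gap
    upper = max(abs(c - d) for c in (yl, yu) for d in (fl, fu))
    return lower, upper

boxes = [(0, 1, 0.5, 2.0), (2, 3, 4.0, 6.0), (5, 5, 9.5, 11.0), (7, 8, 14.0, np.inf)]
a, b = 0.5, 2.0                                       # candidate intercept and slope

lowers, uppers = zip(*(interval_residual(box, a, b) for box in boxes))
k = 2                                                 # illustrative quantile index
print("k-th smallest lower residual:", sorted(lowers)[k - 1])
print("k-th smallest upper residual:", sorted(uppers)[k - 1])
```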

    Likelihood-based Imprecise Regression

    We introduce a new approach to regression with imprecisely observed data, combining likelihood inference with ideas from imprecise probability theory, and thereby taking different kinds of uncertainty into account. The approach is very general and applicable to various kinds of imprecise data, not only to intervals. In the present paper, we propose a regression method based on this approach, where no parametric distributional assumption is needed and interval estimates of quantiles of the error distribution are used to identify plausible descriptions of the relationship of interest. Therefore, the proposed regression method is very robust. We apply our robust regression method to an interesting question in the social sciences. The analysis, based on survey data, yields a relatively imprecise result, reflecting the high amount of uncertainty inherent in the analyzed data set.
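    As a rough indication of how likelihood-based interval estimates of quantiles can be obtained without parametric assumptions, the sketch below uses the standard nonparametric (profile) likelihood for a quantile based on order statistics; the sample, the target quantile, and the likelihood cutoff are illustrative assumptions, and the paper's construction for imprecisely observed data is more general.

```python
# Distribution-free, likelihood-based interval estimate of a p-quantile
# (sketch for orientation only).  The nonparametric profile likelihood ratio
# for "k of the n observations lie below the p-quantile" is
#   p^k (1-p)^(n-k) / ((k/n)^k ((n-k)/n)^(n-k)),
# and counts k with a ratio above the cutoff translate into an interval of
# order statistics.  Sample, p and the cutoff beta are made up.
import numpy as np

rng = np.random.default_rng(2)
z = np.sort(rng.normal(size=25))                      # toy "residual" sample
n, p, beta = len(z), 0.5, 0.15                        # target quantile and likelihood cutoff

k = np.arange(n + 1)
with np.errstate(divide="ignore", invalid="ignore"):
    log_lik = k * np.log(p) + (n - k) * np.log(1 - p)
    khat = k / n
    log_max = np.where((khat > 0) & (khat < 1),
                       k * np.log(khat) + (n - k) * np.log(1 - khat), 0.0)
ratio = np.exp(log_lik - log_max)                     # normalized profile likelihood

plausible = np.flatnonzero(ratio >= beta)             # counts compatible with the cutoff
lo, hi = plausible.min(), plausible.max()
print("interval estimate of the median: (",
      z[lo - 1] if lo > 0 else -np.inf, ",", z[hi] if hi < n else np.inf, ")")
```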

    The alpha-effect in rotating convection: a comparison of numerical simulations

    Numerical simulations are an important tool in furthering our understanding of turbulent dynamo action, a process that occurs in a vast range of astrophysical bodies. It is important in all computational work that comparisons are made between different codes and, if non-trivial differences arise, that these are explained. Kapyla et al. (2010: MNRAS 402, 1458) describe an attempt to reproduce the results of Hughes & Proctor (2009: PRL 102, 044501) and, by employing a different methodology, they arrive at very different conclusions concerning the mean electromotive force and the generation of large-scale fields. Here we describe why the simulations of Kapyla et al. (2010) are simply not suitable for a meaningful comparison, since they solve different equations, at different parameter values and with different boundary conditions. Furthermore, we describe why the interpretation of Kapyla et al. (2010) of the calculation of the alpha-effect is inappropriate, and argue that the generation of large-scale magnetic fields by turbulent convection remains a problematic issue.

    Classical BV theories on manifolds with boundary

    In this paper we extend the classical BV framework to gauge theories on spacetime manifolds with boundary. In particular, we connect the BV construction in the bulk with the BFV construction on the boundary, and we develop its extension to strata of higher codimension in the case of manifolds with corners. We present several examples, including electrodynamics, Yang-Mills theory, and topological field theories coming from the AKSZ construction, in particular the Chern-Simons theory, the BF theory, and the Poisson sigma model. This paper is the first step towards developing the perturbative quantization of such theories on manifolds with boundary in a way consistent with gluing.
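    For orientation only, the closed-manifold statement that the paper generalizes is recalled below: in the classical BV framework the action on the odd-symplectic space of fields and antifields satisfies the classical master equation. How this equation is modified by boundary terms and matched with the boundary BFV data is the subject of the paper and is not reproduced here.

```latex
% Standard closed-manifold statement only (the boundary modification
% developed in the paper is not reproduced here):
\[
  (S, S) \;=\; 0 ,
\]
% where $(\,\cdot\,,\,\cdot\,)$ denotes the antibracket (odd Poisson bracket)
% on the space of fields and antifields, and $S$ is the BV action.
```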

    Positive operator valued measures covariant with respect to an irreducible representation

    Given an irreducible representation of a group G, we show that all the covariant positive operator valued measures based on G/Z, where Z is a central subgroup, are described by trace-class, trace-one positive operators.
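    The covariance condition referred to in the abstract is the standard one recalled below, with U the irreducible representation of G and E the positive operator valued measure on G/Z; this is the textbook definition, not a statement of the paper's result.

```latex
% Standard definition of covariance for a POVM E on G/Z with respect to a
% unitary representation U of G (textbook form, not the paper's theorem):
\[
  U(g)\, E(B)\, U(g)^{*} \;=\; E(g \cdot B)
  \qquad \text{for every } g \in G \text{ and every Borel set } B \subseteq G/Z .
\]
```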