
    A New Relative Skill Measure for Games with Chance Elements

    An interesting aspect of games is the relative extent to which a player can positively influence his results by making appropriate strategic choices. This question is closely related to the issue of how to distinguish between games of skill and games of chance, a distinction that is definitely interesting from a juridical point of view. Borm and Van der Genugten (2001) presented a method to measure the skill level of a game; in principle, their measure can serve as a juridical tool for the classification of games with respect to skill. In this paper we present a modification of the measure. The main difference is that this new definition does not automatically classify incomplete-information games without chance moves as games of skill. We use a coin game and a simplified version of standard draw poker as illustrations.
    Keywords: games of skill; games of chance
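    The abstract does not spell out the measure itself; as a purely illustrative sketch, the snippet below assumes the Borm-Van der Genugten form, in which skill is the learning effect (optimal player versus beginner) relative to the sum of learning and random effects (the latter gauged by a fictive player who foresees the chance moves). The payoff numbers are invented placeholders, not values from the paper's coin game or draw-poker examples.

```python
# Hypothetical sketch of a relative skill measure in the spirit of
# Borm and Van der Genugten (2001); all numbers are illustrative.

def relative_skill(v_beginner: float, v_optimal: float, v_fictive: float) -> float:
    """Skill = learning effect / (learning effect + random effect).

    v_beginner: expected result of a beginner playing naively
    v_optimal:  expected result of an optimal (expert) player
    v_fictive:  expected result of a fictive player who also knows the
                outcomes of the chance moves in advance
    """
    learning_effect = v_optimal - v_beginner
    random_effect = v_fictive - v_optimal
    return learning_effect / (learning_effect + random_effect)

# A pure chance game has v_optimal == v_beginner (skill 0); a pure skill
# game has v_fictive == v_optimal (skill 1).
print(relative_skill(v_beginner=0.0, v_optimal=0.4, v_fictive=1.0))  # 0.4
```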

    Exact transmission moments in one-dimensional weak localization and single-parameter scaling

    We obtain, for the first time, expressions for the mean and the variance of the transmission coefficient of an Anderson chain in the weak localization regime, using exact expansions of the complex transmission and reflection coefficients to fourth order in the weakly disordered site energies. These results confirm the validity of single-parameter scaling theory in a domain where the higher transmission cumulants may be neglected. We compare our results with earlier results for transmission cumulants in the weak localization domain based on the phase randomization hypothesis.
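    The paper's results are analytic, but the quantities involved are easy to probe numerically. The sketch below estimates the mean and variance of the transmission coefficient of a 1D Anderson chain by the standard transfer-matrix method; the energy, disorder strength, and chain length are arbitrary choices meant to sit in the weak localization regime, not parameters taken from the paper.

```python
# Numerical sketch (not the paper's expansion): transmission coefficient of a
# 1D Anderson chain, psi_{n+1} + psi_{n-1} + eps_n psi_n = E psi_n, embedded
# between perfect leads with dispersion E = 2 cos k, via transfer matrices.
import numpy as np

rng = np.random.default_rng(0)

def transmission(eps: np.ndarray, E: float) -> float:
    """Return |t|^2 for site energies eps in the scattering region."""
    L = len(eps)
    k = np.arccos(E / 2.0)                       # lead wave number
    P = np.eye(2)
    for e in eps:                                # total transfer matrix
        P = np.array([[E - e, -1.0], [1.0, 0.0]]) @ P
    # Match psi_n = e^{ikn} + r e^{-ikn} (left) to psi_n = t e^{ikn} (right):
    eik = np.exp(1j * k)
    A = np.array([[eik ** (L + 1), -(P[0, 0] / eik + P[0, 1])],
                  [eik ** L,       -(P[1, 0] / eik + P[1, 1])]])
    b = np.array([P[0, 0] * eik + P[0, 1], P[1, 0] * eik + P[1, 1]])
    t, _ = np.linalg.solve(A, b)                 # solve for (t, r)
    return abs(t) ** 2

E, W, L, samples = 1.0, 0.2, 200, 2000           # weak disorder, short chain
T = [transmission(rng.uniform(-W / 2, W / 2, L), E) for _ in range(samples)]
print(f"mean T = {np.mean(T):.4f}, var T = {np.var(T):.2e}")
```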

    Agnostic notes on regression adjustments to experimental data: Reexamining Freedman's critique

    Freedman [Adv. in Appl. Math. 40 (2008) 180-193; Ann. Appl. Stat. 2 (2008) 176-196] critiqued ordinary least squares regression adjustment of estimated treatment effects in randomized experiments, using Neyman's model for randomization inference. Contrary to conventional wisdom, he argued that adjustment can lead to worsened asymptotic precision, invalid measures of precision, and small-sample bias. This paper shows that in sufficiently large samples, those problems are either minor or easily fixed. OLS adjustment cannot hurt asymptotic precision when a full set of treatment-covariate interactions is included. Asymptotically valid confidence intervals can be constructed with the Huber-White sandwich standard error estimator. Checks on the asymptotic approximations are illustrated with data from Angrist, Lang, and Oreopoulos's [Am. Econ. J.: Appl. Econ. 1:1 (2009) 136-163] evaluation of strategies to improve college students' achievement. The strongest reasons to support Freedman's preference for unadjusted estimates are transparency and the dangers of specification search.
    Comment: Published in the Annals of Applied Statistics (http://www.imstat.org/aoas/) by the Institute of Mathematical Statistics (http://www.imstat.org/); http://dx.doi.org/10.1214/12-AOAS583
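    The recommended estimator is straightforward to set up. As a minimal sketch on simulated data (the data-generating process below is invented, not the Angrist-Lang-Oreopoulos dataset), this regresses the outcome on treatment, a centered covariate, and their interaction, and reports a Huber-White sandwich standard error:

```python
# OLS adjustment with a full set of treatment-by-covariate interactions and
# a Huber-White (sandwich) standard error, on simulated experimental data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 1000
x = rng.normal(size=n)                    # pre-treatment covariate
d = rng.binomial(1, 0.5, size=n)          # randomized treatment assignment
y = 1.0 + 0.5 * d + 0.8 * x + 0.4 * d * x + rng.normal(size=n)

xc = x - x.mean()                         # centering makes the coefficient
X = sm.add_constant(np.column_stack([d, xc, d * xc]))  # on d the ATE estimate
fit = sm.OLS(y, X).fit(cov_type="HC2")    # sandwich variance estimator

print("adjusted ATE estimate:", round(fit.params[1], 3))
print("sandwich (HC2) std. error:", round(fit.bse[1], 3))
```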

    On Integration Methods Based on Scrambled Nets of Arbitrary Size

    We consider the problem of evaluating $I(\varphi) := \int_{[0,1)^s} \varphi(x)\,dx$ for a function $\varphi \in L^2[0,1)^s$. In situations where $I(\varphi)$ can be approximated by an estimate of the form $N^{-1}\sum_{n=0}^{N-1}\varphi(x^n)$, with $\{x^n\}_{n=0}^{N-1}$ a point set in $[0,1)^s$, it is now well known that the $O_P(N^{-1/2})$ Monte Carlo convergence rate can be improved by taking for $\{x^n\}_{n=0}^{N-1}$ the first $N = \lambda b^m$ points, $\lambda \in \{1,\dots,b-1\}$, of a scrambled $(t,s)$-sequence in base $b \geq 2$. In this paper we derive a bound for the variance of scrambled net quadrature rules which is of order $o(N^{-1})$ without any restriction on $N$. As a corollary, this bound allows us to provide simple conditions to get, for any pattern of $N$, an integration error of size $o_P(N^{-1/2})$ for functions that depend on the quadrature size $N$. Notably, we establish that sequential quasi-Monte Carlo (M. Gerber and N. Chopin, 2015, J. R. Statist. Soc. B, to appear) reaches the $o_P(N^{-1/2})$ convergence rate for any value of $N$. In a numerical study, we show that for scrambled net quadrature rules we can relax the constraint on $N$ without any loss of efficiency when the integrand $\varphi$ is a discontinuous function, while for sequential quasi-Monte Carlo taking $N = \lambda b^m$ may only provide moderate gains.
    Comment: 27 pages, 2 figures (final version, to appear in The Journal of Complexity)
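    A quick way to see the setting in practice: SciPy's scrambled Sobol' generator is a scrambled $(t,s)$-sequence in base $b = 2$, and it can be truncated at an arbitrary $N$, exactly the regime the variance bound covers. The sketch below is illustrative only; the integrand and sample sizes are arbitrary choices, not those of the paper's numerical study.

```python
# Estimating I(phi) with a scrambled Sobol' net truncated at an arbitrary N.
import numpy as np
from scipy.stats import qmc

s = 4
phi = lambda x: np.prod(1.0 + (x - 0.5), axis=1)   # exact integral is 1

def scrambled_net_estimate(N: int, seed: int) -> float:
    sampler = qmc.Sobol(d=s, scramble=True, seed=seed)
    x = sampler.random(N)          # N need not be of the form lambda * 2^m
    return phi(x).mean()           # (SciPy warns for such N, but still works)

N = 1000                           # not of the form lambda * 2^m
estimates = [scrambled_net_estimate(N, seed) for seed in range(50)]
print(f"mean estimate: {np.mean(estimates):.5f} (exact: 1)")
print(f"variance across scramblings: {np.var(estimates):.2e}")
```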

    Alternate Sampling Methods for Estimating Multivariate Normal Probabilities

    We study the performance of alternative sampling methods for estimating multivariate normal probabilities through the GHK simulator. The sampling methods are randomized versions of some quasi-Monte Carlo samples (Halton, Niederreiter, and Niederreiter-Xing sequences and lattice points) and some samples based on orthogonal arrays (Latin hypercube, orthogonal array, and orthogonal-array-based Latin hypercube samples). In general, these samples turn out to perform better than Monte Carlo and antithetic Monte Carlo samples. The improvements are large for the low-dimensional cases (4 and 10) and still significant for dimensions as large as 50.
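    The GHK simulator itself is a standard importance-sampling recursion, and any of the sampling schemes above can drive it simply by swapping the source of uniforms. A minimal sketch follows; the covariance matrix and bounds are invented for illustration.

```python
# GHK estimate of P(a < X < b) for X ~ N(0, Sigma), driven by an arbitrary
# point set u in [0,1)^s: plain Monte Carlo, scrambled Sobol', etc.
import numpy as np
from scipy.stats import norm, qmc

def ghk(a, b, Sigma, u):
    C = np.linalg.cholesky(Sigma)              # X = C @ eta, eta ~ N(0, I)
    N, s = u.shape
    eta = np.zeros((N, s))
    weight = np.ones(N)
    for i in range(s):
        drift = eta[:, :i] @ C[i, :i]          # sum_{j<i} C_ij * eta_j
        lo = norm.cdf((a[i] - drift) / C[i, i])
        hi = norm.cdf((b[i] - drift) / C[i, i])
        weight *= hi - lo                      # mass of the feasible slab
        eta[:, i] = norm.ppf(lo + u[:, i] * (hi - lo))  # truncated-normal draw
    return weight.mean()

a, b = np.full(3, -1.0), np.full(3, 1.0)
Sigma = np.array([[1.0, 0.5, 0.3], [0.5, 1.0, 0.5], [0.3, 0.5, 1.0]])
rng = np.random.default_rng(1)
print("MC :", ghk(a, b, Sigma, rng.random((4096, 3))))
print("QMC:", ghk(a, b, Sigma, qmc.Sobol(d=3, scramble=True, seed=1).random(4096)))
```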

    Generating ambiguity in the laboratory

    This article develops a method for drawing samples from which it is impossible to infer any quantile or moment of the underlying distribution. The method provides researchers with a way to give subjects the experience of ambiguity. In any experiment, learning the distribution from experience is impossible for the subjects, essentially because it is impossible for the experimenter. We describe our method mathematically, illustrate it in simulations, and then test it in a laboratory experiment. Our technique does not withhold sampling information, does not assume that the subject is incapable of making statistical inferences, is replicable across experiments, and requires no special apparatus. We compare our method to the techniques used in related experiments that attempt to produce an ambiguous experience for the subjects.
    Keywords: ambiguity; Ellsberg; Knightian uncertainty; laboratory experiments; ignorance; vagueness. JEL classifications: C90; C91; C92; D80; D81