The Falling Factorial Basis and Its Statistical Applications
We study a novel spline-like basis, which we name the "falling factorial
basis", bearing many similarities to the classic truncated power basis. The
advantage of the falling factorial basis is that it enables rapid, linear-time
computations in basis matrix multiplication and basis matrix inversion. The
falling factorial functions are not actually splines, but are close enough to
splines that they provably retain some of the favorable properties of the
latter functions. We examine their application in two problems: trend filtering
over arbitrary input points, and a higher-order variant of the two-sample
Kolmogorov-Smirnov test. Comment: Full version of the ICML paper with the same title.
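The linear-time claim can be illustrated in a minimal sketch. For order k = 0, the falling factorial basis at sorted inputs reduces to step functions, so the basis matrix is lower-triangular ones and multiplying it by a coefficient vector is just a cumulative sum, O(n) instead of the O(n^2) dense product. The function name and the k = 0 simplification are illustrative assumptions, not the paper's full construction.

```python
import numpy as np

def falling_factorial_matvec_k0(theta):
    """Linear-time H @ theta for the order-0 falling factorial basis,
    where H is the lower-triangular matrix of ones (a cumulative sum)."""
    return np.cumsum(theta)

n = 6
theta = np.arange(1.0, n + 1)
H = np.tril(np.ones((n, n)))               # dense basis matrix, for checking
dense = H @ theta                          # O(n^2) multiplication
fast = falling_factorial_matvec_k0(theta)  # O(n) multiplication
assert np.allclose(dense, fast)
```

Higher orders work analogously, with k + 1 nested cumulative sums and pointwise scalings in place of the single cumsum.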
Brillinger mixing of determinantal point processes and statistical applications
Stationary determinantal point processes are proved to be Brillinger mixing.
This property is an important step towards asymptotic statistics for these
processes. As an important example, a central limit theorem for a wide class of
functionals of determinantal point processes is established. This result yields
in particular the asymptotic normality of the estimator of the intensity of a
stationary determinantal point process and of the kernel estimator of its pair
correlation function.
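The intensity estimator whose asymptotic normality the abstract establishes is the natural one: the number of observed points divided by the window area. A minimal sketch, using a homogeneous Poisson sample as a stand-in point pattern (simulating a determinantal process is considerably harder; all names and values here are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

true_intensity = 50.0   # points per unit area
side = 10.0             # observation window W = [0, side]^2, so |W| = side**2

# Stand-in stationary point pattern: a homogeneous Poisson count in W.
n_points = rng.poisson(true_intensity * side**2)

# Natural (unbiased) intensity estimator: N(W) / |W|.
intensity_hat = n_points / side**2
```

The central limit theorem in the paper is what justifies treating `intensity_hat` as approximately normal around the true intensity as the window grows.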
Information Compression, Intelligence, Computing, and Mathematics
This paper presents evidence for the idea that much of artificial
intelligence, human perception and cognition, mainstream computing, and
mathematics, may be understood as compression of information via the matching
and unification of patterns. This is the basis for the "SP theory of
intelligence", outlined in the paper and fully described elsewhere. Relevant
evidence may be seen: in empirical support for the SP theory; in some
advantages of information compression (IC) in terms of biology and engineering;
in our use of shorthands and ordinary words in language; in how we merge
successive views of any one thing; in visual recognition; in binocular vision;
in visual adaptation; in how we learn lexical and grammatical structures in
language; and in perceptual constancies. IC via the matching and unification of
patterns may be seen in both computing and mathematics: in IC via equations; in
the matching and unification of names; in the reduction or removal of
redundancy from unary numbers; in the workings of Post's Canonical System and
the transition function in the Universal Turing Machine; in the way computers
retrieve information from memory; in systems like Prolog; and in the
query-by-example technique for information retrieval. The chunking-with-codes
technique for IC may be seen in the use of named functions to avoid repetition
of computer code. The schema-plus-correction technique may be seen in functions
with parameters and in the use of classes in object-oriented programming. And
the run-length coding technique may be seen in multiplication, in division, and
in several other devices in mathematics and computing. The SP theory resolves
the apparent paradox of "decompression by compression". And computing and
cognition as IC is compatible with the uses of redundancy in such things as
backup copies to safeguard data and understanding speech in a noisy
environment.
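Of the compression techniques the abstract names, run-length coding is the simplest to show concretely: a repeated symbol is replaced by one copy plus a count, just as 3 × a abbreviates a + a + a. A minimal sketch (the function names are illustrative, not from the SP literature):

```python
from itertools import groupby

def rle_encode(s):
    """Run-length coding: collapse each run of a repeated symbol
    into a (symbol, count) pair."""
    return [(ch, len(list(run))) for ch, run in groupby(s)]

def rle_decode(pairs):
    """Decompression: expand each (symbol, count) pair back into a run."""
    return "".join(ch * n for ch, n in pairs)

encoded = rle_encode("aaabbc")
assert encoded == [("a", 3), ("b", 2), ("c", 1)]
assert rle_decode(encoded) == "aaabbc"
```

The round trip also illustrates the paper's "decompression by compression" point: the same matching-and-unification machinery describes both directions.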
Experimental designs for environmental valuation with choice-experiments: A Monte Carlo investigation
We review the practice of experimental design in the environmental economics literature concerned with choice experiments. We then contrast this with advances in the field of experimental design and present a comparison of statistical efficiency across four different experimental designs evaluated by Monte Carlo experiments. Two situations are envisaged: in the first, the multinomial logit specification used to derive the design is correctly known a priori; in the second, it is misspecified. The data generating process is based on estimates from data of a real choice experiment in which preferences for rural landscape attributes were studied. Results indicate that D-optimal designs are promising, especially those based on Bayesian algorithms with informative priors. However, if good a priori information is lacking, and if there is strong uncertainty about the real data generating process - conditions which are quite common in environmental valuation - then practitioners might be better off with conventional fractional designs from linear models. Under misspecification, a design of this type produces less biased estimates than its competitors.
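The D-optimality criterion behind the comparison can be sketched in a few lines: a design matrix X is more statistically efficient when det(X'X) is larger, i.e. when the D-error det((X'X)^-1)^(1/K) is smaller. The toy designs below are illustrative assumptions for a linear model, not the paper's choice-experiment designs:

```python
import numpy as np

def d_error(X):
    """D-error of a design matrix X with K columns:
    det((X'X)^-1)^(1/K); smaller is better."""
    K = X.shape[1]
    return np.linalg.det(np.linalg.inv(X.T @ X)) ** (1.0 / K)

# An orthogonal two-attribute design versus a partly collinear one.
orthogonal = np.array([[1, 1], [1, -1], [-1, 1], [-1, -1]], float)
collinear = np.array([[1, 1], [1, 1], [-1, -1], [1, -1]], float)

assert d_error(orthogonal) < d_error(collinear)
```

For multinomial logit designs the information matrix depends on the unknown parameters, which is exactly why the paper's Bayesian designs with informative priors help, and why misspecified priors hurt.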
CLASSIFYING HIGH-TECH NEW VENTURES BY PERFORMANCE: THE MARKET-TECHNOLOGICAL-ENTREPRENEURIAL MATRIX
This is an exploratory insight into the profile and the growth and success prospects of one category of firms, known as "New Technology Based Firms" (NTBFs): the so-called high-tech and innovative new ventures. With this study we aim to furnish a new methodological tool for positioning any firm characterised by being relatively recent and specialising in high-tech fields or, at least, in activities with large scope for innovation. We thus intend to make a methodological contribution to theory in the entrepreneurship field through an empirical exercise. Analysis of our empirical data leads us to a new matrix we call the Market-Technology-Entrepreneurial Matrix, whose 8 three-dimensional quadrants serve to classify high-tech new ventures by performance. A factorial analysis coupled with a discriminant analysis are the statistical tools employed in obtaining the M-T-E Matrix and incorporating predictive capacity into it.
high-tech, performance, matrix.
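The "8 three-dimensional quadrants" follow directly from the geometry: with one score per dimension (market, technology, entrepreneur), the sign pattern of the three scores places a venture in one of 2^3 = 8 octants. A minimal sketch of that classification step; the function name and sign convention are illustrative assumptions, not the paper's fitted factor scores:

```python
def mte_quadrant(market, technology, entrepreneur):
    """Map three dimension scores to one of the 8 octants of the
    M-T-E Matrix, encoded as a triple of strength flags."""
    return tuple(score >= 0 for score in (market, technology, entrepreneur))

# Every sign combination of the three scores yields a distinct octant.
octants = {mte_quadrant(m, t, e)
           for m in (-1, 1) for t in (-1, 1) for e in (-1, 1)}
assert len(octants) == 8
```

In the paper the scores come from a factorial analysis, and the discriminant analysis is what gives the assignment its predictive capacity.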
Quantum Statistical Field Theory and Combinatorics
This is a set of review notes on combinatorial aspects of Bosonic quantum
field theory. We collect together several related issues concerning moments of
distributions, moments of stochastic processes and Ito's formula, and Green's
functions and cumulant moments in quantum field theory. Comment: 50 pages, several figures; extended notes with updated references.
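The moment-cumulant relationship these notes survey can be made concrete with the standard recursion m_n = sum_{k=1}^{n} C(n-1, k-1) c_k m_{n-k} (with m_0 = 1), inverted here to recover cumulants from moments. This is a generic probabilistic sketch, not the notes' field-theoretic formulation:

```python
from math import comb

def cumulants_from_moments(m):
    """Given moments m = [m_0, m_1, ..., m_N] with m_0 == 1, return the
    cumulants [c_1, ..., c_N] by inverting the recursion
    m_n = sum_{k=1..n} C(n-1, k-1) * c_k * m_{n-k}."""
    c = []
    for n in range(1, len(m)):
        acc = m[n]
        for k in range(1, n):
            acc -= comb(n - 1, k - 1) * c[k - 1] * m[n - k]
        c.append(acc)
    return c

# Standard normal: moments 1, 0, 1, 0, 3 give cumulants 0, 1, 0, 0 --
# all cumulants beyond the variance vanish.
assert cumulants_from_moments([1, 0, 1, 0, 3]) == [0, 1, 0, 0]
```

The vanishing of higher cumulants for Gaussian distributions is the combinatorial shadow of Wick's theorem, which is why cumulant moments pair naturally with connected Green's functions in the Bosonic setting.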