78 research outputs found
Unbounded Utility for Savage's "Foundations of Statistics," and Other Models
A general procedure for extending finite-dimensional "additive-like" representations of binary relations to infinite-dimensional "integral-like" representations is developed by means of a condition called truncation-continuity. The restriction of boundedness of utility, imposed throughout the literature, can now be dispensed with; for instance, normal distributions, or any other distribution with a finite first moment, can be incorporated. Classical representation results for expected utility, such as those of Savage (1954), von Neumann and Morgenstern (1944), Anscombe and Aumann (1963), de Finetti (1937), and many others, can now be extended. The results are generalized to Schmeidler's (1989) approach with nonadditive measures and Choquet integrals, and to Quiggin's (1982) rank-dependent utility. The different approaches are brought together in this paper to bring to the fore the unity in the extension process.
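As a schematic, hedged illustration of the truncation idea (the notation below is ours, not taken from the paper): truncating an unbounded utility at level k and requiring the truncated expectations to converge is what lets unbounded utilities be accommodated.

```latex
% Truncation of an unbounded utility u at level k (illustrative notation):
u^{k}(x) = \max\{\min\{u(x),\,k\},\,-k\}.
% Truncation-continuity, schematically: the value of a prospect P is
% recovered as the limit of its truncated expected utilities,
\int u \, dP = \lim_{k \to \infty} \int u^{k} \, dP,
% which is finite when, e.g., P is a normal distribution and u grows
% linearly, so that \int |u| \, dP < \infty although u is unbounded.
```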
Inference on a Distribution from Noisy Draws
We consider a situation where the distribution of a random variable is being estimated by the empirical distribution of noisy measurements of that variable. This is common practice in, for example, teacher value-added models and other fixed-effect models for panel data. We use an asymptotic embedding where the noise shrinks with the sample size to calculate the leading bias in the empirical distribution arising from the presence of noise. The leading bias in the empirical quantile function is obtained in the same way. These calculations are new to the literature, where only results on smooth functionals such as the mean and variance have been derived. Given a closed-form expression for the bias, bias-corrected estimators of the distribution function and quantile function can be constructed. We provide both analytical and jackknife corrections that recenter the limit distribution and yield confidence intervals with correct coverage in large samples. These corrections are nonparametric and easy to implement. Our approach can be connected to corrections for selection bias and shrinkage estimation and is to be contrasted with deconvolution. Simulation results confirm the much-improved sampling behavior of the corrected estimators.
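A minimal sketch of the small-noise bias and an analytical correction of the kind described (the variable names and the assumption of a known noise variance are ours, not the paper's): for mean-zero noise with variance σ², a second-order Taylor expansion gives E[1{θ+ε ≤ u}] ≈ F(u) + (σ²/2) f′(u), so subtracting an estimate of (σ²/2) f′(u) recenters the empirical CDF.

```python
import numpy as np

rng = np.random.default_rng(0)
n, sigma = 100_000, 0.5                 # sigma: noise s.d., assumed known here
theta = rng.normal(0.0, 1.0, n)         # latent effects (unobserved)
y = theta + rng.normal(0.0, sigma, n)   # noisy draws (observed)

u, h = 1.0, 0.3                         # evaluation point and kernel bandwidth
F_noisy = np.mean(y <= u)               # biased empirical CDF at u

# Gaussian-kernel estimate of the density derivative f'(u) from the noisy
# draws (to leading order, the noisy and latent density derivatives agree)
z = (u - y) / h
f_prime = np.mean(-z * np.exp(-0.5 * z**2) / np.sqrt(2.0 * np.pi)) / h**2

# analytically bias-corrected CDF estimate
F_corrected = F_noisy - 0.5 * sigma**2 * f_prime
```

Here the latent distribution is standard normal, so the target value is Φ(1) ≈ 0.8413; the uncorrected estimate sits visibly below it, and the correction removes most of that gap.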
First-Principles Models for van der Waals Interactions in Molecules and Materials: Concepts, Theory, and Applications
Noncovalent van der Waals (vdW) or dispersion forces are ubiquitous in nature and influence the structure, stability, dynamics, and function of molecules and materials throughout chemistry, biology, physics, and materials science. These forces are quantum mechanical in origin and arise from electrostatic interactions between fluctuations in the electronic charge density. Here, we explore the conceptual and mathematical ingredients required for an exact treatment of vdW interactions, and present a systematic and unified framework for classifying the current first-principles vdW methods based on the adiabatic-connection fluctuation-dissipation (ACFD) theorem (namely the Rutgers-Chalmers vdW-DF, Vydrov-Van Voorhis (VV), exchange-hole dipole moment (XDM), Tkatchenko-Scheffler (TS), many-body dispersion (MBD), and random-phase approximation (RPA) approaches). Particular attention is paid to the intriguing nature of many-body vdW interactions, whose fundamental relevance has recently been highlighted in several landmark experiments. The performance of these models in predicting binding energetics as well as structural, electronic, and thermodynamic properties is connected with the theoretical concepts, providing a numerical summary of the state of the art in the field. We conclude with a roadmap of the conceptual, methodological, practical, and numerical challenges that remain in obtaining a universally applicable and truly predictive vdW method for realistic molecular systems and materials.
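A rough, hedged illustration of the simplest rung of such methods, a pairwise-additive damped -C6/R⁶ correction (the parameter values and the geometric-mean combination rule below are our illustrative choices, not those of any specific published scheme):

```python
import numpy as np

def pairwise_dispersion_energy(coords, c6, r0, d=20.0, s_r=0.94):
    """Sum of damped -C6/R^6 pair terms (illustrative TS-style form).

    coords : (N, 3) atomic positions
    c6     : (N,) per-atom C6 coefficients (arbitrary units)
    r0     : (N,) per-atom vdW radii
    d, s_r : damping steepness and radius scaling (illustrative values)
    """
    coords = np.asarray(coords, float)
    e = 0.0
    n = len(coords)
    for i in range(n):
        for j in range(i + 1, n):
            r = np.linalg.norm(coords[i] - coords[j])
            c6_ij = np.sqrt(c6[i] * c6[j])       # simple combination rule
            r0_ij = r0[i] + r0[j]
            # Fermi-type short-range damping: ~0 at small r, ~1 at large r
            f = 1.0 / (1.0 + np.exp(-d * (r / (s_r * r0_ij) - 1.0)))
            e -= f * c6_ij / r**6                # damped London term
    return e
```

At large separation the damping factor tends to one and each pair contributes the bare -C6/R⁶ London form; many-body schemes such as MBD and the RPA go beyond this pairwise sum, which is precisely the distinction the review draws.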
Nonparametric estimation of homothetic and homothetically separable functions
For vectors x and w, let r(x,w) be a function that can be nonparametrically estimated consistently and asymptotically normally. We provide consistent, asymptotically normal estimators for the functions g and h, where r(x,w) = h[g(x),w], g is linearly homogeneous and h is monotonic in g. This framework encompasses homothetic and homothetically separable functions. Such models reduce the curse of dimensionality, provide a natural generalization of linear index models, and are widely used in utility, production, and cost function applications. Extensions to related functional forms include a generalized partly linear model with unknown link function. We provide simulation evidence on the small sample performance of our estimator, and we apply our method to a Chinese production dataset.
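A hedged numerical illustration of the structure assumed (the example functions are ours, not the paper's): g is linearly homogeneous, meaning g(λx) = λg(x), and r(x, w) = h[g(x), w] with h monotonic in its first argument.

```python
import numpy as np

def g(x):
    """Cobb-Douglas aggregate: linearly homogeneous, g(lam*x) = lam*g(x)."""
    x = np.asarray(x, float)
    return np.sqrt(x[0] * x[1])

def h(gval, w):
    """Strictly increasing in gval for w > 0 (monotonicity in g)."""
    return w * np.log1p(gval)

def r(x, w):
    """The homothetically separable structure r(x, w) = h[g(x), w]."""
    return h(g(x), w)
```

Scaling the input vector x by λ scales the index g(x) by exactly λ while r responds only through h, which is the separation the estimators recover nonparametrically.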
Reinforcement Learning Methods for Conic Finance
Conic finance is a world of two prices, a more grounded reality than the theory of one price. This world, however, is constructed by considering nonadditive expectations of risks or value functions. That makes some optimization algorithms incompatible with this universe, if not infeasible. This is most evident in the application of reinforcement learning algorithms, where the underlying principles of TD learning and the Bellman equation rest on the additivity of value functions. Hence, the task undertaken here is to mold recent advances in distributional reinforcement learning to be conducive to learning under nonadditive dynamics. Algorithms for discrete and continuous actions are described and illustrated on sample problems in finance.
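A minimal sketch of the two-price idea via a distorted (nonadditive) expectation: the MINMAXVAR distortion is one standard choice in the conic-finance literature, while the discrete approximation of the Choquet integral below is our illustration.

```python
import numpy as np

def minmaxvar(u, gamma):
    """MINMAXVAR concave distortion; gamma = 0 reduces to the identity."""
    return 1.0 - (1.0 - u ** (1.0 / (1.0 + gamma))) ** (1.0 + gamma)

def bid_price(payoffs, gamma):
    """Distorted expectation: Choquet integral over the empirical law.

    Sorting payoffs ascending and weighting by increments of the concave
    distortion overweights bad outcomes, giving a conservative (bid) price.
    """
    x = np.sort(np.asarray(payoffs, dtype=float))
    n = x.size
    u = np.arange(n + 1) / n
    w = minmaxvar(u[1:], gamma) - minmaxvar(u[:-1], gamma)
    return float(np.dot(w, x))

def ask_price(payoffs, gamma):
    """Ask price as the negative bid of the negated payoff."""
    return -bid_price(-np.asarray(payoffs, dtype=float), gamma)
```

For any stress level gamma > 0 the bid sits below the ordinary mean and the ask above it, and the gap collapses to the one-price mean as gamma goes to zero; it is exactly this nonadditive expectation that breaks the additivity TD learning relies on.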
The von Neumann/Morgenstern approach to ambiguity
Dumav M, Stinchcombe MB. The von Neumann/Morgenstern approach to ambiguity. Center for Mathematical Economics Working Papers. Vol 480. Bielefeld: Center for Mathematical Economics; 2013.
A choice problem is risky (respectively, ambiguous) if the decision maker is choosing between probability distributions (respectively, sets of probability distributions) over utility-relevant consequences. We provide an axiomatic foundation for, and a representation of, continuous linear preferences over sets of probabilities on consequences. The representation theory delivers: first- and second-order dominance for ambiguous problems; a utility-interval-based dominance relation that distinguishes between sources of uncertainty; a complete theory of updating convex sets of priors; a Bayesian theory of the value of ambiguous information structures; complete separations of attitudes toward risk and ambiguity; and new classes of preferences that allow decreasing relative ambiguity aversion and thereby rationalize recent challenges to many of the extant multiple-prior models of ambiguity aversion. We also characterize a property of sets of priors, descriptive completeness, that resolves several open problems and allows multiple-prior models to model as large a class of problems as the continuous linear preferences presented here.
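A hedged numerical illustration of one ingredient, the utility interval induced by a set of priors (a toy example of ours; since expected utility is linear in the prior, the extremes over the convex hull of finitely many priors are attained at the generating priors themselves):

```python
import numpy as np

# Utilities of three utility-relevant consequences (illustrative numbers)
utilities = np.array([0.0, 1.0, 4.0])

# A set of priors over the consequences (each row is a probability vector);
# the convex hull of the rows is the ambiguous belief set.
priors = np.array([
    [0.6, 0.3, 0.1],
    [0.2, 0.5, 0.3],
    [0.1, 0.2, 0.7],
])

expected = priors @ utilities                 # expected utility per prior
interval = (expected.min(), expected.max())   # utility interval of the act
```

Comparing such intervals across acts gives one natural dominance relation for ambiguous problems; the paper's representation delivers considerably richer comparisons than this interval alone.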