
    Consumer Demand under Price Uncertainty: Empirical Evidence from the Market for Cigarettes

    The goal of this paper is to analyze consumer demand in markets with large price uncertainty. We develop a demand model for goods that are subject to habit formation. We show that consumption plans of forward-looking individuals depend not only on preferences and current-period prices, but also on individual beliefs about the evolution of future prices. Moreover, a mean-preserving spread in the price distribution, and hence an increase in price uncertainty, reduces consumption along the optimal path. With smoking as our application, we test the predictions of our model. We use a unique data set of cigarette prices collected by the Bureau of Labor Statistics to characterize the price uncertainty and price expectations of individuals. We have also obtained access to the restricted-use version of the National Education Longitudinal Study, which provides detailed information on the smoking behavior of teenagers in the U.S. Our estimation results suggest that teenagers who live in metropolitan areas with high cigarette price volatility have, on average, significantly lower levels of cigarette consumption. Moreover, these individuals are less likely to start consuming cigarettes. Our results also provide evidence that young individuals are forward looking: myopic individuals would not respond to an increase in uncertainty about future prices by reducing consumption.
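
    To make the mechanism concrete, a stylized version of the dynamic problem described above can be written as a Bellman equation. This is a sketch with assumed notation, not the paper's exact specification: S_t denotes the habit stock, c_t cigarette consumption, p_t the current price, and F(p_{t+1} | p_t) the individual's beliefs about future prices.

        V(S_t, p_t) = \max_{c_t \ge 0}\; u(c_t, S_t) - p_t c_t
                      + \beta \int V\big((1-\delta) S_t + c_t,\ p_{t+1}\big)\, dF(p_{t+1} \mid p_t),
        \qquad S_{t+1} = (1-\delta) S_t + c_t

    In this stylized setup, today's consumption raises the habit stock and commits the consumer to future purchases at uncertain prices; the paper's result is that a mean-preserving spread of the price distribution then lowers consumption along the optimal path, whereas a myopic consumer (effectively \beta = 0) ignores F entirely and would not react.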

    Breaking the Curse of Dimensionality

    This paper proposes a new nonparametric estimator for general regression functions with multiple regressors. The method used here is motivated by a remarkable result derived by Kolmogorov (1957) and later tightened by Lorentz (1966). In short, any continuous function f(x_1, ..., x_d) has the representation G[a_1 P_1(x_1) + ... + a_d P_1(x_d)] + ... + G[a_1 P_m(x_1) + ... + a_d P_m(x_d)], with m = 2d+1, where G(.) is a continuous function, each P_k(.), k = 1, ..., 2d+1, is Lipschitz of order one and strictly increasing, and the a_j, j = 1, ..., d, are constants. Generalizing this result, we propose the estimator g_1[a_{1,1} p_1(x_1) + ... + a_{d,1} p_1(x_d)] + ... + g_m[a_{1,m} p_m(x_1) + ... + a_{d,m} p_m(x_d)], where both the g_k(.) and the p_k(.) are twice continuously differentiable. These functions are estimated using regression cubic B-splines, which have excellent numerical properties. This problem has previously been intractable because there existed no method for imposing monotonicity on the p_k(.)'s, a priori, such that the estimator is dense in the set of all monotonic cubic B-splines. We derive a method that requires only 2(r+1)+1 restrictions, where r is the number of interior knots. Rates of convergence in L_2 are the same as the optimal rate for the one-dimensional case. A simulation experiment shows that the estimator works well when optimization is performed using the back-fitting algorithm. The monotonicity restriction has many other applications besides the one presented here, such as estimating a demand function. With only r+2 more constraints, it is also possible to impose concavity.
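
    As a concrete illustration of the monotonicity ingredient, the sketch below fits a one-dimensional monotone cubic regression B-spline by constraining successive basis coefficients to be non-decreasing, a standard sufficient condition for a non-decreasing cubic spline. This is an assumed simplification for illustration, not the paper's 2(r+1)+1-restriction scheme, and all names in it are hypothetical.

        import numpy as np
        from scipy.interpolate import BSpline
        from scipy.optimize import lsq_linear

        def bspline_design(x, knots, k=3):
            # Evaluate every cubic B-spline basis function at the points x.
            n_basis = len(knots) - k - 1
            B = np.zeros((len(x), n_basis))
            for j in range(n_basis):
                coef = np.zeros(n_basis)
                coef[j] = 1.0
                B[:, j] = BSpline(knots, coef, k)(x)
            return B

        def fit_monotone_spline(x, y, n_interior=5, k=3):
            # Interior knots at quantiles; boundary knots repeated k+1 times.
            interior = np.quantile(x, np.linspace(0, 1, n_interior + 2)[1:-1])
            knots = np.concatenate([[x.min()] * (k + 1), interior,
                                    [x.max()] * (k + 1)])
            B = bspline_design(x, knots, k)
            n = B.shape[1]
            # Reparameterize c = L @ theta with theta[1:] >= 0; the spline
            # coefficients c are then non-decreasing, which forces the
            # fitted cubic spline itself to be non-decreasing.
            L = np.tril(np.ones((n, n)))
            lb = np.full(n, 0.0)
            lb[0] = -np.inf
            res = lsq_linear(B @ L, y, bounds=(lb, np.full(n, np.inf)))
            return BSpline(knots, L @ res.x, k)

        # Usage: recover a monotone regression function from noisy data.
        rng = np.random.default_rng(0)
        x = np.sort(rng.uniform(0.0, 1.0, 200))
        y = np.log1p(5.0 * x) + rng.normal(0.0, 0.1, 200)
        spline = fit_monotone_spline(x, y)

    In the paper's multivariate setting, monotone inner functions of this kind play the role of the p_k(.)'s inside the superposition, with back-fitting cycling over the g_k(.) and p_k(.) estimates.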

    Experimentation and Learning in Rational Addiction Models with Multiple Addictive Goods

    The purpose of this paper is to explore and evaluate smooth approximation methods for value functions. These approximation methods are increasingly important in numerical dynamic programming since they allow researchers to solve models with a multitude of continuous state variables. In this paper we focus on a new approximation method which has recently been developed in the context of semi-nonparametric estimation by Coppejans (1999). The basic idea of this approach is to represent a function of several variables as superpositions of functions of one variable. The one-dimensional functions, as well as the superpositions, are represented as B-splines, which have nice computational properties. This approach has two distinct advantages. First, it allows us to impose useful properties on the value function such as monotonicity and concavity. Second, and more importantly, it allows us to parameterize the value function by a fairly low-dimensional object, which alleviates the curse of dimensionality typically encountered in this type of problem. In order to evaluate this new method we compare it with more commonly used methods such as Chebyshev polynomials. The comparison of the two methods is based on a dynamic model of rational addiction under uncertainty. Orphanides and Zervos (1995) argue that uncertainty and learning through experimentation need to be incorporated into the rational addiction framework in order to account for 'involuntary' addiction. We extend their simple model to allow for wealth accumulation as well as uncertainty in income and asset returns. This gives rise to a rich dynamic model with five continuous state variables and hence provides a good testbed for the two approximation algorithms of interest.
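
    For readers unfamiliar with smooth value function approximation, the sketch below runs fitted value iteration on a one-dimensional consumption-savings problem, approximating V with a Chebyshev polynomial. It is a deliberately minimal stand-in: one state variable instead of the paper's five, log utility chosen for convenience, and every parameter value assumed.

        import numpy as np
        from numpy.polynomial import chebyshev as C

        beta, R = 0.95, 1.03    # discount factor, gross return (assumed)
        lo, hi = 0.1, 10.0      # wealth bounds for the approximation
        deg, n_nodes = 9, 20

        # Chebyshev nodes mapped from [-1, 1] into [lo, hi].
        z_nodes = np.cos(np.pi * (np.arange(n_nodes) + 0.5) / n_nodes)
        grid = (lo + hi) / 2 + (hi - lo) / 2 * z_nodes
        coef = np.zeros(deg + 1)    # initial guess: V = 0

        def V(w, coef):
            # Map wealth into [-1, 1] before evaluating the polynomial.
            return C.chebval(2 * (w - lo) / (hi - lo) - 1, coef)

        for _ in range(1000):
            Tv = np.empty_like(grid)
            for i, w in enumerate(grid):
                c = np.linspace(1e-3, w - 1e-3, 200)   # candidate consumption
                w_next = np.clip(R * (w - c), lo, hi)  # stay inside the domain
                Tv[i] = np.max(np.log(c) + beta * V(w_next, coef))
            new = C.chebfit(2 * (grid - lo) / (hi - lo) - 1, Tv, deg)
            if np.max(np.abs(new - coef)) < 1e-9:
                break
            coef = new

    In the paper's comparison, a Chebyshev fit of this kind is the benchmark; the alternative replaces it with the B-spline superposition representation, which also allows monotonicity and concavity to be imposed on V.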

    Radio and X-ray observations of the luminous fast blue optical transient AT 2020xnd

    We present deep X-ray and radio observations of the fast blue optical transient (FBOT) AT 2020xnd/ZTF 20acigmel at z = 0.2433 from 13 days to 269 days after explosion. AT 2020xnd belongs to the category of optically luminous FBOTs with similarities to the archetypal event AT 2018cow. AT 2020xnd shows luminous radio emission reaching L_ν ≈ 8 × 10^29 erg s^-1 Hz^-1 at 20 GHz and 75 days post-explosion, accompanied by luminous and rapidly fading soft X-ray emission peaking at L_X ≈ 6 × 10^42 erg s^-1. Interpreting the radio emission in the context of synchrotron radiation from the explosion's shock interaction with the environment, we find that AT 2020xnd launched a high-velocity outflow (v ∼ 0.1c–0.2c) propagating into a dense circumstellar medium (effective Ṁ ≈ 10^-3 M_⊙ yr^-1 for an assumed wind velocity of v_w = 1000 km s^-1). Similar to AT 2018cow, the detected X-ray emission is in excess of the extrapolated synchrotron spectrum and constitutes a different emission component, possibly powered by accretion onto a newly formed black hole or neutron star. These properties make AT 2020xnd a high-redshift analog to AT 2018cow and establish it as the fourth member of the class of optically luminous FBOTs with luminous multiwavelength counterparts.

    ngVLA Key Science Goal 5: Understanding the Formation and Evolution of Black Holes in the Era of Multi-Messenger Astronomy

    The next-generation Very Large Array (ngVLA) will be a powerful telescope for finding and studying black holes across the entire mass range. Its high-resolution imaging capabilities will allow the separation of low-luminosity black holes in the local Universe from background sources, thereby providing critical constraints on the mass function, formation, and growth of black holes. Its combination of sensitivity and angular resolution will provide new constraints on the physics of black hole accretion and jet formation. Combined with facilities across the spectrum and gravitational wave observatories, the ngVLA will provide crucial constraints on the interaction of black holes with their environments, with specific implications for the relationship between the evolution of galaxies and the emission of gravitational waves from in-spiraling supermassive black holes, and potential implications for stellar-mass and intermediate-mass black holes. The ngVLA will identify the radio counterparts to transient sources discovered by electromagnetic, gravitational wave, and neutrino observatories, and its high-resolution, fast-mapping capabilities will make it the preferred instrument for pinpointing electromagnetic counterparts to events such as supermassive black hole mergers. The National Radio Astronomy Observatory is a facility of the National Science Foundation operated under cooperative agreement by Associated Universities, Inc. Part of this research was carried out at the Jet Propulsion Laboratory, California Institute of Technology, under a contract with the National Aeronautics and Space Administration.

    Effective Nonparametric Estimation in the Case of Severely Discretized Data

    Almost all economic data sets are discretized or rounded to some extent. This paper proposes a regression estimator and a density estimator that work especially well when the data are very discrete. The estimators are weighted averages of the data, and the weights are composed of cubic B-splines. Unlike most nonparametric settings, where it is assumed that the observed data come from a continuum of possibilities, we base our work on the assumption that the discreteness becomes finer as the sample size increases. Rates of convergence and asymptotic distributional results are derived under this condition.

    Effective Nonparametric Estimation in the Case of Severely Discretized Data

    Often economic data are discretized or rounded to some extent. This paper proposes a regression estimator and a density estimator that work especially well when discretization causes conventional kernel-based estimators to behave poorly. The estimator proposed here is a weighted average of neighboring frequency estimators, and the weights are composed of cubic B-splines. Interestingly, we show that this estimator can have both a smaller bias and a smaller variance than frequency estimators. As a means to obtain asymptotic normality and rates of convergence, we assume that the discreteness becomes finer as the sample size increases.
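
    To illustrate the flavor of such an estimator (an assumed toy construction for intuition, not the paper's exact method): treat the rounded support points as a grid, compute raw frequencies, and average neighboring frequencies with cubic B-spline weights.

        import numpy as np

        rng = np.random.default_rng(1)
        data = np.round(rng.normal(0.0, 1.0, 2000), 1)   # rounded to 0.1
        pts, counts = np.unique(data, return_counts=True)
        freq = counts / counts.sum()         # raw frequency estimator

        def bspline_kernel(u):
            # Cubic cardinal B-spline: supported on [-2, 2], integrates to 1.
            a = np.abs(u)
            return np.where(a <= 1.0, 2.0 / 3.0 - a**2 + a**3 / 2.0,
                            np.where(a <= 2.0, (2.0 - a)**3 / 6.0, 0.0))

        def density(x0, h=0.3):
            # Weighted average of the frequencies at neighboring support
            # points, with cubic B-spline weights; h sets the neighborhood.
            return np.sum(freq * bspline_kernel((x0 - pts) / h)) / h

        xs = np.linspace(-3.0, 3.0, 61)
        fhat = np.array([density(x) for x in xs])   # smoothed density

    Averaging the frequencies rather than the raw observations is what lets a construction of this kind reduce both the bias and the variance of the naive frequency estimator on coarse grids.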