
    General Properties of Rational Stock-Market Fluctuations

    Which pricing kernel restrictions are needed to make low-dimensional Markov models consistent with given sets of predictions on aggregate stock-market fluctuations? This paper develops theoretical test conditions addressing this and related reverse-engineering issues arising within a fairly general class of long-lived asset pricing models. These conditions involve only the primitives of the economy (probabilistic descriptions of the world, information structures, and preferences). They thus remove some of the arbitrariness related to the specification of theoretical models involving unobserved variables, state-dependent preferences, and incomplete markets.
    Keywords: pricing kernel restrictions, convexity, equilibrium volatility
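
    To fix ideas, here is a minimal continuous-time sketch of the objects involved (an illustration, not the paper's own conditions). Suppose a single state variable y_t drives the economy, with dy_t = \mu(y_t) dt + \sigma_y(y_t) dW_t, and let S(y) denote the equilibrium stock price function. By Ito's lemma the instantaneous return volatility is

        \sigma_S(y) = S'(y) \sigma_y(y) / S(y),

    so predictions about equilibrium volatility translate into statements about the slope and convexity of S, which in turn are pinned down by restrictions on the pricing kernel and on the primitives (beliefs, information, preferences).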

    Fundamental Properties of Bond Prices in Models of the Short-Term Rate

    This paper develops restrictions that arbitrage-constrained bond prices impose on the short-term rate process in order to be consistent with given dynamic properties of the term-structure of interest rates. The central focus is the relationship between bond prices and the short-term rate volatility. In both scalar and multidimensional diffusion settings, typical relationships between bond prices and volatility are generated by joint restrictions on the risk-neutralized drift functions of the state variables and convexity of bond prices with respect to the short-term rate. The theory is illustrated by several examples and is partially extended to accommodate the occurrence of jumps and default.
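
    For reference, a textbook sketch of the setting (not the paper's specific test conditions): with a short-term rate following dr_t = b(r_t) dt + a(r_t) dW_t under the risk-neutral measure, the price P(r, \tau) of a zero-coupon bond with time to maturity \tau solves

        \partial P/\partial \tau = b(r) \partial P/\partial r + (1/2) a^2(r) \partial^2 P/\partial r^2 - r P,   with   P(r, 0) = 1.

    The restrictions discussed above concern how the risk-neutralized drift b(r) and the convexity term \partial^2 P/\partial r^2 jointly determine the relationship between bond prices and the short-term rate volatility a(r).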

    Repeated moral hazard and recursive Lagrangeans

    This paper shows how to solve dynamic agency models by extending recursive Lagrangean techniques à la Marcet and Marimon (2011) to problems with hidden actions. The method has many advantages over the promised-utilities approach (Abreu, Pearce and Stacchetti, 1990): it is a significant improvement in terms of simplicity, tractability, and computational speed. Solutions can easily be computed for hidden-action models with several endogenous state variables and several agents, whereas the promised-utilities approach becomes extremely difficult and computationally intensive even with just one state variable or two agents. Several numerical examples illustrate how this methodology outperforms the standard approach.
    Keywords: repeated moral hazard; collocation method; dynamic models with private information; recursive contracts
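
    The recursive-Lagrangean idea can be sketched in a stylized single-constraint case (the Marcet-Marimon setup; the paper's hidden-action models add further structure). For a problem max E_0 \sum_t \beta^t r(s_t, u_t) subject to forward-looking constraints E_t[\sum_{j \ge 0} \beta^j h_1(s_{t+j}, u_{t+j})] + h_0(s_t, u_t) \ge 0, attach multipliers \gamma_t and cumulate them as \mu_t = \mu_{t-1} + \gamma_t with \mu_{-1} = 0. The problem then becomes a saddle point of

        E_0 \sum_t \beta^t [ r(s_t, u_t) + \gamma_t h_0(s_t, u_t) + \mu_t h_1(s_t, u_t) ],

    which is recursive in the enlarged state (s_t, \mu_t) and can be solved with standard dynamic-programming or collocation methods, which is the source of the computational advantage claimed above.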

    Recovering the Probability Density Function of Asset Prices Using GARCH as Diffusion Approximations

    This paper uses GARCH models to estimate the objective and risk-neutral density functions of financial asset prices and, by comparing their shapes, to recover detailed information on economic agents' attitudes toward risk. It differs from recent papers investigating analogous issues in that it uses Nelson's (1990) result that GARCH schemes are approximations of the kind of differential equations typically employed in finance to describe the evolution of asset prices. This feature of GARCH schemes has usually been overshadowed by their well-known role as simple econometric tools providing reliable estimates of unobserved conditional variances. We show instead that the diffusion-approximation property of GARCH gives good results and can be extended to situations with i) non-standard distributions for the innovations of a conditional mean equation of asset price changes and ii) volatility concepts different from the variance. The objective PDF of the asset price is recovered from the estimation of a nonlinear GARCH model fitted to the historical path of the asset price. The risk-neutral PDF is extracted from cross-sections of bond option prices, after introducing a volatility risk premium function. The direct comparison of the shapes of the two PDFs reveals the price attached by economic agents to the different states of nature. Applications are carried out for futures written on the Italian 10-year bond.
    Keywords: option pricing, stochastic volatility, ARCH, volatility risk premium
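
    Nelson's diffusion-approximation result can be illustrated numerically. The sketch below (an illustration only; the particular scaling and parameter values are assumptions, not the paper's estimation code) simulates a GARCH(1,1)-type variance recursion with step-dependent coefficients alongside the Euler scheme of a candidate diffusion limit; the two updates are scaled so that each step has the same conditional mean and variance, which is what drives the weak convergence as the step size shrinks.

    ```python
    import numpy as np

    # Illustrative sketch (not the paper's estimation code): a GARCH(1,1)-type
    # variance recursion sampled at step h, scaled so that its one-step conditional
    # mean and variance match the Euler scheme of the candidate diffusion limit
    #     d v_t = (omega - theta * v_t) dt + alpha * v_t dW_t,
    # in the spirit of Nelson's (1990) approximation result.  All parameter values
    # below are hypothetical.

    rng = np.random.default_rng(0)
    omega, theta, alpha = 0.1, 2.0, 0.3     # long-run level omega/theta, mean reversion, vol-of-vol
    T, h = 5.0, 1.0 / 2000.0                # horizon and time step
    n = int(T / h)

    v_garch = np.empty(n + 1); v_garch[0] = omega / theta   # GARCH-type recursion
    v_euler = np.empty(n + 1); v_euler[0] = omega / theta   # Euler scheme of the SDE

    for k in range(n):
        z = rng.standard_normal()
        # (z**2 - 1)/sqrt(2) has mean 0 and unit variance, so both updates share
        # the same first two conditional moments over a step of length h
        v_garch[k + 1] = max(v_garch[k] + h * (omega - theta * v_garch[k])
                             + alpha * np.sqrt(h / 2) * v_garch[k] * (z**2 - 1), 1e-12)
        v_euler[k + 1] = max(v_euler[k] + h * (omega - theta * v_euler[k])
                             + alpha * np.sqrt(h) * v_euler[k] * rng.standard_normal(), 1e-12)

    print("sample means:", v_garch.mean(), v_euler.mean())  # both fluctuate around omega/theta = 0.05
    ```

    As h shrinks, the discrete GARCH-type path and the diffusion share the same limiting distribution, which is what justifies reading fitted conditional variances as estimates of the latent diffusion volatility.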

    A Simple Approach to the Estimation of Continuous Time CEV Stochastic Volatility Models of the Short-Term Rate

    The aim of this article is to assess the empirical performance of 'ARCH models as diffusion approximations' for models of the short-term rate with stochastic volatility. Our estimation strategy is based both on moment conditions needed to guarantee the convergence of the discrete-time models and on the quasi-indirect inference principle. Unlike previous literature, in which standard ARCH models approximate only specific diffusion models (those in which the variance of volatility is proportional to the square of volatility), our estimation strategy relies on ARCH models that approximate any CEV-diffusion model for volatility. A Monte Carlo study reveals that the filtering performance of these models is remarkably good, even in the presence of important misspecification. Finally, based on a natural substitute for a global specification test for just-identified problems designed within indirect inference methods, we provide strong empirical evidence that approximating diffusions with our models gives rise to a disaggregation bias that is not significant.
    Keywords: stochastic volatility, CEV-ARCH, indirect inference, yield curve
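
    For concreteness, the CEV family of volatility diffusions referred to above can be written (in one common parameterization, used here purely for illustration) as

        dv_t = \phi(\omega - v_t) dt + \xi v_t^\delta dW_t,

    where v_t is the volatility state. The case \delta = 1, in which the variance of volatility changes is proportional to the square of volatility, is the one matched by standard ARCH approximations; the estimation strategy described above targets arbitrary \delta within this family.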

    Introduction to Haar Measure Tools in Quantum Information: A Beginner's Tutorial

    The Haar measure plays a vital role in quantum information, but its study often requires a deep understanding of representation theory, posing a challenge for beginners. This tutorial aims to provide a basic introduction to Haar measure tools in quantum information, using only basic knowledge of linear algebra and thus aiming to make the topic more accessible. The tutorial begins by introducing the Haar measure with a specific emphasis on characterizing the moment operator, an essential element for computing integrals over the Haar measure. It also covers properties of the symmetric subspace and introduces helpful tools such as tensor-network diagrammatic notation, which aid in visualizing and simplifying calculations. Next, the tutorial explores the concept of unitary designs, providing equivalent definitions, and then turns to approximate notions of unitary designs, shedding light on the relationships between these different notions. Practical examples of Haar measure calculations are illustrated, including the derivation of well-known formulas such as the twirling of a quantum channel. Lastly, the tutorial showcases applications of Haar measure calculations in quantum machine learning and classical shadow tomography.
    Comment: 44 pages, comments are welcome
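
    As a small warm-up for the moment-operator calculations described above, the following numpy sketch (an independent illustration, not code from the tutorial) samples Haar-random unitaries with the standard QR-of-Ginibre construction and numerically checks the first-moment identity \int dU U \rho U^\dagger = Tr(\rho) I/d:

    ```python
    import numpy as np

    # Independent numerical illustration (not code from the tutorial): sample
    # Haar-random unitaries via the QR decomposition of a complex Ginibre matrix
    # and check the first-moment identity  E_U[U rho U^dag] = Tr(rho) * I / d.

    rng = np.random.default_rng(0)
    d = 4                                    # Hilbert-space dimension (chosen arbitrarily)

    def haar_unitary(dim):
        """Return a dim x dim Haar-random unitary (phase-corrected QR of a Ginibre matrix)."""
        g = (rng.standard_normal((dim, dim)) + 1j * rng.standard_normal((dim, dim))) / np.sqrt(2)
        q, r = np.linalg.qr(g)
        phases = np.diag(r) / np.abs(np.diag(r))
        return q * phases                    # rescale columns so the distribution is Haar

    rho = np.zeros((d, d), dtype=complex)    # test state |0><0|
    rho[0, 0] = 1.0

    n_samples = 20000
    avg = np.zeros((d, d), dtype=complex)
    for _ in range(n_samples):
        u = haar_unitary(d)
        avg += u @ rho @ u.conj().T
    avg /= n_samples

    print(np.round(avg.real, 3))             # approaches I/d = 0.25 * identity for d = 4
    ```

    The same sampling routine, applied to tensor powers of U, can be used to check higher-moment formulas and approximate-design properties numerically.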

    Fermiophobic Higgs boson and supersymmetry

    If a light Higgs boson with mass 125 GeV is fermiophobic, or partially fermiophobic, then the MSSM is excluded. The minimal supersymmetric fermiophobic Higgs scenario can naturally be formulated in the context of the NMSSM, which admits Z_3 discrete symmetries. In the fermiophobic NMSSM, the SUSY naturalness criteria are relaxed by a factor N_c y_t^4/g^4 \sim 25, removing the little hierarchy problem and allowing sparticle masses to be naturally of order 2--3 TeV. This scale motivates wino or higgsino dark matter. The SUSY flavour and CP problems, as well as the constraints on sparticle and Higgs boson masses from b \to s\gamma, B_s \to \mu\mu and direct LHC searches, are relaxed in the fermiophobic NMSSM. The price to pay is that a new, as yet unknown, mechanism must be introduced to generate fermion masses. We show that in the fermiophobic NMSSM the radiative Higgs boson branchings to \gamma\gamma and \gamma Z can be modified compared to the fermiophobic and ordinary standard model predictions, and fit present collider data better. Suppression of dark matter scattering off nuclei explains the absence of a signal in XENON100.
    Comment: added discussion of the general tan\beta case, same as published version, 26 pages, 6 figures

    Uncertainty, Information Acquisition, and Price Swings in Asset Markets

    This article analyses costly information acquisition in asset markets with Knightian uncertainty about the asset fundamentals. In these markets, acquiring information not only reduces the expected variability of the fundamentals for a given distribution (i.e. risk). It also mitigates the uncertainty about the true distribution of the fundamentals. Agents who lack knowledge of this distribution cannot correctly interpret the information other investors impound into the price. We show that, due to uncertainty aversion, the incentives to reduce uncertainty by acquiring information increase as more investors acquire information. When uncertainty is high enough, information acquisition decisions become strategic complements and lead to multiple equilibria. Swift changes in information demand can drive large price swings even after small changes in Knightian uncertainty.

    Velocity in the long run: money and structural transformation

    Monetary velocity declines as economies grow. We argue that this is due to the process of structural transformation: the shift of workers from agricultural to non-agricultural production associated with rising income. A calibrated two-sector model of structural transformation with monetary and non-monetary trade accurately generates the long-run monetary velocity of the US between 1869 and 2013, as well as the velocity of a panel of 92 countries between 1980 and 2010. Three lessons arise from our analysis: 1) Developments in agriculture, rather than non-agriculture, are key in driving monetary velocity; 2) Inflationary policies are disproportionately more costly in richer than in poorer countries; and 3) Nominal prices and inflation rates are not 'always and everywhere a monetary phenomenon': the composition of output influences money demand and hence the secular trends of price levels.
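
    A back-of-the-envelope illustration of the composition effect (not the paper's calibrated model; the mapping from sectors to monetary trade is an assumption here): if a cash-in-advance constraint applies only to the monetized share s_t of nominal spending, then M_t = s_t P_t Y_t and velocity is

        V_t = P_t Y_t / M_t = 1 / s_t,

    so structural transformation moves velocity whenever it changes the share of transactions carried out with money, independently of monetary policy.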

    Large-scale seismic vulnerability and risk evaluation of a sample of masonry churches in the historical centre of Naples

    This paper investigates the seismic vulnerability and risk of fifteen masonry churches located in the historical centre of Naples. The analysis method is derived from a procedure previously applied by the University of Basilicata to the churches of Matera. The seismic vulnerability and hazard indexes of the selected churches in the study area are evaluated by means of appropriate technical survey forms. The data obtained from this procedure allow vulnerability maps to be plotted and seismic risk indicators to be provided for all churches. The comparison among the resulting indexes allows the health state of the inspected churches to be evaluated, so that a priority scale for future retrofitting interventions can be established.