
    How Often to Sample a Continuous-Time Process in the Presence of Market Microstructure Noise

    Classical statistics suggest that for inference purposes one should always use as much data as is available. We study how the presence of market microstructure noise in high-frequency financial data can change that result. We show that the optimal sampling frequency at which to estimate the parameters of a discretely sampled continuous-time model can be finite when the observations are contaminated by market microstructure effects. We then address the question of what to do about the presence of the noise. We show that modelling the noise term explicitly restores the first order statistical effect that sampling as often as possible is optimal. But, more surprisingly, we also demonstrate that this is true even if one misspecifies the assumed distribution of the noise term. Not only is it still optimal to sample as often as possible, but the estimator has the same variance as if the noise distribution had been correctly specified, implying that attempts to incorporate the noise into the analysis cannot do more harm than good. Finally, we study the same questions when the observations are sampled at random time intervals, which are an essential feature of transaction-level data.
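
    The trade-off described in this abstract is easy to reproduce numerically. The sketch below is an illustration under assumed parameter values (the integrated variance, noise level, and grid sizes are all made up for the demo), not the paper's model or calibration: it simulates a constant-volatility log-price observed with additive i.i.d. Gaussian microstructure noise and compares the mean squared error of the realized-variance estimator across sampling intervals.

```python
import random
import statistics

random.seed(42)

IV = 1e-4    # true integrated variance over the day (assumed value)
A = 5e-4     # standard deviation of the i.i.d. microstructure noise (assumed)
N = 2340     # finest grid (assumed): one observation every 10 seconds

def noisy_path():
    """Brownian efficient log-price plus additive observation noise."""
    step_sd = (IV / N) ** 0.5
    x = [0.0]
    for _ in range(N):
        x.append(x[-1] + random.gauss(0.0, step_sd))
    return [xi + random.gauss(0.0, A) for xi in x]

def realized_var(y, k):
    """Sum of squared returns sampled at every k-th observation."""
    s = y[::k]
    return sum((s[i + 1] - s[i]) ** 2 for i in range(len(s) - 1))

REPS = 200
intervals = (1, 5, 30, 780)   # 10s, 50s, 5min and 130min sampling
errs = {k: [] for k in intervals}
for _ in range(REPS):
    y = noisy_path()
    for k in intervals:
        errs[k].append((realized_var(y, k) - IV) ** 2)
mse = {k: statistics.mean(e) for k, e in errs.items()}
# The finest grid has the largest MSE, because the noise bias of
# roughly 2*N*A**2 dominates there; the discretization variance
# instead dominates on the coarsest grid.
```

    Because the noise-induced bias grows with the number of observations while the discretization variance shrinks, the MSE is minimized at an interior sampling interval, i.e. at a finite optimal sampling frequency.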

    The radiation pattern of a QCD antenna in a dense medium

    We calculate the radiation spectrum off a qq-bar pair of fixed opening angle theta_qq-bar traversing a medium of length L. Multiple interactions with the medium are handled in the harmonic oscillator approximation, valid for soft gluon emissions. We discuss the time scales relevant to the decoherence of correlated partons traversing the medium and demonstrate how they relate to the hard scale that governs medium-induced radiation. For large-angle radiation, the hard scale is given by Qhard = max(r_perp^{-1}, Qs), where r_perp = theta_qq-bar L is the probed transverse size and Qs is the maximal transverse momentum accumulated by the emitted gluon in the medium. These situations define in turn two distinct regimes, which we call the "dipole" and "decoherence" regimes, respectively, and which are discussed in detail. A feature common to both cases is that coherence of the radiation is restored at large transverse momenta, k_perp > Qhard.
    Comment: 44 pages, 8 figures
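
    The scale hierarchy quoted above fits in a few lines of code. The sketch below evaluates Qhard = max(r_perp^{-1}, Qs) with r_perp = theta_qq-bar L; the function name is invented, units are natural units, and the mapping of the two branches of the max onto the "dipole" and "decoherence" labels follows the order in which the abstract lists them, which is an assumption.

```python
def hard_scale(theta_qq, L, Qs):
    """Hard scale for large-angle medium-induced radiation,
    Qhard = max(1/r_perp, Qs), with r_perp = theta_qq * L the
    transverse size probed by the antenna. Natural units are
    assumed (L in 1/GeV, Qs in GeV)."""
    r_perp = theta_qq * L          # probed transverse size
    inv_r = 1.0 / r_perp
    if inv_r > Qs:
        # small antenna: its inverse transverse size sets the hard scale
        return inv_r, "dipole"
    # large antenna: the accumulated momentum scale Qs takes over
    return Qs, "decoherence"
```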

    Gluon propagation inside a high-energy nucleus

    We show that, in the light-cone gauge, it is possible to derive in a very simple way the solution of the classical Yang-Mills equations for the collision between a nucleus and a proton. One important step of the calculation is the derivation of a formula that describes the propagation of a gluon in the background color field of the nucleus. This allows us to calculate observables in pA collisions in a more straightforward fashion than previously proposed. We also compare the light-cone gauge and the covariant gauge, in view of further investigations involving higher-order corrections.
    Comment: 12 pages, 2 figures

    Edgeworth Expansions for Realized Volatility and Related Estimators

    This paper shows that the asymptotic normal approximation is often insufficiently accurate for volatility estimators based on high-frequency data. To remedy this, we compute Edgeworth expansions for such estimators. Unlike the usual expansions, we have found that in order to obtain meaningful terms, one needs to let the size of the noise go to zero asymptotically. The results have applications to Cornish-Fisher inversion and bootstrapping.
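
    One way such expansions are used in practice is Cornish-Fisher inversion: estimated higher cumulants correct the Gaussian quantile that enters a confidence interval. The sketch below implements the standard third-order Cornish-Fisher adjustment; this is the generic textbook formula, not the paper's specific expansion terms.

```python
def cornish_fisher_quantile(z, skew, excess_kurt):
    """Adjust a standard-normal quantile z for skewness and excess
    kurtosis via the third-order Cornish-Fisher expansion."""
    return (z
            + (z**2 - 1.0) * skew / 6.0
            + (z**3 - 3.0 * z) * excess_kurt / 24.0
            - (2.0 * z**3 - 5.0 * z) * skew**2 / 36.0)
```

    With zero skewness and excess kurtosis the Gaussian quantile is recovered; nonzero cumulants shift the interval endpoints asymmetrically, which is exactly the kind of correction the plain normal approximation misses.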

    Ultra High Frequency Volatility Estimation with Dependent Microstructure Noise

    We analyze the impact of time series dependence in market microstructure noise on the properties of estimators of the integrated volatility of an asset price based on data sampled at frequencies high enough for that noise to be a dominant consideration. We show that combining two time scales for that purpose will work even when the noise exhibits time series dependence, analyze in that context a refinement of this approach based on multiple time scales, and compare empirically our different estimators to the standard realized volatility.
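
    The two-scales construction can be sketched in a few lines. The snippet below follows the basic TSRV recipe for i.i.d. noise: average the realized variances of K offset sparse subgrids, then subtract a bias estimate built from the full-grid realized variance. The dependent-noise and multi-scale refinements analyzed in the paper are not implemented here, and all parameter values are made up for the demo.

```python
import random
import statistics

random.seed(7)

def realized_var(s):
    """Sum of squared returns of an observation sequence."""
    return sum((s[i + 1] - s[i]) ** 2 for i in range(len(s) - 1))

def tsrv(y, K):
    """Two Scales Realized Volatility: average the realized variances
    of the K offset subgrids y[j::K], then remove the noise bias
    estimated from the full-grid realized variance."""
    n = len(y) - 1
    rv_avg = statistics.mean(realized_var(y[j::K]) for j in range(K))
    nbar = (n - K + 1) / K   # average number of returns per subgrid
    return rv_avg - (nbar / n) * realized_var(y)

# Demo on one simulated noisy path (assumed parameter values).
IV, A, N = 1e-4, 5e-4, 2340
step_sd = (IV / N) ** 0.5
x = [0.0]
for _ in range(N):
    x.append(x[-1] + random.gauss(0.0, step_sd))
y = [xi + random.gauss(0.0, A) for xi in x]

naive = realized_var(y)    # dominated by the noise bias, about 2*N*A**2
tsrv_est = tsrv(y, K=30)   # close to the true integrated variance IV
```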

    Luxury Goods and the Equity Premium

    This paper evaluates the equity premium using novel data on the consumption of luxury goods. Specifying household utility as a nonhomothetic function of the consumption of both a luxury good and a basic good, we derive pricing equations and evaluate the risk of holding equity. Household survey and national accounts consumption data overstate the risk aversion necessary to match the observed equity premium because they mostly reflect basic consumption. The risk aversion implied by equity returns and the consumption of luxury goods is more than an order of magnitude less than that implied by national accounts data. For the very rich, the equity premium is much less of a puzzle.

    A Tale of Two Time Scales: Determining Integrated Volatility with Noisy High Frequency Data

    It is a common practice in finance to estimate volatility from the sum of frequently sampled squared returns. However, market microstructure poses challenges to this estimation approach, as evidenced by recent empirical studies in finance. This work attempts to lay out theoretical grounds that reconcile continuous-time modeling and discrete-time samples. We propose an estimation approach that takes advantage of the rich sources in tick-by-tick data while preserving the continuous-time assumption on the underlying returns. Under our framework, it becomes clear why and where the "usual" volatility estimator fails when the returns are sampled at the highest frequency.
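
    The failure of the "usual" estimator at the highest frequencies can be reproduced directly: with additive noise of variance A^2, the sum of squared returns carries a bias of roughly 2nA^2 and therefore grows with the sample size instead of converging to the integrated variance. A minimal simulation, with all parameter values assumed for the demo:

```python
import random

random.seed(3)

IV = 1e-4   # true integrated variance (assumed)
A = 5e-4    # noise standard deviation (assumed)

def rv_at_n(n):
    """Realized variance from n+1 noisy observations of a Brownian
    log-price with integrated variance IV over the interval."""
    step_sd = (IV / n) ** 0.5
    x, obs = 0.0, []
    for _ in range(n + 1):
        obs.append(x + random.gauss(0.0, A))   # observed = efficient + noise
        x += random.gauss(0.0, step_sd)
    return sum((obs[i + 1] - obs[i]) ** 2 for i in range(n))

estimates = [rv_at_n(n) for n in (500, 1000, 2000, 4000)]
# Instead of settling near IV = 1e-4, the estimates grow roughly
# linearly in n, tracking the noise bias 2 * n * A**2.
```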

    New picture of jet quenching dictated by color coherence

    We propose a new description of the jet quenching phenomenon observed in nuclear collisions at high energies, in which coherent parton branching plays a central role. This picture is based on the appearance of a dynamically generated scale, the jet resolution scale, which controls the transverse resolution power of the medium with respect to simultaneously propagating color probes. Since, from the point of view of the medium, all partonic jet fragments within this transverse distance act coherently as a single emitter, this scale allows us to rearrange the jet shower into effective emitters. We observe that in the kinematic regime of the LHC, the corresponding characteristic angle is comparable to the typical opening angle of high-energy jets, such that most of the jet energy is contained within a non-resolvable, color-coherent inner core. Thus, a sizable fraction of the jets are unresolved, losing energy as a single parton without modification of their intra-jet structure. This new picture provides a consistent understanding of the present data on reconstructed jet observables and constitutes the basis for future developments.
    Comment: 4 pages, 2 figures