798 research outputs found

    Asymptotic Conditional Distribution of Exceedance Counts: Fragility Index with Different Margins

    Let $\bm X=(X_1,\dots,X_d)$ be a random vector whose components are not necessarily independent, nor are they required to have identical distribution functions $F_1,\dots,F_d$. Denote by $N_s$ the number of exceedances among $X_1,\dots,X_d$ above a high threshold $s$. The fragility index, defined by $FI=\lim_{s\nearrow}E(N_s\mid N_s>0)$ if this limit exists, measures the asymptotic stability of the stochastic system $\bm X$ as the threshold increases. The system is called stable if $FI=1$ and fragile otherwise. In this paper we show that the asymptotic conditional distribution of exceedance counts (ACDEC) $p_k=\lim_{s\nearrow}P(N_s=k\mid N_s>0)$, $1\le k\le d$, exists if the copula of $\bm X$ is in the domain of attraction of a multivariate extreme value distribution, and if $\lim_{s\nearrow}(1-F_i(s))/(1-F_\kappa(s))=\gamma_i\in[0,\infty)$ exists for $1\le i\le d$ and some $\kappa\in\{1,\dots,d\}$. This enables the computation of the FI corresponding to $\bm X$ and of the extended FI, as well as of the asymptotic distribution of the exceedance cluster length, also in the case where the components of $\bm X$ are not identically distributed.
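As a quick numerical companion to these definitions, the sketch below (mine, not the paper's; the dimension $d=5$, standard Gaussian margins, and the finite threshold $s=3$ are illustrative assumptions) contrasts a stable system, where $E(N_s\mid N_s>0)$ stays near 1, with a fully dependent (comonotone) one, where it equals $d$:

```python
import numpy as np

rng = np.random.default_rng(0)

def cond_mean_exceedances(X, s):
    """Monte Carlo estimate of E(N_s | N_s > 0) over the rows of X."""
    N = (X > s).sum(axis=1)          # exceedance count in each realization
    return N[N > 0].mean()

d, n, s = 5, 200_000, 3.0            # dimension, sample size, threshold
Z = rng.standard_normal(n)

indep = rng.standard_normal((n, d))  # independent margins: stable, FI -> 1
comon = np.tile(Z[:, None], (1, d))  # comonotone margins: fragile, FI = d

print(cond_mean_exceedances(indep, s))  # close to 1
print(cond_mean_exceedances(comon, s))  # exactly d = 5
```

At a finite threshold the independent case sits slightly above 1 (joint exceedances are possible but rare), while the comonotone case exceeds the threshold with all components at once.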

    Bridging the ARCH model for finance and nonextensive entropy

    Engle's ARCH algorithm is a generator of stochastic time series for financial returns (and similar quantities) characterized by a time-dependent variance. It involves a memory parameter $b$ ($b=0$ corresponds to {\it no memory}), and the noise is currently chosen to be Gaussian. We assume here a generalized noise, namely a $q_n$-Gaussian one, characterized by an index $q_n\in{\cal R}$ ($q_n=1$ recovers the Gaussian case, and $q_n>1$ corresponds to tailed distributions). We then match the second and fourth moments of the ARCH return distribution with those associated with the $q$-Gaussian distribution obtained through optimization of the entropy $S_q=\frac{1-\sum_i p_i^q}{q-1}$, the basis of nonextensive statistical mechanics. The outcome is an {\it analytic} distribution for the returns, where a unique $q\ge q_n$ corresponds to each pair $(b,q_n)$ ($q=q_n$ if $b=0$). This distribution is compared with numerical results and appears to be remarkably precise. This system constitutes a simple, low-dimensional, dynamical mechanism which fits well within the current nonextensive framework. Comment: 4 pages, 5 figures. Figure 4 fixed.
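A minimal simulation of this setup (illustrative only: the parameters $a=0.8$, $b=0.2$, $q_n=1.2$ are my choices, and the paper's analytic moment matching is not reproduced). It uses the standard correspondence between $q_n$-Gaussians with $1<q_n<3$ and rescaled Student-$t$ distributions:

```python
import numpy as np

rng = np.random.default_rng(1)

def arch1_qnoise(n, a=0.8, b=0.2, qn=1.2):
    """Simulate an ARCH(1) series r_t = sigma_t * z_t with
    sigma_t^2 = a + b * r_{t-1}^2 and q_n-Gaussian noise z_t.

    For 1 < qn < 3, a q_n-Gaussian is a rescaled Student-t with
    nu = (3 - qn) / (qn - 1) degrees of freedom (scaled here to unit
    variance); qn = 1 recovers Gaussian noise.
    """
    if qn > 1:
        nu = (3 - qn) / (qn - 1)
        z = rng.standard_t(nu, n) * np.sqrt((nu - 2) / nu)
    else:
        z = rng.standard_normal(n)
    r = np.empty(n)
    var = a / (1 - b)                # start at the stationary variance
    for t in range(n):
        r[t] = np.sqrt(var) * z[t]
        var = a + b * r[t] ** 2
    return r

r = arch1_qnoise(100_000)
kurt = np.mean(r**4) / np.mean(r**2) ** 2
print(kurt)   # above the Gaussian value 3: fat tails from memory + noise
```

Both ingredients fatten the tails: the $q_n$-noise alone has kurtosis above 3, and the $b>0$ memory amplifies it further.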

    A theory for long-memory in supply and demand

    Recent empirical studies have demonstrated long-memory in the signs of orders to buy or sell in financial markets [2, 19]. We show how this can be caused by delays in market clearing. Under the common practice of order splitting, large orders are broken up into pieces and executed incrementally. If the size of such large orders is power-law distributed, this gives rise to power-law decaying autocorrelations in the signs of executed orders. More specifically, we show that if the cumulative distribution of large orders of volume $v$ is proportional to $v^{-\alpha}$ and the size of executed orders is constant, the autocorrelation of order signs as a function of the lag $\tau$ is asymptotically proportional to $\tau^{-(\alpha-1)}$. This is a long-memory process when $\alpha<2$. With a few caveats, this gives a good match to the data. A version of the model also shows long-memory fluctuations in order execution rates, which may be relevant for explaining the long-memory of price diffusion rates. Comment: 12 pages, 7 figures.
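The mechanism is easy to reproduce numerically. The sketch below is my own toy version (unit-size executions, $\alpha=1.5$, and metaorders executed sequentially rather than interleaved are simplifying assumptions): power-law-sized metaorders are split into unit pieces, and the sign autocorrelation of the executed-order stream decays slowly:

```python
import numpy as np

rng = np.random.default_rng(2)

alpha = 1.5                            # tail exponent of metaorder sizes
n_orders = 50_000

# Pareto-like metaorder sizes: P(V > v) ~ v^{-alpha}, infinite variance
sizes = np.ceil(rng.pareto(alpha, n_orders) + 1).astype(int)
signs = rng.choice([-1, 1], n_orders)  # each metaorder buys or sells

# executed order signs: each metaorder split into unit-size pieces
eps = np.repeat(signs, sizes)

def acf(x, lag):
    """Sample autocorrelation at the given lag."""
    x = x - x.mean()
    return np.dot(x[:-lag], x[lag:]) / np.dot(x, x)

for tau in (1, 10, 100):
    print(tau, acf(eps, tau))  # slow decay, roughly ~ tau^{-(alpha - 1)}
```

With $\alpha=1.5$ the predicted decay exponent is $\alpha-1=0.5$, well inside the long-memory regime $\alpha<2$.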

    Extreme statistics for time series: Distribution of the maximum relative to the initial value

    The extreme statistics of time signals is studied when the maximum is measured from the initial value. In the case of independent, identically distributed (iid) variables, we classify the limiting distribution of the maximum according to the properties of the parent distribution from which the variables are drawn. Then we turn to correlated periodic Gaussian signals with a $1/f^\alpha$ power spectrum and study the distribution of the maximum relative height with respect to the initial height (MRH$_I$). The exact MRH$_I$ distribution is derived for $\alpha=0$ (iid variables), $\alpha=2$ (random walk), $\alpha=4$ (random acceleration), and $\alpha=\infty$ (single sinusoidal mode). For other, intermediate values of $\alpha$, the distribution is determined from simulations. We find that the MRH$_I$ distribution is markedly different from the previously studied distribution of the maximum height relative to the average height for all $\alpha$. The two main distinguishing features of the MRH$_I$ distribution are the much larger weight for small relative heights and the divergence at zero height for $\alpha>3$. We also demonstrate that the boundary conditions affect the shape of the distribution by presenting exact results for some non-periodic boundary conditions. Finally, we show that, for signals arising from time-translationally invariant distributions, the density of near-extreme states is the same as the MRH$_I$ distribution. This is used in developing a scaling theory for the threshold singularities of the two distributions. Comment: 29 pages, 4 figures.
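The $\alpha=2$ (random walk) case has a classical benchmark: by the reflection principle, the maximum of a Brownian path measured from its initial value is distributed as $|B(1)|$, with mean $\sqrt{2/\pi}\approx 0.798$. A short simulation check (the discretization parameters below are my choices, and this toy uses free rather than the paper's periodic boundary conditions):

```python
import numpy as np

rng = np.random.default_rng(3)

# discretized Brownian motion on [0, 1]: the maximum measured from the
# initial point B(0) = 0 should be distributed as |B(1)|
n_walks, n_steps = 10_000, 500
steps = rng.standard_normal((n_walks, n_steps)) / np.sqrt(n_steps)
M = np.maximum(np.cumsum(steps, axis=1).max(axis=1), 0.0)  # t = 0 included

print(M.mean())   # slightly below sqrt(2/pi) ~ 0.798 due to discretization
```

The discrete-time maximum systematically underestimates the continuous one by $O(1/\sqrt{n_{\mathrm{steps}}})$, which is why the sample mean sits a little below the exact value.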

    Extreme times for volatility processes

    We present a detailed study of the mean first-passage time of volatility processes. We analyze the theoretical expressions based on the most common stochastic volatility models along with empirical results extracted from daily data of major financial indices. We find in all these data sets a very similar behavior that is far from being that of a simple Wiener process. It seems necessary to include a framework like the one provided by stochastic volatility models, with a reverting force driving volatility toward its normal level, to account for the memory and clustering effects in volatility dynamics. We also detect in the data a very different behavior of the mean first-passage time depending on whether the level is higher or lower than the normal level of volatility. For this reason, we discuss asymptotic approximations and compare them with empirical results, finding good agreement, especially with the ExpOU model. Comment: 10 pages, 6 color figures.
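A toy version of the effect described above (not the authors' calibration: the unit-rate ExpOU dynamics and the target levels 1.5 and 4.0 times normal volatility are illustrative assumptions). Reaching volatility levels far above normal takes much longer than reaching levels just above it:

```python
import numpy as np

rng = np.random.default_rng(4)

def mfpt_to_level(level, n_paths=2_000, dt=0.01, t_max=200.0):
    """Mean first-passage time of an ExpOU-type volatility sigma = exp(x),
    with dx = -x dt + dW, started at its normal level sigma = 1 (x = 0).
    Paths that never cross within t_max are dropped (a crude truncation)."""
    a = np.log(level)                    # threshold in log-volatility
    x = np.zeros(n_paths)
    hit = np.full(n_paths, np.inf)
    t = 0.0
    while t < t_max and np.isinf(hit).any():
        alive = np.isinf(hit)
        x[alive] += -x[alive] * dt + np.sqrt(dt) * rng.standard_normal(alive.sum())
        t += dt
        hit[alive & (x >= a)] = t
    return hit[np.isfinite(hit)].mean()

t_low, t_high = mfpt_to_level(1.5), mfpt_to_level(4.0)
print(t_low, t_high)   # escaping to a high volatility level takes far longer
```

The mean-reverting drift is what produces the asymmetry: excursions to high levels must fight the restoring force, so the passage time grows much faster than linearly in the level.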

    Value-at-risk forecasting of the CARBS Indices

    Abstract: The purpose of this paper is to use calibrated univariate GARCH-family models to forecast the volatility and value at risk (VaR) of the CARBS indices and of a global minimum variance portfolio (GMVP) constructed from the CARBS equity indices. The reliability of the different volatility forecasts is tested using the mean absolute error (MAE) and the mean squared error (MSE). The rolling forecast of VaR is tested using a back-testing procedure. The results indicate that using a rolling forecast from a GARCH model to estimate VaR for the CARBS indices and the GMVP is not a reliable method.
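The back-testing logic can be sketched independently of the GARCH machinery (which is not reproduced here); the synthetic Student-$t$ returns, window length, and 95% level below are illustrative assumptions, with plain rolling historical-simulation VaR standing in for the paper's model forecasts:

```python
import numpy as np

rng = np.random.default_rng(5)

# synthetic fat-tailed daily returns stand in for the CARBS index data
returns = rng.standard_t(5, 2_000) * 0.01
window, q = 500, 0.05                  # rolling window, 95% VaR level

violations, n_tests = 0, 0
for t in range(window, len(returns)):
    var_t = np.quantile(returns[t - window:t], q)  # VaR threshold (a loss)
    violations += returns[t] < var_t               # count exceedances
    n_tests += 1

rate = violations / n_tests
print(rate)   # a reliable 95% VaR keeps the violation rate near 0.05
```

A Kupiec-style back-test then asks whether the observed violation rate is statistically compatible with the nominal 5%; rates far above it signal an unreliable VaR forecast.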

    Large deviations for a damped telegraph process

    In this paper we consider a slight generalization of the damped telegraph process in Di Crescenzo and Martinucci (2010). We prove a large deviation principle for this process and an asymptotic result for its level crossing probabilities (as the level goes to infinity). Finally we compare our results with the analogous well-known results for the standard telegraph process

    Measuring degree-degree association in networks

    The Pearson correlation coefficient is commonly used for quantifying the global level of degree-degree association in complex networks. Here, we use a probabilistic representation of the underlying network structure for assessing the applicability of different association measures to heavy-tailed degree distributions. Theoretical arguments together with our numerical study indicate that Pearson's coefficient often depends on the size of networks with equal association structure, impeding a systematic comparison of real-world networks. In contrast, Kendall-Gibbons' $\tau_b$ is a considerably more robust measure of the degree-degree association.
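The contrast can be illustrated without building actual networks. In the sketch below (my construction: a Gaussian copula with $\rho=0.5$ and Pareto tail exponent $\alpha=1.5$ are assumptions), correlated heavy-tailed "degrees" stand in for the degrees at the two ends of an edge; Kendall's $\tau$ depends only on the copula and is stable across sample sizes, while Pearson's coefficient is erratic because the margins have infinite variance:

```python
import numpy as np
from math import erf

rng = np.random.default_rng(6)

def kendall_tau(x, y):
    """Brute-force O(n^2) Kendall tau (tau-a; equal to tau-b here, since
    continuous data has no ties)."""
    sx = np.sign(x[:, None] - x[None, :]).astype(np.int8)
    sy = np.sign(y[:, None] - y[None, :]).astype(np.int8)
    n = len(x)
    return float((sx * sy).sum() / (n * (n - 1)))

def degree_pairs(n, rho=0.5, alpha=1.5):
    """Heavy-tailed 'degree' pairs: Gaussian copula, Pareto(alpha) margins
    with P(D > d) = d^{-alpha} (infinite variance for alpha < 2)."""
    z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], n)
    u = 0.5 * (1.0 + np.vectorize(erf)(z / np.sqrt(2.0)))  # normal CDF
    return (1.0 - u) ** (-1.0 / alpha)

results = {}
for n in (500, 2_000):
    d1, d2 = degree_pairs(n).T
    results[n] = (float(np.corrcoef(d1, d2)[0, 1]), kendall_tau(d1, d2))
    print(n, results[n])   # Pearson wanders with n; tau stays near 1/3
```

For a Gaussian copula, $\tau=(2/\pi)\arcsin\rho=1/3$ at $\rho=0.5$, regardless of the margins, which is exactly the robustness property the abstract highlights.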

    Density of near-extreme events

    We provide a quantitative analysis of the phenomenon of crowding of near-extreme events by computing exactly the density of states (DOS) near the maximum of a set of independent and identically distributed random variables. We show that the mean DOS converges to three different limiting forms depending on whether the tail of the distribution of the random variables decays slower than, faster than, or as a pure exponential function. We argue that some of these results remain valid even for certain {\em correlated} cases, and we verify this for power-law correlated stationary Gaussian sequences. Satisfactory agreement is found between the near-maximum crowding in the summer temperature reconstruction data of western Siberia and the theoretical prediction. Comment: 4 pages, 3 figures, revtex4. Minor corrections, references updated. This is a slightly extended version of the published one (Phys. Rev. Lett.).
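A finite-$N$ illustration of the three regimes (my own crude probe, not the paper's exact DOS computation; the window $r=1$ and the three parent distributions are illustrative choices). Counting events within a fixed distance of the sample maximum shows crowding vanish for a power-law tail, stabilize for an exponential tail, and grow with $N$ for a Gaussian tail:

```python
import numpy as np

rng = np.random.default_rng(7)

def crowding(sample, r=1.0):
    """Number of events within distance r below the sample maximum:
    a crude finite-N probe of near-maximum crowding (the maximum itself
    is counted)."""
    return int((sample > sample.max() - r).sum())

counts = {}
for n in (1_000, 100_000):
    counts[n] = (
        crowding(1.0 + rng.pareto(1.5, n)),  # slower than exponential
        crowding(rng.exponential(1.0, n)),   # pure exponential
        crowding(rng.standard_normal(n)),    # faster than exponential
    )
    print(n, counts[n])
# power-law tail: essentially no crowding (count stays ~1);
# exponential tail: count stabilizes (~e^r); Gaussian tail: count grows with n
```

Intuitively, the spacings near the maximum grow, stay of order one, or shrink with $N$ in the three cases, which is what the limiting forms of the mean DOS encode.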