    Local geometry of random geodesics on negatively curved surfaces

    It is shown that the tessellation of a compact, negatively curved surface induced by a typical long geodesic segment, when properly scaled, looks locally like a Poisson line process. This implies that the global statistics of the tessellation -- for instance, the fraction of triangles -- approach those of the limiting Poisson line process. Comment: This version extends the results of the previous version to surfaces with possibly variable negative curvature.
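
    The limiting object can be made concrete with a short simulation. The sketch below (a minimal illustration, not taken from the paper) samples an isotropic Poisson line process restricted to a disk, using the convention that the expected number of lines hitting a disk of radius R is 2*lam*R, and counts the tessellation vertices that fall inside the disk.

```python
import numpy as np

def sample_poisson_lines(lam, R, rng=None):
    """Sample an isotropic Poisson line process restricted to a disk of radius R.

    Each line is parameterized by (p, theta): the points (x, y) with
    x*cos(theta) + y*sin(theta) = p.  We use the normalization in which the
    number of lines hitting the disk is Poisson with mean 2*lam*R.
    """
    rng = np.random.default_rng(rng)
    n = rng.poisson(2 * lam * R)             # number of lines hitting the disk
    p = rng.uniform(-R, R, size=n)           # signed distance from the origin
    theta = rng.uniform(0.0, np.pi, size=n)  # line direction
    return p, theta

def count_interior_intersections(p, theta, R):
    """Count pairwise line intersections (tessellation vertices) inside the disk."""
    count = 0
    n = len(p)
    for i in range(n):
        for j in range(i + 1, n):
            a = np.array([[np.cos(theta[i]), np.sin(theta[i])],
                          [np.cos(theta[j]), np.sin(theta[j])]])
            b = np.array([p[i], p[j]])
            if abs(np.linalg.det(a)) < 1e-12:  # (nearly) parallel lines
                continue
            x, y = np.linalg.solve(a, b)
            if x * x + y * y < R * R:
                count += 1
    return count

p, theta = sample_poisson_lines(lam=1.0, R=5.0, rng=0)
print(len(p), "lines,", count_interior_intersections(p, theta, R=5.0), "interior vertices")
```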

    Rates of convergence for extremes of geometric random variables and marked point processes

    We use the Stein-Chen method to study the extremal behaviour of univariate and bivariate geometric laws. We obtain a rate for the convergence to the Gumbel distribution of the law of the maximum of i.i.d. geometric random variables, and show that convergence is faster when approximating by a discretised Gumbel. We similarly find a rate of convergence for the law of maxima of bivariate Marshall-Olkin geometric random pairs when approximating by a discrete limit law. We introduce marked point processes of exceedances (MPPEs), with both univariate and bivariate Marshall-Olkin geometric variables as marks, and we determine bounds on the error of the approximation, in an appropriate probability metric, of the law of the MPPE by that of a Poisson process with the same mean measure. We then approximate by another Poisson process with an easier-to-use mean measure and estimate the error of this additional approximation. This work contains and extends results from the second author's PhD thesis (available at arXiv:1310.2564), written under the supervision of Andrew D. Barbour. Comment: 33 pages, 4 figures. Improvements in the bounds of Thm. 4.9 and in the presentation. Minor typos corrected.
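
    As a purely numerical illustration of the first result (this is not the Stein-Chen bound itself), the sketch below compares the exact distribution function of the maximum of n i.i.d. Geometric(p) variables with the Gumbel-type approximation evaluated at the integers; the values of n and p are arbitrary choices.

```python
import numpy as np

# Compare the exact law of the maximum of n i.i.d. Geometric(p) variables,
# P(M_n <= k) = (1 - q**k)**n with q = 1 - p, to the Gumbel-type
# approximation exp(-n * q**k) evaluated at integer points.
def max_geometric_cdf(k, n, p):
    q = 1.0 - p
    return (1.0 - q**k) ** n

def gumbel_approx_cdf(k, n, p):
    q = 1.0 - p
    return np.exp(-n * q**k)

n, p = 10_000, 0.3
ks = np.arange(1, 60)
err = np.abs(max_geometric_cdf(ks, n, p) - gumbel_approx_cdf(ks, n, p))
print("max abs error over integer support:", err.max())
```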

    Non-Stationary Random Process for Large-Scale Failure and Recovery of Power Distributions

    A key objective of the smart grid is to improve the reliability of utility services to end users. This requires strengthening the resilience of distribution networks that lie at the edge of the grid. However, distribution networks are exposed to external disturbances such as hurricanes and snow storms, which repeatedly disrupt electricity service to customers. External disturbances cause large-scale power failures that are neither well understood, nor formulated rigorously, nor studied systematically. This work studies the resilience of power distribution networks to large-scale disturbances in three aspects. First, a non-stationary random process is derived to characterize an entire life cycle of large-scale failure and recovery. Second, resilience is defined based on the non-stationary random process. Closed-form analytical expressions are derived under specific large-scale failure scenarios. Third, the non-stationary model and the resilience metric are applied to a real-life example of large-scale disruptions due to Hurricane Ike. Real data on large-scale failures from an operational network are used to learn time-varying model parameters and resilience metrics. Comment: 11 pages, 8 figures, submitted to IEEE Sig. Pro
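
    A generic toy version of such a failure-and-recovery life cycle (not the authors' model or data) can be simulated by letting failures arrive as a non-homogeneous Poisson process whose rate spikes during a storm and repairing each failure after a random delay, as in the sketch below.

```python
import numpy as np

# Toy failure-and-recovery life cycle: failures arrive as a non-homogeneous
# Poisson process whose rate spikes during a storm, each failure is repaired
# after a random exponential delay, and we track how many components are
# simultaneously down over time.
rng = np.random.default_rng(1)

T, dt = 200.0, 0.1                        # horizon (hours) and time step
t = np.arange(0.0, T, dt)
rate = 0.2 + 5.0 * np.exp(-((t - 30.0) / 10.0) ** 2)  # storm-driven failure rate

failures = rng.poisson(rate * dt)         # number of new failures in each slot
down = np.zeros_like(t)
for i, k in enumerate(failures):
    for _ in range(k):
        repair = rng.exponential(12.0)    # mean repair time of 12 hours
        j = min(len(t), i + max(1, int(round(repair / dt))))
        down[i:j] += 1                    # component is down until repaired

print("peak number of simultaneous failures:", int(down.max()))
print("time of peak (hours):", float(t[int(down.argmax())]))
```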

    The Random Walk of High Frequency Trading

    This paper builds a model of high-frequency equity returns by separately modeling the dynamics of trade-time returns and trade arrivals. Our main contributions are threefold. First, we characterize the distributional behavior of high-frequency asset returns both in ordinary clock time and in trade time. We show that when controlling for pre-scheduled market news events, trade-time returns of the highly liquid near-month E-mini S&P 500 futures contract are well characterized by a Gaussian distribution at very fine time scales. Second, we develop a structured and parsimonious model of clock-time returns by subordinating a trade-time Gaussian distribution with a trade arrival process that is associated with a modified Markov-Switching Multifractal Duration (MSMD) model. This model provides an excellent characterization of high-frequency inter-trade durations. Over-dispersion in this distribution of inter-trade durations leads to leptokurtosis and volatility clustering in clock-time returns, even when trade-time returns are Gaussian. Finally, we use our model to extrapolate the empirical relationship between trade rate and volatility in an effort to understand conditions of market failure. Our model suggests that the 1,200 km physical separation of financial markets in Chicago and New York/New Jersey provides a natural ceiling on systemic volatility and may contribute to market stability during periods of extremely heavy trading.
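
    The mechanism in the second contribution can be illustrated with a simplified subordination scheme (not the MSMD model itself): trade-time returns are Gaussian, but the number of trades per clock-time interval is over-dispersed, which is enough to produce leptokurtic clock-time returns.

```python
import numpy as np

# Simplified subordination illustration (not the MSMD duration model):
# trade-time returns are i.i.d. Gaussian, but the number of trades per
# clock-time interval is over-dispersed (a gamma-mixed Poisson, i.e.
# negative binomial), which makes clock-time returns leptokurtic.
rng = np.random.default_rng(7)

n_intervals = 200_000
mean_trades = 50.0

# Over-dispersed trade counts: Poisson with a gamma-distributed random intensity.
intensity = rng.gamma(shape=2.0, scale=mean_trades / 2.0, size=n_intervals)
counts = rng.poisson(intensity)

# In distribution, the sum of `count` i.i.d. N(0, sigma^2) trade-time returns
# is N(0, count * sigma^2), so we draw clock-time returns directly.
sigma = 1e-4
clock_returns = rng.normal(0.0, sigma, size=n_intervals) * np.sqrt(counts)

def excess_kurtosis(x):
    x = x - x.mean()
    return (x**4).mean() / (x**2).mean() ** 2 - 3.0

# Deterministic trade counts would give excess kurtosis near zero.
print("excess kurtosis of clock-time returns:", excess_kurtosis(clock_returns))
```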

    From Fault Tree to Credit Risk Assessment: A Case Study

    Reliability has been largely applied to industrial systems in order to study the various possibilities of systems’ failure. The goal is to establish the chain of events leading to any system’s failure, namely the top event. Looking for the minimal paths leading to any system’s fault allows for a better control of systems’ safety. To this end, reliability comprises a static approach (see Ngom et al. [1999] for example) as well as a dynamic approach (see Reory & Andrews [2003] for example). In this paper, we extend the framework stated by Gatfaoui (2003), allowing for the application of fault tree theory to credit risk assessment. The author explains that fault tree analysis is an alternative reliability approach that matches default risk analysis in a simple framework. Our extension includes other probability distributions to model the lifetimes of French firms while studying the related empirical default probabilities. We use mainly, but not exclusively, continuous distributions for which the exponential law used by Gatfaoui (2003) constitutes a particular case. Our results exhibit both the exponential nature of French firms’ lifetimes and strongly convex, fast-decreasing time-varying failure rates. Such a feature has a non-negligible impact insofar as it characterizes the corresponding credit spreads’ term structure. Keywords: credit risk, default probability, failure rate, fault tree, reliability, survival probability
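
    The lifetime-modelling step can be illustrated on synthetic data (not the French firm sample used in the paper): the sketch below fits an exponential model by maximum likelihood and contrasts its constant hazard with the decreasing hazard of a Weibull model, the kind of fast-decreasing failure rate discussed above.

```python
import numpy as np

# Illustrative lifetime/hazard computation on synthetic data.
rng = np.random.default_rng(3)

lifetimes = rng.weibull(0.8, size=5000) * 10.0   # synthetic firm lifetimes (years)

# Exponential MLE: rate = 1 / mean lifetime, so the hazard is constant in time.
rate_hat = 1.0 / lifetimes.mean()

def weibull_hazard(t, shape, scale):
    """Hazard h(t) = f(t) / S(t) = (shape/scale) * (t/scale)**(shape - 1)."""
    return (shape / scale) * (t / scale) ** (shape - 1.0)

t = np.array([0.5, 1.0, 2.0, 5.0, 10.0])
print("exponential hazard (constant):", rate_hat)
print("Weibull hazard at t (decreasing for shape < 1):", weibull_hazard(t, 0.8, 10.0))
print("exponential survival S(t):", np.exp(-rate_hat * t))
```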

    Analysis of time-to-event for observational studies: Guidance to the use of intensity models

    This paper provides guidance for researchers with some mathematical background on the conduct of time-to-event analysis in observational studies based on intensity (hazard) models. Discussions of basic concepts like time axis, event definition and censoring are given. Hazard models are introduced, with special emphasis on the Cox proportional hazards regression model. We provide checklists that may be useful both when fitting the model and assessing its goodness of fit and when interpreting the results. Special attention is paid to how to avoid problems with immortal time bias by introducing time-dependent covariates. We discuss prediction based on hazard models and difficulties when attempting to draw proper causal conclusions from such models. Finally, we present a series of examples where the methods and checklists are exemplified. Computational details and implementation using the freely available R software are documented in Supplementary Material. The paper was prepared as part of the STRATOS initiative. Comment: 28 pages, 12 figures. For associated Supplementary material, see http://publicifsv.sund.ku.dk/~pka/STRATOSTG8
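
    The paper's computational details are provided in R; as a purely hypothetical Python analogue (assuming the third-party lifelines package, which is not part of the paper), the sketch below fits a Cox proportional hazards model to synthetic right-censored data with a binary treatment and a continuous covariate.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter   # assumed available; the paper's own code is in R

# Synthetic right-censored data: the true hazard is multiplied by
# exp(-0.5*treatment + 0.03*(age - 60)), so the fitted log hazard ratios
# should land near -0.5 and 0.03.
rng = np.random.default_rng(42)
n = 2000
treatment = rng.integers(0, 2, size=n)
age = rng.normal(60.0, 10.0, size=n)

baseline_rate = 0.05
rate = baseline_rate * np.exp(-0.5 * treatment + 0.03 * (age - 60.0))
event_time = rng.exponential(1.0 / rate)
censor_time = rng.exponential(1.0 / baseline_rate, size=n)

df = pd.DataFrame({
    "time": np.minimum(event_time, censor_time),
    "event": (event_time <= censor_time).astype(int),
    "treatment": treatment,
    "age": age,
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
cph.print_summary()   # estimated coefficients and confidence intervals
```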

    Directed force chain networks and stress response in static granular materials

    A theory of stress fields in two-dimensional granular materials based on directed force chain networks is presented. A general equation for the densities of force chains in different directions is proposed, and a complete solution is obtained for a special case in which chains lie along a discrete set of directions. The analysis and results demonstrate the necessity of including nonlinear terms in the equation. A line of nontrivial fixed point solutions is shown to govern the properties of large systems. In the vicinity of a generic fixed point, the response to a localized load shows a crossover from a single, centered peak at intermediate depths to two propagating peaks at large depths that broaden diffusively. Comment: 18 pages, 12 figures. Minor corrections to one figure.