
    Non-stationary extreme models and a climatic application

    In this paper, we study extreme values of non-stationary climatic phenomena. In the usually considered stationary case, the modelling of extremes is based only on the tail behaviour of the distribution, the remainder of the data set being disregarded. In the non-stationary case, however, it is reasonable to assume that the temporal dynamics of the entire data set and that of its extremes are closely related, so all available information about this link should be used in statistical studies of these events. We study whether centred and normalized data, which are closer to stationarity than the raw observations, allow easier statistical analysis, and how far we are from the hypothesis that the extreme events of the centred and normed data follow a stationary distribution. The location and scale parameters used for this transformation (the central field), together with the extreme-value parameters estimated on the transformed data, enable us to recover the trends in the extreme events of the initial data set. Using non-parametric statistical methods, we therefore compare a model built directly on the extreme events with a model reconstructed from estimated trends in the location and scale parameters of the entire data set, combined with stationary extremes obtained from the centred and normed data. When the reconstruction is accurate, we can state that variations in the characteristics of extremes are well explained by the central field. These analyses also provide arguments for choosing constant shape parameters in the extreme-value distributions. We show that for the frequency of high-threshold exceedances (or for the mean of annual extremes), the general dynamics explains a large part of the trend in the frequency of extreme events. The conclusion is less clear-cut for the amplitudes of threshold exceedances (or the variance of annual extremes), especially for cold temperatures; this is partly due to the statistical tools used and calls for further analysis of how variability is defined.
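
    The reconstruction idea is easy to prototype. Below is a minimal sketch in Python, assuming synthetic daily data and a crude moving-window estimate of the central field (both are illustrative choices, not the authors' estimators): the centred and normed residuals are reduced to annual maxima, to which a stationary GEV is fitted.

    ```python
    # Minimal sketch: remove a slowly varying central field, then fit a
    # stationary GEV to annual maxima of the centred-and-normed residuals.
    # The synthetic data and the moving-window estimators are assumptions.
    import numpy as np
    from scipy.stats import genextreme

    rng = np.random.default_rng(0)
    t = np.arange(50 * 365)                        # 50 "years" of daily data
    x = 0.002 * t / 365 + rng.gumbel(size=t.size)  # warming trend + noise

    window = 3650                                  # ~10-year moving window
    kernel = np.ones(window) / window
    mu_t = np.convolve(x, kernel, mode="same")     # crude location estimate
    sigma_t = np.sqrt(np.convolve((x - mu_t) ** 2, kernel, mode="same"))

    z = (x - mu_t) / sigma_t                       # centred and normed data
    annual_max = z.reshape(50, 365).max(axis=1)    # block (annual) maxima

    # Stationary GEV fit on the residual maxima; scipy's shape `c` is -xi.
    c, loc, scale = genextreme.fit(annual_max)
    print(f"shape={-c:.3f}, loc={loc:.3f}, scale={scale:.3f}")
    ```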

    Is there a trend in extremely high river temperature for the next decades? A case study for France

    After the summer 2003 heat wave, Électricité de France created a global plan called "heat wave-dryness". In this context, the present study estimates high river temperatures for the coming decades, taking climatic and anthropogenic evolutions into account. To do so, a specific methodology based on Extreme Value Theory (EVT) is applied; in particular, a trend analysis of the water temperature data is carried out and incorporated into the EVT models. The studied river temperatures are mean daily temperatures over 27 years (1977 to 2003), measured near French power plants: four series for the Rhône river, four for the Loire river and a few for other rivers. There are also three series of mean daily temperatures computed by a numerical model. Statistical extreme value modelling is applied to each series. Because of thermal inertia, the Generalized Extreme Value (GEV) distribution is corrected by the mean cluster length, which represents the thermal inertia of the water during extremely hot events. The μ and σ parameters of the GEV distributions are taken as polynomial or continuous piecewise-linear functions of time; the best functions for μ and σ are chosen using the Akaike criterion, based on the likelihood, together with some physical checks. For all series, over the last 27 years the trend is positive for μ and not significant for σ. However, for the Rhône river we cannot attribute this evolution to climate change alone, because the river temperature results from several causes: hydraulic or atmospheric, natural or related to human activity. For the other rivers, the trend in μ can be attributed to climate change more clearly. Furthermore, the sample is too short to provide reliable return-level estimates for return periods exceeding thirty years; still, quantitative return levels could be compared with physical models, for example.
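
    The model-selection step can be sketched as follows. The synthetic annual maxima, the linear form for μ(t) and the starting values are illustrative assumptions, not the study's data or code: the sketch fits a GEV with constant versus time-varying location by maximum likelihood and compares the two models by AIC.

    ```python
    # Hedged sketch: GEV with constant vs. linear-in-time location, fitted
    # by maximum likelihood and compared via AIC. Data are illustrative.
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import genextreme

    rng = np.random.default_rng(1)
    years = np.arange(27)                          # 27 years, as in the study
    maxima = 25 + 0.05 * years + rng.gumbel(scale=0.8, size=27)

    def nll(params, trend):
        """Negative log-likelihood; scipy's shape `c` equals -xi."""
        if trend:
            mu0, mu1, log_sigma, xi = params
            mu = mu0 + mu1 * years
        else:
            mu0, log_sigma, xi = params
            mu = mu0
        lp = genextreme.logpdf(maxima, c=-xi, loc=mu, scale=np.exp(log_sigma))
        return -lp.sum()

    res0 = minimize(nll, x0=[25.0, 0.0, 0.1], args=(False,), method="Nelder-Mead")
    res1 = minimize(nll, x0=[25.0, 0.0, 0.0, 0.1], args=(True,), method="Nelder-Mead")
    for name, res, k in [("constant mu", res0, 3), ("linear mu", res1, 4)]:
        print(f"{name}: AIC = {2 * k + 2 * res.fun:.1f}")
    ```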

    Statistical properties of Pauli matrices going through noisy channels

    We study the statistical properties of the triplet (σ_x, σ_y, σ_z) of Pauli matrices going through a sequence of noisy channels, modelled by the repetition of a general trace-preserving, completely positive map. We prove a non-commutative central limit theorem for the distribution of this triplet, which exhibits a 3-dimensional Brownian motion with a non-trivial covariance matrix in the limit. We also prove a large deviation principle associated with this convergence, with an explicit rate function depending on the stationary state of the noisy channel.
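
    As a concrete instance of the setting (not of the theorem itself), the sketch below iterates one simple trace-preserving, completely positive map, a depolarizing channel, and tracks the Pauli expectation values of the state; the choice of channel and its strength are illustrative assumptions.

    ```python
    # Illustrative setup only: repeat a depolarizing channel (one example of
    # a trace-preserving, completely positive map) and track the Bloch vector
    # of the state, i.e. the expectation values of the Pauli triplet.
    import numpy as np

    sx = np.array([[0, 1], [1, 0]], dtype=complex)
    sy = np.array([[0, -1j], [1j, 0]])
    sz = np.array([[1, 0], [0, -1]], dtype=complex)
    I2 = np.eye(2)

    p = 0.05  # depolarizing strength per use of the channel (assumed)

    def channel(rho):
        """Depolarizing channel: rho -> (1 - p) rho + p Tr(rho) I/2."""
        return (1 - p) * rho + p * np.trace(rho) * I2 / 2

    rho = (I2 + 0.8 * sz) / 2       # initial state, Bloch vector (0, 0, 0.8)
    for _ in range(50):
        rho = channel(rho)
    bloch = [np.trace(rho @ s).real for s in (sx, sy, sz)]
    print(bloch)                     # z-component decays as 0.8 * (1 - p)**50
    ```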

    Time series aggregation, disaggregation and long memory

    We study the aggregation/disaggregation problem for random-parameter AR(1) processes and its relation to the long-memory phenomenon. We give a characterization of a subclass of aggregated processes which can be obtained from simpler, "elementary", cases. For particular cases of the mixture density, the structure (moving-average representation) of the aggregated process is investigated.
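
    A minimal simulation, in the spirit of the classical Granger (1980) aggregation scheme rather than of the paper's specific results, shows the long-memory effect: averaging many AR(1) paths whose squared coefficients follow a Beta mixture density yields an autocorrelation that decays roughly like 1/lag instead of geometrically. The Beta(1, 2) choice and all sample sizes are illustrative assumptions.

    ```python
    # Sketch: contemporaneous aggregation of random-coefficient AR(1) paths.
    # Squared coefficients ~ Beta(1, 2), so individual memories are short but
    # the aggregate's autocorrelation decays hyperbolically (long memory).
    import numpy as np

    rng = np.random.default_rng(2)
    N, T, burn = 1000, 3000, 500
    a = np.sqrt(rng.beta(1.0, 2.0, size=N))    # AR(1) coefficients in (0, 1)

    x = np.zeros((N, T))
    eps = rng.standard_normal((N, T))
    for t in range(1, T):
        x[:, t] = a * x[:, t - 1] + eps[:, t]

    agg = x[:, burn:].mean(axis=0)             # aggregated process

    def acf(series, lag):
        s = series - series.mean()
        return np.dot(s[:-lag], s[lag:]) / np.dot(s, s)

    for lag in (1, 10, 50, 200):
        print(lag, round(acf(agg, lag), 3))    # slow, roughly 1/lag decay
    ```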

    Langevin diffusions on the torus: estimation and applications

    We introduce stochastic models for the continuous-time evolution of angles and develop their estimation. We focus on Langevin diffusions whose stationary distributions are well-known distributions from directional statistics, since such diffusions can be regarded as toroidal analogues of the Ornstein–Uhlenbeck process. Their likelihood function is a product of transition densities with no analytical expression, but these can be computed by solving the Fokker–Planck equation numerically with adequate schemes. We propose three computationally tractable approximate likelihoods: (i) a likelihood based on the stationary distribution; (ii) toroidal adaptations of the Euler and Shoji–Ozaki pseudo-likelihoods; (iii) a likelihood based on a specific approximation to the transition density of the wrapped normal process. A simulation study compares, in dimensions one and two, the approximate transition densities with the exact ones and investigates the empirical performance of the approximate likelihoods. Finally, two diffusions are used to model the evolution of the backbone angles of the protein G (PDB identifier 1GB1) during a molecular dynamics simulation. The software package sdetorus implements the estimation methods and applications presented in the paper.
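
    A minimal sketch of the simplest case above: a one-angle Langevin diffusion whose drift is chosen so that the stationary law is von Mises(μ, κ), simulated with an Euler scheme and wrapped to the circle. The parameter values and step size are illustrative; the paper's sdetorus package implements the full estimation methods.

    ```python
    # Sketch: circular Langevin diffusion with von Mises stationary law,
    #   d(theta) = -(sigma^2 / 2) * kappa * sin(theta - mu) dt + sigma dW,
    # simulated by Euler-Maruyama and wrapped to [-pi, pi).
    import numpy as np

    rng = np.random.default_rng(3)
    mu, kappa, sigma = 0.0, 2.0, 1.0    # assumed vM(mu, kappa) target
    dt, n_steps = 1e-2, 100_000

    def wrap(theta):
        """Map angles to the fundamental interval [-pi, pi)."""
        return np.mod(theta + np.pi, 2 * np.pi) - np.pi

    theta = np.empty(n_steps)
    theta[0] = 0.0
    for t in range(1, n_steps):
        drift = -0.5 * sigma**2 * kappa * np.sin(theta[t - 1] - mu)
        theta[t] = wrap(theta[t - 1] + drift * dt
                        + sigma * np.sqrt(dt) * rng.standard_normal())

    # The circular mean of the simulated path should sit near mu.
    print(np.angle(np.mean(np.exp(1j * theta))))
    ```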

    Subspaces of L
