
    Understanding and Controlling Regime Switching in Molecular Diffusion

    Diffusion can be strongly affected by ballistic flights (long jumps) as well as by long-lived sticking trajectories (long sticks). Using statistical inference techniques in the spirit of Granger causality, we investigate the appearance of long jumps and sticks in molecular-dynamics simulations of diffusion in a prototype system, a benzene molecule on a graphite substrate. We find that specific fluctuations in certain, but not all, internal degrees of freedom of the molecule can be linked to either long jumps or sticks. Furthermore, by changing the prevalence of these predictors through an outside influence, the diffusion of the molecule can be controlled. The approach presented in this proof-of-concept study is very generic and can be applied to larger and more complex molecules. Additionally, the predictor variables can be chosen in a general way so as to be accessible in experiments, making the method feasible for controlling diffusion in applications. Our results also demonstrate that data-mining techniques can be used to investigate the phase-space structure of high-dimensional nonlinear dynamical systems. Comment: accepted for publication by PR
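    The Granger-style reasoning in the abstract — a fluctuating internal coordinate "predicts" an event class if its lagged values carry information beyond the event series' own past — can be sketched on synthetic data. Everything below (the coupled series, coefficients, seed) is an illustrative stand-in, not the paper's molecular-dynamics data.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Synthetic stand-in: an "internal coordinate" x whose fluctuations
# precede the target series y (hypothetical coupling, for illustration).
x = rng.standard_normal(n)
y = np.empty(n)
y[0] = 0.0
for t in range(1, n):
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.1 * rng.standard_normal()

# Granger-style comparison: does adding lagged x reduce the residual
# variance of an autoregressive model for y?
Y = y[1:]
A_restricted = np.column_stack([y[:-1]])       # y_t ~ y_{t-1}
A_full = np.column_stack([y[:-1], x[:-1]])     # y_t ~ y_{t-1}, x_{t-1}

res_r = Y - A_restricted @ np.linalg.lstsq(A_restricted, Y, rcond=None)[0]
res_f = Y - A_full @ np.linalg.lstsq(A_full, Y, rcond=None)[0]

var_r, var_f = res_r.var(), res_f.var()
print(var_r > 2 * var_f)  # True: lagged x carries predictive information
```

In the paper's setting, y would be replaced by an indicator of long jumps or sticks and x by a candidate internal degree of freedom; the same variance comparison singles out which coordinates act as predictors.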

    How does the quality of a prediction depend on the magnitude of the events under study?

    We investigate the predictability of extreme events in time series. The focus of this work is to understand under which circumstances large events are better predictable than smaller events. To this end, we use a simple prediction algorithm based on precursory structures which are identified via the maximum likelihood principle. Using these precursory structures, we predict threshold crossings in autocorrelated processes of order one which are either Gaussian, exponentially, or Pareto distributed. The receiver operating characteristic (ROC) curve is used as a measure for the quality of the predictions. We find that the dependence on the event magnitude is closely linked to the probability distribution function of the underlying stochastic process. We evaluate this dependence numerically, and in the Gaussian case also analytically. Furthermore, we study predictions of threshold crossings in correlated data, namely velocity increments of a free-jet flow. Depending on the time scale, these velocity increments are either asymptotically Gaussian or asymptotically exponentially distributed. If we assume that the optimal precursory structures are used to make the predictions, we find that large threshold crossings are better predictable for all types of distributions considered. These results are in contrast to previous results, obtained for the prediction of large increments, which showed a strong dependence on the probability distribution function of the underlying process.
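    A minimal numpy sketch of the Gaussian case: in an AR(1) process the current value is the natural precursor for a threshold crossing at the next step, and sweeping a decision threshold over it traces out the ROC curve. The coefficient, thresholds, and sample size below are illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
a = 0.75                      # AR(1) coefficient (illustrative)

# Unit-variance AR(1): x_{t+1} = a x_t + sqrt(1-a^2) xi_t
x = np.empty(n)
x[0] = rng.standard_normal()
noise = np.sqrt(1 - a**2) * rng.standard_normal(n)
for t in range(n - 1):
    x[t + 1] = a * x[t] + noise[t]

def roc_auc(eta):
    """Area under the ROC curve for predicting x_{t+1} > eta from x_t."""
    event = x[1:] > eta
    score = x[:-1]                        # precursor: current value
    order = np.argsort(-score)            # sweep decision threshold
    e = event[order]
    tpr = np.cumsum(e) / e.sum()          # hit rate
    fpr = np.cumsum(~e) / (~e).sum()      # false-alarm rate
    return np.sum(np.diff(fpr) * (tpr[1:] + tpr[:-1]) / 2)  # trapezoid rule

# For the Gaussian AR(1) process, rarer (larger) events are better predictable:
aucs = [roc_auc(eta) for eta in (0.5, 1.5, 2.5)]
print(aucs)
```

The printed AUCs increase with the event threshold, illustrating the abstract's finding for the Gaussian case; repeating the experiment with exponentially or Pareto distributed processes probes the distribution dependence studied in the paper.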

    Precursors of extreme increments

    We investigate precursors and predictability of extreme increments in a time series. The events we focus on consist of large increments within successive time steps. We are especially interested in understanding how the quality of the predictions depends on the strategy used to choose precursors, on the size of the event, and on the correlation strength. We study the prediction of extreme increments analytically in an AR(1) process, and numerically in wind-speed recordings and long-range correlated ARMA data. We evaluate the success of the predictions via receiver operating characteristic (ROC) curves. In all examples we observe an increase of the quality of predictions with increasing event size and with decreasing correlation. Both effects can be understood by using the likelihood ratio as a summary index for smooth ROC curves.
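    Both trends reported in the abstract can be reproduced in a toy AR(1) experiment: the increment x_{t+1} - x_t = (a-1)x_t + noise, so a low current value is the precursor of a large upward jump, and the precursor becomes weaker as the correlation a approaches one. Coefficients, event sizes (in units of the increment's standard deviation), and sample sizes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

def increment_auc(a, d_sigma, n=300_000):
    """ROC AUC for predicting x_{t+1} - x_t > d from x_t in a unit-variance AR(1).

    d_sigma is the event size in units of the increment's standard
    deviation sqrt(2(1-a)), so event rarity is comparable across a.
    """
    x = np.empty(n)
    x[0] = 0.0
    noise = np.sqrt(1 - a**2) * rng.standard_normal(n)
    for t in range(n - 1):
        x[t + 1] = a * x[t] + noise[t]
    d = d_sigma * np.sqrt(2 * (1 - a))
    event = (x[1:] - x[:-1]) > d
    score = -x[:-1]                       # low current values precede big jumps
    order = np.argsort(-score)
    e = event[order]
    tpr = np.cumsum(e) / e.sum()
    fpr = np.cumsum(~e) / (~e).sum()
    return np.sum(np.diff(fpr) * (tpr[1:] + tpr[:-1]) / 2)

auc_weak   = increment_auc(0.3, 2.0)   # weak correlation
auc_strong = increment_auc(0.9, 2.0)   # strong correlation
auc_small  = increment_auc(0.6, 1.0)   # small events
auc_large  = increment_auc(0.6, 3.0)   # large events
print(auc_weak > auc_strong, auc_large > auc_small)
```

Weaker correlation and larger events both yield a higher AUC, matching the two effects the abstract explains via the likelihood ratio.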

    Budget institutions and taxation

    While a number of studies have explored the effects of budgetary procedures and of the centralization of the budget process on government debt, deficits and spending, few have examined whether such fiscal institutions matter for public revenue. This article argues that centralizing the budget process raises the level of taxation by limiting the ability of individual government officials to veto tax increases, in line with common-pool-problem arguments regarding public finances. Using detailed data on budgetary procedures from 15 EU countries, the empirical analysis shows that greater centralization of the budget process increases taxation as a share of GDP, and that both the type of budget centralization and the level of government fractionalization matter for the size of this effect. The results suggest that further centralizing the budget process limits government debt and deficits by increasing public revenues as well as by constraining public spending.

    Numerical convergence of the block-maxima approach to the Generalized Extreme Value distribution

    In this paper we perform an analytical and numerical study of Extreme Value distributions in discrete dynamical systems. In this setting, recent works have shown how to obtain statistics of extremes in agreement with classical Extreme Value Theory. We pursue these investigations by giving analytical expressions for the Extreme Value distribution parameters of maps that have an absolutely continuous invariant measure. We compare these analytical results with numerical experiments in which we study the convergence to the limiting distributions using the so-called block-maxima approach, pointing out in which cases we obtain robust estimates of the parameters. For regular maps, for which mixing properties do not hold, we show that the fitting procedure to the classical Extreme Value distribution fails, as expected. However, we obtain an empirical distribution that can be explained starting from a different observable function, for which Nicolis et al. [2006] have found analytical results. Comment: 34 pages, 7 figures; Journal of Statistical Physics 201
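    The block-maxima approach can be sketched on a mixing map with an absolutely continuous invariant measure. For the logistic map x → 4x(1-x) and the observable g(x) = -log|x - x0|, the tail of g is exponential with rate 1, so block maxima should approach a Gumbel law with scale ≈ 1. The reference point x0, block length, and number of blocks below are illustrative choices, and the parameters are estimated by a simple method of moments rather than by a maximum-likelihood GEV fit.

```python
import math
import numpy as np

# Logistic map: chaotic, with an absolutely continuous (arcsine)
# invariant measure; g(x) = -log|x - x0| has an exponential tail.
x0 = 0.3            # illustrative reference point (assumption)
block = 1_000       # iterations per block
n_blocks = 500

x = 0.1234
maxima = []
for _ in range(n_blocks):
    m = -math.inf
    for _ in range(block):
        x = 4.0 * x * (1.0 - x)
        g = -math.log(abs(x - x0))
        if g > m:
            m = g
    maxima.append(m)
maxima = np.asarray(maxima)

# Method-of-moments Gumbel fit: std = scale * pi / sqrt(6),
# mean = loc + Euler-Mascheroni constant * scale.
scale = maxima.std(ddof=1) * np.sqrt(6.0) / np.pi
loc = maxima.mean() - 0.5772 * scale
print(round(scale, 2))   # should be close to 1
```

For a regular (non-mixing) map the same procedure would fail to converge to the classical GEV, which is the contrast the abstract points out.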

    The role of large-scale spatial patterns in the chaotic amplification of perturbations in a Lorenz’96 model

    The preparation of perturbed initial conditions to initialize an ensemble of numerical weather forecasts is a crucial task in current ensemble prediction systems (EPSs). Perturbations are added in the places where they are expected to grow fastest, in order to provide an envelope of uncertainty along with the deterministic forecast. This work analyses the influence of large-scale spatial patterns on the growth of small perturbations. To this end, we compare the Lyapunov vector (LV) definitions used in the initialization of state-of-the-art EPSs with the so-called characteristic LVs. We test the dynamical behaviour of these LVs in the two-scale Lorenz’96 system. We find that the commonly used definitions of LVs include non-intrinsic, spurious effects due to their imposed mutual orthogonality. We also find that the spatial locations where the small-scale perturbations grow are ‘quantized’ by the large-scale pattern. This ‘quantization’ enhances the artificial arrangement of the LVs, which is avoided only by the characteristic LVs, an unambiguous basis that may also be of great use in realistic models for assessing or initializing EPSs.
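    The orthogonality constraint the abstract criticizes enters through the standard QR (Benettin) scheme for backward Lyapunov vectors: the tangent ensemble is re-orthogonalized at every step, so all vectors except the first are forced away from the intrinsic directions. A sketch on the single-scale Lorenz’96 model (the paper uses the two-scale variant); the system size N, forcing F, time step, and integration length are illustrative assumptions.

```python
import numpy as np

N, F, dt = 40, 8.0, 0.01
k = 3                                    # number of vectors to follow

def f(x):
    # Lorenz'96: dx_i/dt = (x_{i+1} - x_{i-2}) x_{i-1} - x_i + F
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F

def jvp(x, V):
    """Jacobian of f at x applied to each column of V."""
    return ((np.roll(V, -1, 0) - np.roll(V, 2, 0)) * np.roll(x, 1)[:, None]
            + np.roll(V, 1, 0) * (np.roll(x, -1) - np.roll(x, 2))[:, None]
            - V)

def step(x, V):
    """One RK4 step of the state together with its tangent ensemble."""
    k1, K1 = f(x), jvp(x, V)
    k2, K2 = f(x + dt/2*k1), jvp(x + dt/2*k1, V + dt/2*K1)
    k3, K3 = f(x + dt/2*k2), jvp(x + dt/2*k2, V + dt/2*K2)
    k4, K4 = f(x + dt*k3), jvp(x + dt*k3, V + dt*K3)
    return (x + dt/6*(k1 + 2*k2 + 2*k3 + k4),
            V + dt/6*(K1 + 2*K2 + 2*K3 + K4))

rng = np.random.default_rng(3)
x = F + 0.01 * rng.standard_normal(N)
V = np.linalg.qr(rng.standard_normal((N, k)))[0]

for _ in range(1000):                    # spin-up onto the attractor
    x, V = step(x, V)
    V = np.linalg.qr(V)[0]               # QR imposes mutual orthogonality

logs = np.zeros(k)
steps = 5000
for _ in range(steps):
    x, V = step(x, V)
    V, R = np.linalg.qr(V)
    logs += np.log(np.abs(np.diag(R)))   # stretching rates from diag(R)

lyap = logs / (steps * dt)               # leading Lyapunov exponents
print(np.round(lyap, 2))
```

The columns of V here are orthogonal by construction; the characteristic LVs advocated in the paper are instead a norm-independent, generally non-orthogonal basis, which is what removes the artificial effects the sketch's QR step introduces.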