61 research outputs found
Self-Organized Synchronization and Voltage Stability in Networks of Synchronous Machines
The integration of renewable energy sources in the course of the energy
transition is accompanied by grid decentralization and fluctuating power
feed-in characteristics. This raises new challenges for power system stability
and design. We intend to investigate power system stability from the viewpoint
of self-organized synchronization. In this approach, the power grid is
represented by a network of synchronous machines. We supplement the classical
Kuramoto-like network model, which assumes constant voltages, with dynamical
voltage equations and thus obtain an extended version that incorporates the
coupled aspects of voltage stability and rotor-angle synchronization. We
compare disturbance scenarios in small systems simulated on the basis of both
the classical and the extended model, and we discuss the resulting implications
and possible applications to complex modern power grids.
Comment: 9 pages, 9 figures
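As a rough illustration of such an extended model (a minimal sketch with illustrative parameters, not the paper's setup), the classical swing equations can be complemented by first-order voltage dynamics in a two-machine toy system:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative two-machine system: one generator (P > 0), one motor (P < 0).
P = np.array([0.5, -0.5])   # injected / consumed power (p.u.)
D, M = 0.5, 1.0             # damping, inertia
T, X, Ef = 2.0, 0.1, 1.0    # voltage time constant, reactance term, field voltage
B = np.array([[0.0, 1.0],
              [1.0, 0.0]])  # coupling (susceptance) matrix

def rhs(t, y):
    d, w, E = y[:2], y[2:4], y[4:]
    dd = np.subtract.outer(d, d)                       # dd[i, j] = d_i - d_j
    coup = (B * np.outer(E, E) * np.sin(-dd)).sum(axis=1)
    dwdt = (P - D * w + coup) / M                      # swing (rotor angle) equation
    dEdt = (Ef - E + X * (B * np.cos(dd) * E).sum(axis=1)) / T  # voltage dynamics
    return np.concatenate([w, dwdt, dEdt])

sol = solve_ivp(rhs, (0, 60), np.array([0, 0, 0, 0, 1, 1.0]), max_step=0.01)
print("phase difference:", sol.y[0, -1] - sol.y[1, -1])  # should settle near 0.43 rad
print("voltages:", sol.y[4:, -1])                        # should settle near 1.10
```

Setting X = 0 freezes the voltages at Ef and recovers the constant-voltage Kuramoto-like model, which makes the comparison of disturbance scenarios in the two model variants straightforward.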
Non-parametric estimation of a Langevin model driven by correlated noise
Langevin models are frequently used to model various stochastic processes in
different fields of the natural and social sciences. They are adapted to measured
data by estimation techniques such as maximum likelihood estimation, Markov
chain Monte Carlo methods, or the non-parametric direct estimation method
introduced by Friedrich et al. The latter has the distinction of being very
effective in the context of large data sets. Due to their δ-correlated
noise, standard Langevin models are limited to Markovian dynamics. A
non-Markovian Langevin model can be formulated by introducing a hidden
component that realizes correlated noise. For the estimation of such a
partially observed diffusion, a different version of the direct estimation
method was introduced by Lehle et al. However, this procedure carries the
limitation that the correlation length of the noise component must be small
compared to that of the measured component. In this work we propose another
version of the direct estimation method that does not include this restriction.
This method makes it possible to handle large data sets from a wider range of
examples in an effective way. We discuss the abilities of the proposed
procedure using several synthetic examples.
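For orientation, the core of the direct estimation method is the computation of binwise conditional moments of the increments (the Kramers-Moyal coefficients). A minimal sketch on synthetic Ornstein-Uhlenbeck data with white noise, i.e. without the hidden correlated-noise component; all parameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
dt, n = 1e-3, 500_000
x = np.empty(n); x[0] = 0.0
for i in range(n - 1):  # Euler-Maruyama: dx = -x dt + dW (drift -x, diffusion 1/2)
    x[i + 1] = x[i] - x[i] * dt + np.sqrt(dt) * rng.standard_normal()

# direct estimation: conditional moments of the increments, binned in x
bins = np.linspace(-1.5, 1.5, 16)
idx = np.digitize(x[:-1], bins)
dx = np.diff(x)
for b in range(1, len(bins)):
    m = idx == b
    if m.sum() > 2000:
        center = 0.5 * (bins[b - 1] + bins[b])
        d1 = dx[m].mean() / dt               # drift:     D1(x) ~ <dx | x> / dt
        d2 = (dx[m] ** 2).mean() / (2 * dt)  # diffusion: D2(x) ~ <dx^2 | x> / (2 dt)
        print(f"x = {center:+.1f}: D1 = {d1:+.2f} (true {-center:+.2f}), D2 = {d2:.2f}")
```

The hidden-component versions by Lehle et al. and the one proposed here replace these simple conditional moments by a procedure that accounts for the correlated noise.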
Bayesian on-line anticipation of critical transitions
The design of reliable indicators to anticipate critical transitions in
complex systems is an important task in order to detect a coming sudden regime
shift and to take action to either prevent it or mitigate its
consequences. We present a data-driven method based on the estimation of a
parameterized nonlinear stochastic differential equation that allows for a
robust anticipation of critical transitions even in the presence of strong
noise levels such as those present in many real-world systems. Since the
parameter estimation is done by a Markov chain Monte Carlo approach, we have
access to credibility bands, allowing for a better interpretation of the
reliability of the results. By introducing a Bayesian linear segment fit, it is
possible to give an estimate for the time horizon in which the transition will
probably occur based on the current state of information. This approach is also
able to handle nonlinear time dependencies of the parameter controlling the
transition. In general, the method could be used as a tool for on-line analysis
to detect changes in the resilience of the system and to provide information on
the probability of the occurrence of a critical transition in the future.
Comment: 13 pages, 6 figures, 1 table
Efficient Bayesian estimation of the generalized Langevin equation from data
The generalized Langevin equation (GLE) overcomes the limiting Markov
approximation of the Langevin equation by incorporating a memory kernel and can
be used to model various stochastic processes in many fields of science,
ranging from climate modeling and neuroscience to finance. Generally, Bayesian
estimation facilitates the determination of both suitable model parameters and
their credibility for a measured time series in a straightforward way. In this
work we develop a realization of this estimation technique for the GLE in the
case of white noise. We assume piecewise constant drift and diffusion functions
and represent the characteristics of the data set by only a few coefficients,
which leads to a numerically efficient procedure. The kernel function is an
arbitrary time-discrete function of fixed length. We show how to determine a
reasonable value of this length based on the data. We illustrate the
abilities of both the method and the model by an example from turbulence.
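To make the forward model concrete, a minimal sketch of a time-discrete GLE with a finite-length kernel is given below; the kernel shape, its length K, and the linear drift are illustrative stand-ins (the paper assumes piecewise constant drift and diffusion), and the Bayesian estimation itself is omitted:

```python
import numpy as np

rng = np.random.default_rng(3)
dt, n, K = 0.01, 20_000, 20
kernel = 0.5 * np.exp(-0.3 * np.arange(K))  # illustrative, exponentially decaying memory
v = np.zeros(n)
for i in range(n - 1):
    k = min(i, K)
    # memory term: the last k states weighted by the time-discrete kernel
    mem = np.dot(kernel[:k], v[i - 1::-1][:k]) * dt if k else 0.0
    v[i + 1] = v[i] + (-v[i] - mem) * dt + np.sqrt(dt) * rng.standard_normal()
print("stationary variance:", v[n // 2:].var().round(3))
```

The estimation task is then to infer the drift and diffusion coefficients together with the K kernel values from an observed series; the choice of the kernel length trades bias against the number of parameters, which is what the data-driven selection above addresses.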
Anticipation of Oligocene's climate heartbeat by simplified eigenvalue estimation
The Eocene-Oligocene transition marks a watershed point of Earth's climate
history. The climate shifted from a greenhouse state to an icehouse state in
which Antarctica glaciated for the first time and periodic dynamics arose which
are still relevant for our current climate. We analyse a CaCO3 concentration
time series which covers the Eocene-Oligocene transition and which is obtained
from a Pacific sediment core at site DSDP1218. To this end, we introduce a
simplified autoregression-based variant of the dominant eigenvalue (DEV)
estimation procedure. The DEV works as a leading indicator of bifurcation-induced
transitions and enables us to identify the bifurcation type. We confirm its
reliability in a methodological study and demonstrate the crucial importance of
proper detrending to obtain unbiased results. As a proof of principle, we also
discuss possible pathways to estimate the stability of limit cycles based on
the DEV and the alternative drift slope. Finally, we present
the DEV analysis results of the CaCO3 concentration time series, which are
reproducible in a wide parameter range. Our findings demonstrate that the onset
of the Oligocene's periodic dynamics might be announced by a Neimark-Sacker/Hopf
bifurcation in the course of the Eocene-Oligocene transition 34 mya. (We follow
convention and use "mya" for "million years ago" and "Ma" for "million years"
throughout the article.)
Comment: 14 pages, 6 figures. Appendix included with 14 pages, 13 figures and
2 tables. Total pages: 31. Data and code available online.
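One way to read an autoregression-based DEV estimator (the details here are assumptions, not the paper's exact procedure): fit an AR(p) model by least squares to the detrended series and take the largest-modulus eigenvalue of the companion matrix. A modulus drifting towards one warns of a transition, and a nonzero imaginary part points to an oscillatory (Neimark-Sacker/Hopf-type) bifurcation:

```python
import numpy as np

def dominant_eigenvalue(x, p=5):
    """Least-squares AR(p) fit; return the dominant companion-matrix eigenvalue."""
    X = np.column_stack([x[p - 1 - j : len(x) - 1 - j] for j in range(p)])
    a, *_ = np.linalg.lstsq(X, x[p:], rcond=None)
    C = np.zeros((p, p))
    C[0], C[1:, :-1] = a, np.eye(p - 1)  # companion matrix of the AR polynomial
    ev = np.linalg.eigvals(C)
    return ev[np.argmax(np.abs(ev))]

rng = np.random.default_rng(4)
x = np.zeros(3_000)  # toy AR(2) data with complex roots 0.9 +/- 0.304i (modulus 0.95)
for i in range(2, 3_000):
    x[i] = 1.8 * x[i - 1] - 0.9025 * x[i - 2] + rng.standard_normal()
lam = dominant_eigenvalue(x[500:])
print("DEV modulus %.3f, oscillatory: %s" % (abs(lam), abs(lam.imag) > 1e-3))
```

As the abstract stresses, improper detrending biases the fitted AR coefficients and hence the DEV, so the trend should be removed before the fit.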
Quantifying Tipping Risks in Power Grids and beyond
Critical transitions, ubiquitous in nature and technology, necessitate
anticipation to avert adverse outcomes. While many studies focus on
bifurcation-induced tipping, where a control parameter change leads to
destabilization, alternative scenarios are conceivable, e.g. noise-induced
tipping by an increasing noise level in a multi-stable system. Although the
generating mechanisms can be different, the observed time series can exhibit
similar characteristics. Therefore, we propose a Bayesian Langevin approach,
implemented in an open-source tool, which is capable of quantifying both
deterministic and intrinsic stochastic dynamics simultaneously. After a
detailed proof of concept, we analyse two bus voltage frequency time series of
the historic North America Western Interconnection blackout on 10th August
1996. Our results unveil the intricate interplay of changing resilience and
noise influence. A comparison with the blackout's timeline supports our
Langevin model of the frequency dynamics, with the Bayesian Langevin (BL)
estimation indicating a permanent grid state change already two minutes before
the officially defined
triggering event. A tree-related high impedance fault or sudden load increases
may serve as earlier triggers during this event, as suggested by our findings.
This study underscores the importance of distinguishing destabilizing factors
for a reliable anticipation of critical transitions, offering a tool for better
understanding such events across various disciplines.
Comment: In total: 20 pages, 6 figures. Supplementary material, data and code
available online on GitHub. Enable cross-referencing between the main article
and the supplement in the same folder by renaming them to
Quantifying_Tipping_Risks.pdf and SI_Quantifying_Tipping_Risks.pdf,
respectively.
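To convey the idea behind separating the two destabilizing factors (simple moment estimates stand in for the Bayesian Langevin estimation here), one can track a drift slope, quantifying deterministic resilience, and a noise level in rolling windows. In the illustrative scenario below, the noise level rises while the drift stays fixed, the signature of noise-induced rather than bifurcation-induced tipping:

```python
import numpy as np

def resilience_and_noise(x, dt):
    """Moment-based stand-ins for drift-slope and noise-level estimates."""
    dx = np.diff(x)
    slope = np.polyfit(x[:-1], dx / dt, 1)[0]  # slope of the drift at the fixed point
    noise = np.sqrt(np.mean(dx ** 2) / dt)     # constant-diffusion noise amplitude
    return slope, noise

rng = np.random.default_rng(5)
dt, n = 0.01, 60_000
sigma = np.linspace(0.2, 0.8, n)  # rising noise level, fixed drift -x
x = np.zeros(n)
for i in range(n - 1):
    x[i + 1] = x[i] - x[i] * dt + sigma[i] * np.sqrt(dt) * rng.standard_normal()

for s in range(0, n - 10_000 + 1, 25_000):
    w = x[s : s + 10_000]
    print("window at %5d: drift slope %+.2f, noise level %.2f"
          % (s, *resilience_and_noise(w, dt)))
```

A drift slope approaching zero would instead indicate a loss of deterministic resilience; observing both quantities simultaneously is what allows the two tipping mechanisms to be distinguished.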
Identifying Dominant Industrial Sectors in Market States of the S&P 500 Financial Data
Understanding and forecasting changing market conditions in complex economic
systems like the financial market is of great importance to various
stakeholders such as financial institutions and regulatory agencies. Based on
the finding that the dynamics of sector correlation matrices of the S&P 500
stock market can be described by a sequence of distinct states via a clustering
algorithm, we try to identify the industrial sectors dominating the correlation
structure of each state. For this purpose, we use a method from Explainable
Artificial Intelligence (XAI) on daily S&P 500 stock market data from 1992 to
2012 to assign relevance scores to every feature of each data point. To compare
the significance of the features for the entire data set, we develop an
aggregation procedure and apply a Bayesian change point analysis to identify
the most significant sector correlations. We show that the correlation matrix
of each state is dominated by only a few sector correlations. In particular,
the energy and IT sectors are identified as key factors in determining the
state of the economy. Additionally, we show that a reduced surrogate model,
using only the eight sector correlations with the highest XAI relevance, can
replicate 90% of the cluster assignments. In general, our findings imply that
an additional dimension reduction of the dynamics of the financial market is
possible.
Comment: 18 pages and additional appendix
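The state-detection step that this analysis builds on (not the XAI attribution itself) can be sketched as follows; the data and every parameter are synthetic stand-ins for the S&P 500 sector data:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(6)
# hypothetical sector returns: 8 sectors, 2000 days, regime switch at day 1000
returns = rng.standard_normal((2000, 8))
returns[1000:, :4] += 0.8 * rng.standard_normal((1000, 1))  # correlate sectors 0-3

window, step, feats = 60, 20, []
iu = np.triu_indices(8, k=1)  # the 28 pairwise sector correlations as features
for s in range(0, len(returns) - window, step):
    C = np.corrcoef(returns[s : s + window].T)
    feats.append(C[iu])
states = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(np.array(feats))
print(states)  # the cluster label should flip for windows past day ~1000
```

The XAI step then assigns, per window, a relevance score to each correlation feature, and the aggregation and change point analysis described above identify the sector correlations that dominate each state.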
Lagrangian Investigation of Two-Dimensional Decaying Turbulence
We present a numerical investigation of two-dimensional decaying turbulence
in the Lagrangian framework. Focusing on single particle statistics, we
investigate Lagrangian trajectories in a freely evolving turbulent velocity
field. The dynamical evolution of the tracer particles is strongly dominated by
the emergence and evolution of coherent structures. For a statistical analysis,
we focus on the Lagrangian acceleration as a central quantity. For more
geometrical aspects, we investigate the curvature along the trajectories. We
find strong signatures of self-similar, universal behavior.
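For reference, both central quantities can be computed from tracer positions by finite differences; the acceleration is the second time derivative along the trajectory, and the curvature follows from the velocity-acceleration cross product. The circular-trajectory check is purely illustrative:

```python
import numpy as np

def lagrangian_stats(pos, dt):
    """Acceleration and curvature along a 2D tracer trajectory (finite differences)."""
    v = np.gradient(pos, dt, axis=0)  # Lagrangian velocity
    a = np.gradient(v, dt, axis=0)    # Lagrangian acceleration
    speed = np.linalg.norm(v, axis=1)
    curvature = np.abs(v[:, 0] * a[:, 1] - v[:, 1] * a[:, 0]) / speed ** 3
    return a, curvature

# sanity check: circular motion of radius R = 2 has constant curvature 1/R = 0.5
t = np.linspace(0, 2 * np.pi, 1_000)
pos = 2.0 * np.column_stack([np.cos(t), np.sin(t)])
_, kappa = lagrangian_stats(pos, t[1] - t[0])
print("mean curvature:", kappa[5:-5].mean().round(3))  # ~0.5
```

In the turbulence setting, pos would hold an integrated tracer trajectory, and the statistics of the acceleration and curvature are collected over many such trajectories.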
Memory Effects, Multiple Time Scales and Local Stability in Langevin Models of the S&P500 Market Correlation
The analysis of market correlations is crucial for optimal portfolio
selection of correlated assets, but their memory effects have often been
neglected. In this work, we analyse the mean market correlation of the S&P500
which corresponds to the main market mode in principal component analysis. We
fit a generalised Langevin equation (GLE) to the data; its memory kernel
implies that there is a significant memory effect in the market correlation
reaching back at least three trading weeks. The memory kernel improves the
forecasting accuracy of the GLE compared to models without memory; hence, such
a memory effect has to be taken into account for optimal portfolio selection
to minimise risk or for predicting future correlations. Moreover, a
Bayesian resilience estimation provides further evidence for non-Markovianity
in the data and suggests the existence of a hidden time scale that is much
slower than that of the observed daily market data. Assuming that such a slow
time scale exists, our work supports previous research on the existence of
locally stable market states.
Comment: 15 pages (excluding references and appendix)
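The forecasting claim can be illustrated on a toy series with a hidden slow component (an assumed stand-in, not the market data): a Markovian one-lag predictor is compared with a multi-lag predictor that can exploit the memory:

```python
import numpy as np

rng = np.random.default_rng(7)
n, x, h = 20_000, np.zeros(20_000), 0.0
for i in range(n - 1):       # observable x driven by a hidden slow component h
    h = 0.95 * h + rng.standard_normal()
    x[i + 1] = 0.5 * x[i] + h

def forecast_mse(p):         # one-step-ahead error of an AR(p) predictor
    X = np.column_stack([x[p - 1 - j : n - 1 - j] for j in range(p)])
    a, *_ = np.linalg.lstsq(X[:15_000], x[p:][:15_000], rcond=None)  # train
    return np.mean((x[p:][15_000:] - X[15_000:] @ a) ** 2)           # test

print("Markovian AR(1) MSE: %.2f, with memory AR(10) MSE: %.2f"
      % (forecast_mse(1), forecast_mse(10)))
```

The GLE plays the role of the multi-lag predictor here: its kernel weights the past, and a kernel reaching back three trading weeks corresponds to roughly fifteen daily lags.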