3,888 research outputs found

    Stability Analysis of Systems With Time-Varying Delay via Improved Lyapunov-Krasovskii Functionals


    A mathematical framework for critical transitions: normal forms, variance and applications

    Critical transitions occur in a wide variety of applications including mathematical biology, climate change, human physiology and economics, so it is highly desirable to find early-warning signs. We show that critical transitions can be classified using bifurcation theory and normal forms in the singular limit. Based on this elementary classification, we analyze stochastic fluctuations and calculate scaling laws for the variance of stochastic sample paths near critical transitions, for fast-subsystem bifurcations up to codimension two. The theory is applied to several models: the Stommel-Cessi box model for the thermohaline circulation from geoscience, an epidemic-spreading model on an adaptive network, an activator-inhibitor switch from systems biology, a predator-prey system from ecology, and the Euler buckling problem from classical mechanics. For the Stommel-Cessi model we compare different detrending techniques for calculating early-warning signs. In the epidemic model we show that link densities could be better predictive variables than population densities. The activator-inhibitor switch demonstrates effects in three-time-scale systems and shows that excitable cells and molecular units carry information usable for subthreshold prediction. In the predator-prey model we investigate explosive population growth near a codimension-two bifurcation and show that early-warning signs derived from normal forms can be misleading in this context. In the biomechanical model we demonstrate that early-warning signs for buckling depend crucially on the control strategy near the instability, which illustrates the effect of multiplicative noise. Comment: minor corrections to previous version
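
    To make the variance scaling concrete, here is the standard calculation for the simplest case, the fold (saddle-node) normal form with additive noise; this is a minimal sketch of the general idea, with normalization constants that are illustrative rather than quoted from the paper.

```latex
% Fast subsystem at a fold: stable equilibrium x^*(\mu) = \sqrt{\mu}
% for \mu > 0, bifurcating at \mu = 0:
dx = (\mu - x^2)\,dt + \sigma\,dW_t
% The linearization \xi = x - x^* is an Ornstein--Uhlenbeck process
d\xi = -2\sqrt{\mu}\,\xi\,dt + \sigma\,dW_t
% whose stationary variance diverges as the transition is approached:
\operatorname{Var}(\xi) = \frac{\sigma^2}{4\sqrt{\mu}} \sim \mu^{-1/2}
\quad (\mu \to 0^+).
```

    This divergence of the variance along the attracting branch is the prototypical early-warning sign; the paper derives analogous scaling laws for the other fast-subsystem bifurcations up to codimension two.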

    Compressive Sensing Using Iterative Hard Thresholding with Low Precision Data Representation: Theory and Applications

    Modern scientific instruments produce vast amounts of data, which can overwhelm the processing ability of computer systems. Lossy compression of the data is an intriguing solution, but it comes with its own drawbacks, such as potential signal loss and the need to carefully optimize the compression ratio. In this work, we focus on a setting where this problem is especially acute: compressive sensing frameworks for interferometry and medical imaging. We ask the following question: can the precision of the data representation be lowered for all inputs, with recovery guarantees and practical performance? Our first contribution is a theoretical analysis of the normalized Iterative Hard Thresholding (IHT) algorithm when all input data, meaning both the measurement matrix and the observation vector, are quantized aggressively. We present a variant of low-precision normalized IHT that, under mild conditions, can still provide recovery guarantees. The second contribution is the application of our quantization framework to radio astronomy and magnetic resonance imaging. We show that lowering the precision of the data can significantly accelerate image recovery. We evaluate our approach on telescope data and samples of brain images using CPU and FPGA implementations, achieving up to a 9x speed-up with negligible loss of recovery quality. Comment: 19 pages, 5 figures, 1 table; in IEEE Transactions on Signal Processing
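
    To illustrate the kind of algorithm being analyzed, below is a minimal sketch of normalized IHT with both the measurement matrix and the observations pushed through a low-precision quantizer before recovery. The uniform quantizer, the support heuristic for the step size, and all names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def quantize(v, bits=4):
    """Uniform symmetric quantizer (an assumed scheme, not the paper's)."""
    scale = np.max(np.abs(v))
    if scale == 0:
        return v
    levels = 2 ** (bits - 1) - 1
    return np.round(v / scale * levels) / levels * scale

def hard_threshold(x, k):
    """Keep the k largest-magnitude entries of x, zero out the rest."""
    out = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-k:]
    out[idx] = x[idx]
    return out

def normalized_iht(A, y, k, iters=200):
    """Normalized IHT: gradient step with adaptive step size, then threshold."""
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        g = A.T @ (y - A @ x)
        # Step size mu = ||g_S||^2 / ||A g_S||^2 with the gradient restricted
        # to a size-k support estimate (current support, or top-k of g at start).
        S = np.argsort(np.abs(x))[-k:] if x.any() else np.argsort(np.abs(g))[-k:]
        g_S = np.zeros_like(g)
        g_S[S] = g[S]
        denom = np.linalg.norm(A @ g_S) ** 2
        mu = np.linalg.norm(g_S) ** 2 / denom if denom > 0 else 1.0
        x = hard_threshold(x + mu * g, k)
    return x

# Demo: recover an 8-sparse signal from 100 quantized Gaussian measurements.
rng = np.random.default_rng(0)
m, n, k = 100, 256, 8
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
y = A @ x_true
x_hat = normalized_iht(quantize(A), quantize(y), k)
print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```

    The paper's theoretical contribution is to show that a variant of this scheme retains recovery guarantees under mild conditions, even with aggressive quantization of both A and y.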

    Survey on time-delay approach to networked control

    This paper provides a survey of the time-delay approach to networked control systems (NCSs). The survey begins with a brief summary of fundamental network-induced issues in NCSs and of the main approaches to the modelling of NCSs. In particular, a comprehensive introduction to the time-delay approach to sampled-data and networked control is provided. Then, recent results on the time-delay approach to event-triggered control are recalled. The survey highlights the time-delay approach developed for the modelling, analysis and synthesis of NCSs under communication constraints, with a particular focus on the Round-Robin, Try-Once-Discard and stochastic protocols. The time-delay approach allows communication delays to be larger than the sampling intervals in the presence of scheduling protocols. Moreover, some results on networked control of distributed parameter systems are surveyed. Finally, conclusions and some future research directions are briefly addressed.
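
    The modelling idea at the heart of the approach can be stated in one line for a linear plant under sampled-data state feedback (a standard formulation, sketched here for orientation rather than taken verbatim from the survey):

```latex
% Plant \dot{x} = A x + B u with control u(t) = K x(t_k)
% held constant over each sampling interval [t_k, t_{k+1}):
\dot{x}(t) = A x(t) + B K x(t_k)
           = A x(t) + B K x\bigl(t - \tau(t)\bigr),
\qquad \tau(t) = t - t_k \in [0,\, t_{k+1} - t_k).
```

    The sawtooth delay τ(t) turns the hybrid sampled-data system into a continuous-time system with bounded time-varying delay, whose stability can then be certified via Lyapunov-Krasovskii functionals; network-induced transmission delays and scheduling protocols enter as additional contributions to τ(t), which is how total delays exceeding the sampling interval can be handled.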

    16th Nordic Process Control Workshop: Preprints


    Divergence Measures

    Data science, information theory, probability theory, statistical learning and other related disciplines greatly benefit from non-negative measures of dissimilarity between pairs of probability measures. These are known as divergence measures, and exploring their mathematical foundations and diverse applications is of significant interest. The present Special Issue, entitled “Divergence Measures: Mathematical Foundations and Applications in Information-Theoretic and Statistical Problems”, includes eight original contributions and is focused on the study of the mathematical properties and applications of classical and generalized divergence measures from an information-theoretic perspective. It mainly deals with two key generalizations of the relative entropy: namely, the Rényi divergence and the important class of f-divergences. It is our hope that readers will find interest in this Special Issue, which will stimulate further research on the mathematical foundations and applications of divergence measures.
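
    For reference, the two generalizations of the relative entropy named above have the following standard definitions (stated here for discrete distributions, from the information-theory literature rather than quoted from the Special Issue):

```latex
% f-divergence, for convex f : (0,\infty) \to \mathbb{R} with f(1) = 0:
D_f(P \,\|\, Q) = \sum_{x} Q(x)\, f\!\left(\frac{P(x)}{Q(x)}\right),
% Renyi divergence of order \alpha \in (0,1) \cup (1,\infty):
D_\alpha(P \,\|\, Q) = \frac{1}{\alpha - 1}
  \log \sum_{x} P(x)^{\alpha}\, Q(x)^{1-\alpha}.
```

    Both recover the relative entropy as a special case: choosing f(t) = t log t gives the Kullback-Leibler divergence, and D_α → D_KL as α → 1.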