
    Necessary and sufficient conditions for analysis and synthesis of Markov jump linear systems with incomplete transition descriptions

    This technical note is concerned with exploring a new approach to the analysis and synthesis of Markov jump linear systems with incomplete transition descriptions. In the study, not all elements of the transition rate matrices (TRMs) in the continuous-time domain, or of the transition probability matrices (TPMs) in the discrete-time domain, are assumed to be known. By fully exploiting the properties of the TRMs and TPMs, together with the convexity of the uncertain domains, necessary and sufficient criteria for stability and stabilization are obtained in both the continuous-time and discrete-time cases. Numerical examples are used to illustrate the results. © 2006 IEEE.
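
    In generic notation (a sketch of the type of criterion involved, not the note's exact statement): for the nominal discrete-time case x_{k+1} = A(r_k) x_k with TPM \Pi = [\pi_{ij}], stochastic stability is equivalent to the existence of matrices P_i \succ 0 satisfying the coupled Lyapunov inequalities

        \[
        A_i^{\mathsf T}\Big(\sum_{j=1}^{N}\pi_{ij}P_j\Big)A_i - P_i \prec 0, \qquad i = 1,\dots,N.
        \]

    When row i of \Pi is only partly known, split its indices into a known set K_i and an unknown set U_i, and let \pi_K^{(i)} = \sum_{j\in K_i}\pi_{ij}. The left-hand side above is affine in the unknown entries, which range over a simplex of total mass 1 - \pi_K^{(i)}, so the inequality holds for every admissible row if and only if it holds at the simplex vertices:

        \[
        A_i^{\mathsf T}\Big(\sum_{j\in K_i}\pi_{ij}P_j + \big(1-\pi_K^{(i)}\big)P_l\Big)A_i - P_i \prec 0
        \qquad \text{for every } l \in U_i.
        \]

    This is how convexity removes the unknown entries without introducing conservatism; the continuous-time TRM case is treated analogously.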

    Stability and Stabilization of Continuous-Time Markovian Jump Singular Systems with Partly Known Transition Probabilities

    This paper investigates the stability and stabilization of continuous-time Markovian jump singular systems with partial information on the transition probabilities. A new necessary and sufficient stability criterion is obtained for these systems. Furthermore, sufficient conditions for state feedback controller design are derived in terms of linear matrix inequalities. Finally, numerical examples are given to illustrate the effectiveness of the proposed methods.
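
    As background for the stability part (this is not the paper's method for singular systems, which must additionally enforce regularity and absence of impulses): for a non-singular continuous-time Markov jump linear system with fully known transition rates, mean-square stability can be checked by a spectral test on the second-moment dynamics, which is equivalent to feasibility of the usual coupled Lyapunov LMIs. A minimal sketch with made-up data:

        import numpy as np
        from scipy.linalg import block_diag

        # Hypothetical two-mode system dx/dt = A_i x with transition rate
        # matrix Lam (off-diagonal entries >= 0, each row sums to zero).
        A = [np.array([[-1.0, 0.5], [0.0, -2.0]]),
             np.array([[-0.5, 1.0], [-1.0, -1.5]])]
        Lam = np.array([[-0.6, 0.6],
                        [0.8, -0.8]])
        n = A[0].shape[0]

        # The second moments Q_i(t) = E[x(t) x(t)^T 1{r(t)=i}] obey a linear ODE;
        # the system is mean-square stable iff the stacked operator is Hurwitz.
        diag_part = block_diag(*[np.kron(np.eye(n), Ai) + np.kron(Ai, np.eye(n))
                                 for Ai in A])
        cal_A = diag_part + np.kron(Lam.T, np.eye(n * n))
        print("mean-square stable:", np.linalg.eigvals(cal_A).real.max() < 0)

    The LMI formulation used in the paper is needed precisely because this spectral test does not apply directly when the descriptor matrix is singular or when some transition probabilities are unknown.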

    On stabilization of bilinear uncertain time-delay stochastic systems with Markovian jumping parameters

    In this paper, we investigate the stochastic stabilization problem for a class of bilinear continuous time-delay uncertain systems with Markovian jumping parameters. Specifically, the stochastic bilinear jump system under study involves an unknown state time delay, parameter uncertainties, and unknown nonlinear deterministic disturbances. The jumping parameters considered here form a continuous-time, discrete-state homogeneous Markov process. The whole system may be regarded as a stochastic bilinear hybrid system that includes both time-evolving and event-driven mechanisms. Our attention is focused on the design of a robust state-feedback controller such that, for all admissible uncertainties as well as nonlinear disturbances, the closed-loop system is stochastically exponentially stable in the mean square, independent of the time delay. Sufficient conditions are established to guarantee the existence of the desired robust controllers; these are given in terms of the solutions to a set of either linear matrix inequalities (LMIs) or coupled quadratic matrix inequalities. The developed theory is illustrated by numerical simulation. © 2002 IEEE.
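
    For reference, the stability notion targeted here is the standard one (generic notation, not quoted from the paper, with \tau the delay and \phi the initial function): the closed loop is stochastically exponentially stable in the mean square if there exist constants \alpha, \beta > 0 such that

        \[
        \mathbb{E}\,\|x(t)\|^{2} \;\le\; \alpha\, e^{-\beta t}\, \sup_{-\tau \le s \le 0}\|\phi(s)\|^{2}, \qquad t \ge 0,
        \]

    for all admissible uncertainties, disturbances, initial modes, and initial functions; delay independence means that \alpha and \beta can be chosen without knowledge of \tau.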

    Stochastic stability and stabilization of discrete-time singular Markovian jump systems with partially unknown transition probabilities

    This paper considers the stochastic stability and stabilization of discrete-time singular Markovian jump systems with partially unknown transition probabilities. First, a set of necessary and sufficient conditions for stochastic stability is established in terms of LMIs. Then, sufficient conditions are derived, again via the LMI technique, for the design of a state feedback controller guaranteeing that the corresponding closed-loop system is regular, causal, and stochastically stable. Finally, some examples are provided to demonstrate the effectiveness of the proposed approaches.
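
    The two structural properties required alongside stability are the usual ones for discrete-time singular systems. In generic notation, for E x_{k+1} = A(r_k) x_k with E possibly singular, each mode pair (E, A_i) should satisfy

        \[
        \text{regularity: } \det(zE - A_i) \not\equiv 0,
        \qquad
        \text{causality: } \deg\!\big(\det(zE - A_i)\big) = \operatorname{rank} E,
        \]

    so that the difference equations in each mode have unique, non-anticipating solutions; the LMI conditions certify these properties together with stochastic stability.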

    Almost Sure Stabilization for Adaptive Controls of Regime-switching LQ Systems with a Hidden Markov Chain

    This work is devoted to the almost sure stabilization of adaptive control systems that involve an unknown Markov chain. The control system displays continuous dynamics represented by differential equations and discrete events given by a hidden Markov chain. Different from previous work on the stabilization of adaptively controlled systems with a hidden Markov chain, where average criteria were considered, this work focuses on almost sure (sample-path) stabilization of the underlying processes. Under simple conditions, it is shown that as long as the feedback controls have linear growth in the continuous component, the resulting process is regular. Moreover, by an appropriate choice of Lyapunov functions, it is shown that the adaptive system is stabilizable almost surely. As a by-product, it is also established that the controlled process is positive recurrent.
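
    To fix the terminology (generic definitions, not quoted from the paper): sample-path stabilization asks that almost every realization of the closed-loop state decay, in contrast to the average (moment) criteria of earlier work. A common way to state the distinction is

        \[
        \limsup_{t\to\infty}\frac{1}{t}\log\|x(t)\| < 0 \quad \text{a.s.}
        \qquad\text{versus}\qquad
        \lim_{t\to\infty}\mathbb{E}\,\|x(t)\|^{2} = 0,
        \]

    and the two notions are not equivalent in general, which is why the almost sure analysis here requires its own Lyapunov function construction.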