1,226 research outputs found
On stabilization of bilinear uncertain time-delay stochastic systems with Markovian jumping parameters
Copyright [2002] IEEE. This material is posted here with permission of the IEEE. Such permission of the IEEE does not in any way imply IEEE endorsement of any of Brunel University's products or services. Internal or personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution must be obtained from the IEEE by writing to [email protected]. By choosing to view this document, you agree to all provisions of the copyright laws protecting it.

In this paper, we investigate the stochastic stabilization problem for a class of bilinear continuous time-delay uncertain systems with Markovian jumping parameters. Specifically, the stochastic bilinear jump system under study involves unknown state time-delays, parameter uncertainties, and unknown nonlinear deterministic disturbances. The jumping parameters considered here form a continuous-time discrete-state homogeneous Markov process. The whole system may be regarded as a stochastic bilinear hybrid system that includes both time-evolving and event-driven mechanisms. Our attention is focused on the design of a robust state-feedback controller such that, for all admissible uncertainties as well as nonlinear disturbances, the closed-loop system is stochastically exponentially stable in the mean square, independent of the time delay. Sufficient conditions are established to guarantee the existence of the desired robust controllers, given in terms of the solutions to a set of either linear matrix inequalities (LMIs) or coupled quadratic matrix inequalities. The developed theory is illustrated by numerical simulations.
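The mean-square stability notion used in this abstract can be checked numerically for a simple Markov jump linear system (with no delays, uncertainty, or bilinearity). A minimal sketch, with hypothetical mode matrices `A` and chain generator `Q`: the second-moment dynamics of such a system are linear, and mean-square stability is equivalent to the lifted operator below being Hurwitz.

```python
import numpy as np

# Hypothetical two-mode jump linear system dx/dt = A_{r(t)} x,
# where r(t) is a continuous-time Markov chain with generator Q.
A = [np.array([[-1.0, 0.5], [0.0, -2.0]]),
     np.array([[-1.0, 0.3], [0.1, -2.0]])]
Q = np.array([[-0.5, 0.5],
              [0.5, -0.5]])

n = 2          # state dimension
m = len(A)     # number of modes

# Second-moment dynamics for X_i = E[x x^T 1_{r=i}]:
#   d vec(X_i)/dt = (I (x) A_i + A_i (x) I) vec(X_i) + sum_j Q[j, i] vec(X_j).
# The jump system is mean-square exponentially stable iff the block
# matrix M assembled from these equations is Hurwitz.
M = np.zeros((m * n * n, m * n * n))
for i in range(m):
    S = np.kron(np.eye(n), A[i]) + np.kron(A[i], np.eye(n))
    for j in range(m):
        block = Q[j, i] * np.eye(n * n)
        if j == i:
            block = block + S
        M[i*n*n:(i+1)*n*n, j*n*n:(j+1)*n*n] = block

abscissa = max(np.linalg.eigvals(M).real)
print(abscissa < 0)  # True -> mean-square stable
```

The paper's setting is much richer (bilinearity, delays, uncertainty), where this eigenvalue test is replaced by the LMI/coupled matrix-inequality conditions the abstract describes.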
Almost Sure Stabilization for Adaptive Controls of Regime-switching LQ Systems with A Hidden Markov Chain
This work is devoted to the almost sure stabilization of adaptive control
systems that involve an unknown Markov chain. The control system displays
continuous dynamics represented by differential equations and discrete events
given by a hidden Markov chain. Different from previous work on stabilization
of adaptive controlled systems with a hidden Markov chain, where average
criteria were considered, this work focuses on the almost sure stabilization or
sample path stabilization of the underlying processes. Under simple conditions,
it is shown that as long as the feedback controls have linear growth in the
continuous component, the resulting process is regular. Moreover, by
appropriate choice of the Lyapunov functions, it is shown that the adaptive
system is stabilizable almost surely. As a by-product, it is also established
that the controlled process is positive recurrent.
Stabilizing Randomly Switched Systems
This article is concerned with stability analysis and stabilization of
randomly switched systems under a class of switching signals. The switching
signal is modeled as a jump stochastic (not necessarily Markovian) process
independent of the system state; it selects, at each instant of time, the
active subsystem from a family of systems. Sufficient conditions for stochastic
stability (almost sure, in the mean, and in probability) of the switched system
are established when the subsystems do not possess control inputs, and not
every subsystem is required to be stable. These conditions are employed to
design stabilizing feedback controllers when the subsystems are affine in
control. The analysis is carried out with the aid of multiple Lyapunov-like
functions, and the analysis results together with universal formulae for
feedback stabilization of nonlinear systems constitute our primary tools for
control design.
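The "universal formulae for feedback stabilization" mentioned above refer to constructions such as Sontag's formula, which turns a control Lyapunov function into an explicit stabilizing feedback. A minimal sketch for a hypothetical scalar control-affine system dx/dt = f(x) + g(x)u with CLF V(x) = x^2/2 (the system below is illustrative, not from the article):

```python
import math

# Hypothetical open-loop-unstable system: f(x) = x^3, g(x) = 1.
def f(x):
    return x ** 3

def g(x):
    return 1.0

def sontag(x):
    """Sontag's universal formula u = -(a + sqrt(a^2 + b^4)) / b."""
    a = x * f(x)   # L_f V = V'(x) f(x)
    b = x * g(x)   # L_g V = V'(x) g(x)
    if b == 0.0:
        return 0.0
    return -(a + math.sqrt(a * a + b ** 4)) / b

# Euler simulation: along the closed loop, dV/dt = -sqrt(a^2 + b^4) < 0
# away from the origin, so the state is driven toward zero.
x, dt = 1.0, 1e-3
for _ in range(10_000):
    x += dt * (f(x) + g(x) * sontag(x))
print(abs(x) < 1e-2)  # True
```

In the article's setting the same idea is applied per subsystem, with multiple Lyapunov-like functions handling the random switching.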
Estimation and control of non-linear and hybrid systems with applications to air-to-air guidance
Issued as Progress report, and Final report, Project no. E-21-67