A Region-Dependent Gain Condition for Asymptotic Stability
A sufficient condition for the stability of a system resulting from the
interconnection of dynamical systems is given by the small gain theorem.
Roughly speaking, applying this theorem requires that the composition of the
gains be continuous, increasing, and upper bounded by the identity function.
In this work, an alternative sufficient condition is presented for the case in
which this criterion fails, either because continuity is lacking or because
the composed gain exceeds the identity function. More
precisely, the local (resp. non-local) asymptotic stability of the origin
(resp. global attractivity of a compact set) is ensured by a region-dependent
small gain condition. Under an additional condition that implies convergence of
solutions for almost all initial conditions in a suitable domain, the almost
global asymptotic stability of the origin is ensured. Two examples illustrate
and motivate this approach.
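As a concrete illustration of the classical criterion discussed above, the following sketch checks a small-gain condition numerically. The two gain functions are hypothetical linear gains chosen for illustration, not taken from the paper:

```python
import numpy as np

# Hypothetical class-K gain functions for two interconnected subsystems
# (illustrative choices, not from the paper).
gamma1 = lambda s: 0.5 * s
gamma2 = lambda s: 1.5 * s

# Classical small-gain test: the composition gamma1(gamma2(s)) must stay
# strictly below the identity for all s > 0 in the region of interest.
s = np.linspace(1e-6, 10.0, 1000)
composed = gamma1(gamma2(s))
small_gain_holds = bool(np.all(composed < s))
print(small_gain_holds)  # True: 0.75*s < s for all s > 0
```

When the composed gain exceeds the identity only on part of the state space, a check like this fails globally, which is exactly the situation the region-dependent condition is meant to address.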
NAIS-Net: Stable Deep Networks from Non-Autonomous Differential Equations
This paper introduces Non-Autonomous Input-Output Stable Network (NAIS-Net),
a very deep architecture where each stacked processing block is derived from a
time-invariant non-autonomous dynamical system. Non-autonomy is implemented by
skip connections from the block input to each of the unrolled processing stages
and allows stability to be enforced so that blocks can be unrolled adaptively
to a pattern-dependent processing depth. NAIS-Net induces non-trivial,
Lipschitz input-output maps, even for an infinite unroll length. We prove that
the network is globally asymptotically stable, so that for every initial
condition there is exactly one input-dependent equilibrium for tanh units,
and multiple stable equilibria for ReLU units. An efficient implementation that
enforces the stability under derived conditions for both fully-connected and
convolutional layers is also presented. Experimental results show how NAIS-Net
exhibits stability in practice, yielding a significant reduction in
generalization gap compared to ResNets.
Comment: NIPS 201
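A minimal numerical sketch of the unrolled block update described above, x_{k+1} = x_k + h·tanh(A x_k + B u + b), with the block input u re-injected at every stage via the skip connection. The matrices and step size here are illustrative assumptions (a hand-picked negative definite A), not the paper's derived stability parameterization:

```python
import numpy as np

# Illustrative (not the paper's) block parameters: A is chosen symmetric
# negative definite so the unrolled iteration contracts toward a fixed point.
A = np.array([[-1.0, 0.2],
              [0.2, -1.0]])
B = np.eye(2)
b = np.array([0.1, 0.0])
h = 0.5                        # step size of the unrolled update

u = np.array([0.5, -0.3])      # block input, fed to every unrolled stage
x = np.zeros(2)                # state of the unrolled block

for _ in range(200):           # unroll for a fixed number of stages
    x = x + h * np.tanh(A @ x + B @ u + b)

# At the input-dependent equilibrium, tanh(A x* + B u + b) = 0,
# i.e. x* = -A^{-1}(B u + b).
x_star = -np.linalg.solve(A, B @ u + b)
print(np.allclose(x, x_star))  # True: the state settles at the equilibrium
```

Adaptive, pattern-dependent depth would replace the fixed loop count with a convergence check on the update norm, which the enforced stability makes well-defined.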