6,793 research outputs found
Global exponential stability for coupled systems of neutral delay differential equations
In this paper, a novel class of neutral delay differential equations (NDDEs) is presented. By using the Razumikhin method and Kirchhoff's matrix tree theorem from graph theory, the global exponential stability of such NDDEs is investigated. By constructing an appropriate Lyapunov function, two kinds of sufficient criteria ensuring the global exponential stability of the NDDEs are derived, stated in terms of Lyapunov functions and of the coefficients of the NDDEs, respectively. A numerical example is provided to demonstrate the effectiveness of the theoretical results.
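For orientation, the type of system and the stability notion involved can be sketched as follows (illustrative notation, not quoted from the paper):

```latex
\[
  \frac{d}{dt}\Bigl[x_i(t) - N_i\bigl(x_i(t-\tau)\bigr)\Bigr]
  = f_i\bigl(x_i(t),\, x_i(t-\tau)\bigr)
  + \sum_{j=1}^{n} a_{ij}\, g_{ij}\bigl(x_i(t),\, x_j(t-\tau)\bigr),
  \qquad i = 1,\dots,n,
\]
%% a coupled NDDE system on a weighted digraph with coupling matrix (a_{ij}).
%% Global exponential stability requires constants M, \lambda > 0 such that
\[
  \|x(t;\phi)\| \;\le\; M\,\|\phi\|\, e^{-\lambda t}, \qquad t \ge 0,
\]
%% for every admissible initial function \phi. The graph-theoretic step combines
%% vertex Lyapunov functions into a global one, V(x) = \sum_{i=1}^{n} c_i V_i(x_i),
%% with weights c_i supplied by Kirchhoff's matrix tree theorem for (a_{ij}).
```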
No spin-localization phase transition in the spin-boson model without local field
We explore the spin-boson model in a special case, i.e., with zero local
field. In contrast to previous studies, we find no possibility of a quantum
phase transition (QPT) between the localized and delocalized phases;
instead, the behavior of the model can be fully characterized by the even or
odd parity, as well as the parity breaking, of the ground state of the
system, rather than by a QPT. Our analytical treatment of the eigensolution
of the ground state presents, for the first time, a rigorous proof of
non-degeneracy of the ground state of the model, independent of the bath
type, the degrees of freedom of the bath, and the calculation precision. We
argue that the QPT reported previously arises from an unreasonable treatment
of the ground state of the model or of the infrared divergence present in the
spectral functions for Ohmic and sub-Ohmic dissipations.

Comment: 5 pages, 1 figure. Comments are welcome
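As background, the spin-boson Hamiltonian with the local field set to zero, and the parity operator that then commutes with it, can be written in standard notation (a sketch, not quoted from the paper):

```latex
\[
  H = \frac{\Delta}{2}\,\sigma_x
      + \sum_{k} \omega_k\, b_k^{\dagger} b_k
      + \frac{\sigma_z}{2} \sum_{k} \lambda_k \bigl(b_k^{\dagger} + b_k\bigr),
\]
%% the local-field term (\varepsilon/2)\sigma_z is absent (\varepsilon = 0).
%% In this case H commutes with the parity operator
\[
  P = \sigma_x \, e^{\,i\pi \sum_k b_k^{\dagger} b_k},
  \qquad [H, P] = 0, \qquad P^2 = \mathbb{1},
\]
%% so every eigenstate carries a definite even or odd parity, and the ground
%% state is characterized by this parity (or its breaking) rather than by a QPT.
```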
Nest-DGIL: Nesterov-optimized Deep Geometric Incremental Learning for CS Image Reconstruction
Proximal gradient-based optimization is one of the most common strategies for
solving image inverse problems and is easy to implement. However, these
techniques often generate heavy artifacts in image reconstruction. One of the
most popular refinement methods is to fine-tune the regularization parameter to
alleviate such artifacts, but it may not always be sufficient or applicable due
to increased computational costs. In this work, we propose a deep geometric
incremental learning framework based on second Nesterov proximal gradient
optimization. The proposed end-to-end network not only has a powerful
learning ability for high- and low-frequency image features, but can also
theoretically guarantee that geometric texture details are reconstructed
from the preliminary linear reconstruction. Furthermore, it avoids the risk
of intermediate reconstruction results falling outside the geometric
decomposition domains and achieves fast convergence. Our reconstruction framework is
decomposed into four modules including general linear reconstruction, cascade
geometric incremental restoration, Nesterov acceleration and post-processing.
In the image restoration step, a cascade geometric incremental learning
module is designed to compensate for the missing texture information from
different geometric spectral decomposition domains. Inspired by the overlap-tile strategy, we
also develop a post-processing module to remove the block-effect in
patch-wise-based natural image reconstruction. All parameters in the proposed
model are learnable, and an adaptive initialization technique for the
physical parameters is also employed to make the model flexible and ensure
smooth convergence. We
compare the reconstruction performance of the proposed method with existing
state-of-the-art methods to demonstrate its superiority. Our source codes are
available at https://github.com/fanxiaohong/Nest-DGIL.

Comment: 15 pages
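This is not the authors' Nest-DGIL network, but the classical baseline it builds on can be sketched: a Nesterov-accelerated proximal gradient (FISTA-style) solver for a compressed-sensing recovery problem, with soft-thresholding as the proximal operator. All names and parameters below are illustrative.

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def fista(A, y, lam, n_iter=500):
    """Nesterov-accelerated proximal gradient for
    min_x 0.5 * ||A x - y||^2 + lam * ||x||_1.
    A minimal baseline sketch, not the Nest-DGIL network."""
    L = np.linalg.norm(A, 2) ** 2           # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    z = x.copy()                            # extrapolated point
    t = 1.0
    for _ in range(n_iter):
        grad = A.T @ (A @ z - y)            # gradient of the smooth term at z
        x_new = soft_threshold(z - grad / L, lam / L)     # proximal step
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        z = x_new + ((t - 1.0) / t_new) * (x_new - x)     # Nesterov extrapolation
        x, t = x_new, t_new
    return x

# Tiny compressed-sensing demo: recover a sparse vector from few measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((60, 100)) / np.sqrt(60)   # random sensing matrix
x_true = np.zeros(100)
x_true[[5, 37, 80]] = [1.5, -2.0, 1.0]             # 3-sparse ground truth
y = A @ x_true
x_hat = fista(A, y, lam=0.01, n_iter=500)
```

The Nesterov extrapolation step is what upgrades the plain proximal gradient's O(1/k) convergence rate to O(1/k^2); the paper's framework replaces the fixed soft-thresholding prox with learned geometric incremental modules.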