The long-time asymptotics of the derivative nonlinear Schrödinger equation with step-like initial value
Considered in this paper are the long-time asymptotics of solutions
to the derivative nonlinear Schrödinger (DNLS) equation with the step-like
initial value \begin{eqnarray} q(x,0)=q_{0}(x)=\begin{cases}
A_{1}e^{i\phi}e^{2iBx}, & x<0,\\
A_{2}e^{i\phi}e^{2iBx}, & x>0,
\end{cases}\nonumber \end{eqnarray} by means of the Deift-Zhou method. The
step-like initial value problem is described by a matrix Riemann-Hilbert problem. A
crucial ingredient of this paper is the introduction of a $g$-function mechanism
to resolve the problem that the entries of the jump matrix grow exponentially
as $t\to\infty$. It is shown that the leading-order term of the
asymptotic solution of the DNLS equation is expressed by the Theta function
associated with a Riemann surface of genus 3, and the subleading-order term is
expressed by parabolic cylinder and Airy functions.
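The role of the $g$-function can be sketched in generic notation (an illustrative sketch only; the symbols $\theta$, $g$, and $\sigma_{3}$ below are standard Deift-Zhou notation, not taken from the paper): the jump matrices of the Riemann-Hilbert problem carry factors $e^{2it\theta(k)}$ whose exponents acquire a nonzero real part for step-like data, and conjugating the unknown by a scalar exponential trades $\theta$ for $\theta-g$,
\begin{eqnarray}
N(k)=M(k)\,e^{\,itg(k)\sigma_{3}},\qquad
e^{2it\theta(k)}\longmapsto e^{2it(\theta(k)-g(k))}\ \text{on arcs where $g$ is analytic},\nonumber
\end{eqnarray}
with $g$ chosen so that the new exponents stay bounded on the jump contours as $t\to\infty$.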
Converting normal insulators into topological insulators via tuning orbital levels
Tuning the spin-orbit coupling strength via foreign-element doping and/or
modifying the bonding strength via strain engineering are the major routes to
convert normal insulators into topological insulators. Here we propose an
alternative strategy to realize a topological phase transition by tuning the
orbital level. Following this strategy, our first-principles calculations
demonstrate that a topological phase transition in the cubic perovskite-type
compounds CsGeBr$_3$ and CsSnBr$_3$ could be facilitated by carbon
substitutional doping. This topological phase transition predominantly
results from the lower orbital energy of the carbon dopant, which can pull down
the conduction bands and even induce band inversion. Beyond conventional
approaches, our strategy of tuning the orbital level may greatly expand the
range of topologically nontrivial materials.
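The band-inversion mechanism invoked here can be illustrated with a minimal two-band toy model (an assumed Python illustration; the Hamiltonian and the parameter eps_offset below are generic stand-ins, not the paper's first-principles results or CsGeBr$_3$/CsSnBr$_3$ parameters): lowering the on-site energy of the conduction-band orbital through the valence level flips the sign of the gap term at the zone center, which is the band inversion described above.

import numpy as np

# Toy 1D two-band model: H(k) = m(k)*sigma_z + sin(k)*sigma_x,
# with m(k) = eps_offset + 1 - cos(k).  The parameter eps_offset mimics the
# on-site energy of the conduction-band orbital relative to the valence level;
# pulling it below zero flips the sign of m(0), i.e. the bands invert at Gamma.
sigma_x = np.array([[0.0, 1.0], [1.0, 0.0]])
sigma_z = np.array([[1.0, 0.0], [0.0, -1.0]])

def bloch_hamiltonian(k, eps_offset):
    mass = eps_offset + 1.0 - np.cos(k)
    return mass * sigma_z + np.sin(k) * sigma_x

for eps_offset in (0.6, 0.2, -0.2, -0.6):   # progressively "pulling down" the orbital
    gap = np.diff(np.linalg.eigvalsh(bloch_hamiltonian(0.0, eps_offset)))[0]
    character = "normal" if eps_offset > 0 else "inverted"
    print(f"eps_offset = {eps_offset:+.1f}: gap at Gamma = {gap:.2f}, bands {character}")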
Discriminative Block-Diagonal Representation Learning for Image Recognition
Existing block-diagonal representation studies mainly focus on casting block-diagonal regularization on the training data, while little attention is dedicated to concurrently learning block-diagonal representations of both the training and the test data. In this paper, we propose a discriminative block-diagonal low-rank representation (BDLRR) method for recognition. In particular, BDLRR is formulated as a joint optimization problem that shrinks the unfavorable representation from off-block-diagonal elements and strengthens the compact block-diagonal representation under the semisupervised framework of low-rank representation (LRR). To this end, we first impose penalty constraints on the negative representation to eliminate the correlation between different classes, so that the incoherence criterion of the extra-class representation is boosted. Moreover, a constructed subspace model is developed to enhance the self-expressive power of the training samples and to build a representation bridge between the training and test samples, so that the coherence of the learned intraclass representation is consistently heightened. Finally, the resulting optimization problem is solved by employing an alternating optimization strategy, and a simple recognition algorithm on the learned representation is used for the final prediction. Extensive experimental results demonstrate that the proposed method achieves superb recognition results on four face image data sets, three character data sets, and the 15-category scene data set. It not only shows strong potential for image recognition but also outperforms the state-of-the-art methods.
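The off-block-diagonal shrinkage idea can be illustrated with a much-simplified sketch (a Python illustration under assumed simplifications: a ridge-regularized self-expressive model with a quadratic penalty on cross-class coefficients, not the paper's full BDLRR objective, its low-rank term, or its alternating solver): coefficients that link samples from different classes are penalized, which pushes the learned representation toward a block-diagonal structure.

import numpy as np

def block_diagonal_representation(X, labels, lam=10.0, ridge=1e-3):
    """Simplified illustration (not the BDLRR solver from the paper).
    For each column j, solve
        min_z ||x_j - X z||^2 + lam * ||D_j z||^2 + ridge * ||z||^2,
    where D_j selects coefficients on samples whose label differs from x_j's,
    so off-block-diagonal entries of the coefficient matrix Z are suppressed.
    X: (d, n) data with samples as columns; labels: length-n label array.
    """
    _, n = X.shape
    G = X.T @ X                              # shared (n, n) Gram matrix
    Z = np.zeros((n, n))
    for j in range(n):
        off_class = (labels != labels[j]).astype(float)
        A = G + lam * np.diag(off_class) + ridge * np.eye(n)
        Z[:, j] = np.linalg.solve(A, X.T @ X[:, j])
    return Z

# Tiny synthetic usage example: two well-separated classes of 10 samples each.
rng = np.random.default_rng(0)
X = np.hstack([rng.normal(1.0, 0.1, (5, 10)), rng.normal(-1.0, 0.1, (5, 10))])
labels = np.array([0] * 10 + [1] * 10)
Z = block_diagonal_representation(X, labels)
within = (np.abs(Z[:10, :10]).mean() + np.abs(Z[10:, 10:]).mean()) / 2
between = (np.abs(Z[:10, 10:]).mean() + np.abs(Z[10:, :10]).mean()) / 2
print(f"mean |Z| within blocks: {within:.3f}, between blocks: {between:.3f}")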