Efficient iterative method for solving the Dirac-Kohn-Sham density functional theory
We present for the first time an efficient iterative method to directly solve
the four-component Dirac-Kohn-Sham (DKS) density functional theory. Due to the
existence of the negative energy continuum in the DKS operator, the existing
iterative techniques for solving the Kohn-Sham systems cannot be efficiently
applied to solve the DKS systems. The key component of our method is a novel
filtering step (F) which acts as a preconditioner in the framework of the
locally optimal block preconditioned conjugate gradient (LOBPCG) method. The
resulting method, dubbed the LOBPCG-F method, is able to compute the desired
eigenvalues and eigenvectors in the positive energy band without computing any
state in the negative energy band. The LOBPCG-F method introduces mild extra
cost compared to the standard LOBPCG method and can be easily implemented. We
demonstrate our method in the pseudopotential framework with a planewave basis
set which naturally satisfies the kinetic balance prescription. Numerical
results for Pt, Au, TlF, and BiSe indicate that the
LOBPCG-F method is a robust and efficient method for investigating the
relativistic effects in systems containing heavy elements.
Comment: 31 pages, 5 figures
Cycle-Consistent Deep Generative Hashing for Cross-Modal Retrieval
In this paper, we propose a novel deep generative approach to cross-modal
retrieval to learn hash functions in the absence of paired training samples
through a cycle consistency loss. Our approach employs an adversarial
training scheme to learn a pair of hash functions that enable translation
between modalities while preserving the underlying semantic relationship. To
endow the hash codes of each input-output pair with semantics, a cycle
consistency loss is further imposed on top of the adversarial training to
strengthen the correlations between inputs and their corresponding outputs.
Our approach is generative: it learns hash functions such that the learned
hash codes maximally correlate each input-output correspondence, while also
regenerating the inputs so as to minimize the information loss. The
learning-to-hash embedding is thus performed
to jointly optimize the parameters of the hash functions across modalities as
well as the associated generative models. Extensive experiments on a variety of
large-scale cross-modal data sets demonstrate that our proposed method achieves
better retrieval results than the state of the art.
Comment: To appear in IEEE Trans. Image Processing. arXiv admin note: text
overlap with arXiv:1703.10593 by other authors
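The cycle-consistency idea the abstract describes can be sketched numerically: encode a modality into the hash space, translate back, and penalize the reconstruction error. The paper uses deep adversarially trained networks; the minimal sketch below substitutes plain linear maps with a tanh relaxation of the binary codes, and every weight name and dimension here is a hypothetical placeholder, not the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(0)
d_img, d_txt, n_bits = 64, 32, 16  # feature dims and hash-code length

# Hypothetical linear hash functions, one per modality (the paper's
# versions are deep networks; linear maps keep the sketch self-contained).
W_img2h = 0.1 * rng.standard_normal((d_img, n_bits))
W_txt2h = 0.1 * rng.standard_normal((d_txt, n_bits))
# Hypothetical decoders mapping hash codes back to each feature space.
W_h2img = 0.1 * rng.standard_normal((n_bits, d_img))
W_h2txt = 0.1 * rng.standard_normal((n_bits, d_txt))

def hash_codes(X, W):
    # Relaxed continuous codes in (-1, 1); np.sign would binarize them.
    return np.tanh(X @ W)

def cycle_loss(X, W_enc, W_dec):
    # Cycle consistency: encode into the hash space, decode back,
    # and penalize the mean squared reconstruction error.
    recon = hash_codes(X, W_enc) @ W_dec
    return np.mean((X - recon) ** 2)

X_img = rng.standard_normal((8, d_img))  # batch of image features
X_txt = rng.standard_normal((8, d_txt))  # batch of text features
loss = cycle_loss(X_img, W_img2h, W_h2img) + cycle_loss(X_txt, W_txt2h, W_h2txt)
print(float(loss))
```

In the full method this scalar would be added to the adversarial objectives and minimized jointly over the hash functions and the generative models; the sketch only shows how the cycle term ties each input to its regenerated counterpart.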