12,062 research outputs found

    A Perturbative Density Matrix Renormalization Group Algorithm for Large Active Spaces

    We describe a low-cost alternative to the standard variational DMRG (density matrix renormalization group) algorithm that is analogous to the combination of selected configuration interaction plus perturbation theory (SCI+PT). We denote the resulting method p-DMRG (perturbative DMRG) to distinguish it from standard variational DMRG. p-DMRG is expected to be useful for systems with very large active spaces, for which variational DMRG becomes too expensive. Similar to SCI+PT, in p-DMRG a zeroth-order wavefunction is first obtained by a standard DMRG calculation, but with a small bond dimension. The residual correlation is then recovered by a second-order perturbative treatment. We discuss the choice of partitioning for the perturbation theory, which is crucial for its accuracy and robustness. To circumvent the problem of a large bond dimension in the first-order wavefunction, we expand the first-order wavefunction as a sum of matrix product states (MPS), yielding substantial savings in computational cost and memory. We also propose extrapolation schemes to reduce the errors in the zeroth- and first-order wavefunctions. Numerical results for Cr2 with a (28e,76o) active space and 1,3-butadiene with a (22e,82o) active space reveal that p-DMRG provides ground-state energies of a quality similar to variational DMRG with very large bond dimensions, but at a significantly lower computational cost. This suggests that p-DMRG will be an efficient tool for benchmark studies in the future.
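    For orientation, the sketch below writes out the generic second-order structure behind such an SCI+PT-style correction: a zeroth-order state obtained from a small-bond-dimension DMRG calculation, a first-order wavefunction expanded as a sum of MPSs, and a second-order energy correction obtained from a Hylleraas-type functional. These are standard textbook forms under an assumed partitioning H = H0 + V; the particular partitioning that makes p-DMRG accurate and robust is the subject of the paper and is not reproduced here.

```latex
% Generic second-order structure (standard RSPT/Hylleraas forms; the specific
% p-DMRG partitioning into H_0 and V is an assumption left to the paper):
\begin{align*}
  H &= H_0 + V, \qquad Q = 1 - |\Psi_0\rangle\langle\Psi_0|,
      \qquad E_0 = \langle\Psi_0|\,H_0\,|\Psi_0\rangle,\\
  L[\Psi_1] &= \langle\Psi_1|\,(H_0 - E_0)\,|\Psi_1\rangle
      + 2\,\mathrm{Re}\,\langle\Psi_1|\,Q V\,|\Psi_0\rangle,
      \qquad |\Psi_1\rangle = \sum_k |\Phi_k\rangle \ \text{(sum of MPSs)},\\
  E &\approx \langle\Psi_0|\,H\,|\Psi_0\rangle + E^{(2)},
      \qquad E^{(2)} = \langle\Psi_0|\,V\,|\Psi_1\rangle .
\end{align*}
```

    Minimizing the functional L over the sum-of-MPS ansatz yields the first-order wavefunction, and the corrected energy adds the second-order term to the zeroth-order expectation value.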

    Wavelet Integrated CNNs for Noise-Robust Image Classification

    Convolutional Neural Networks (CNNs) are generally prone to noise interruptions, i.e., small image noise can cause drastic changes in the output. To suppress the effect of noise on the final prediction, we enhance CNNs by replacing max-pooling, strided convolution, and average pooling with the Discrete Wavelet Transform (DWT). We present general DWT and Inverse DWT (IDWT) layers applicable to various wavelets, such as Haar, Daubechies, and Cohen wavelets, and design wavelet-integrated CNNs (WaveCNets) using these layers for image classification. In WaveCNets, feature maps are decomposed into low-frequency and high-frequency components during down-sampling. The low-frequency component stores the main information, including the basic object structures, and is transmitted into the subsequent layers to extract robust high-level features. The high-frequency components, which contain most of the data noise, are dropped during inference to improve the noise robustness of the WaveCNets. Our experimental results on ImageNet and ImageNet-C (the noisy version of ImageNet) show that WaveCNets, the wavelet-integrated versions of VGG, ResNets, and DenseNet, achieve higher accuracy and better noise robustness than their vanilla versions. Comment: CVPR accepted paper.
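    As a concrete illustration of the DWT-based down-sampling described above, the sketch below (assumed helper code, not the authors' implementation) uses PyWavelets to decompose each channel of a feature map with a 2D DWT, keep the low-frequency (LL) sub-band as the down-sampled output, and drop the high-frequency sub-bands; the function name dwt_downsample and the (channels, H, W) NumPy layout are assumptions made for the example.

```python
# Minimal sketch of DWT-based down-sampling (assumed helper, not the WaveCNets code):
# each channel is decomposed with a 2D DWT; only the low-frequency (LL) sub-band is
# kept, dropping the high-frequency sub-bands that carry most of the noise.
import numpy as np
import pywt  # PyWavelets

def dwt_downsample(feature_map: np.ndarray, wavelet: str = "haar") -> np.ndarray:
    """Replace a stride-2 pooling step: (C, H, W) -> (C, H/2, W/2) via the LL sub-band."""
    low_freq = []
    for channel in feature_map:
        ll, (lh, hl, hh) = pywt.dwt2(channel, wavelet)  # LL kept, LH/HL/HH dropped
        low_freq.append(ll)
    return np.stack(low_freq)

# Example: a 64-channel 32x32 feature map becomes 64 x 16 x 16.
x = np.random.randn(64, 32, 32)
print(dwt_downsample(x).shape)  # (64, 16, 16)
```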

    SADIH: Semantic-Aware DIscrete Hashing

    Due to its low storage cost and fast query speed, hashing has been widely adopted for similarity search in large-scale multimedia retrieval applications. Supervised hashing, in particular, has recently received considerable research attention because it leverages label information to preserve the pairwise similarities of data points in the Hamming space. However, two crucial bottlenecks remain: 1) learning with full pairwise similarity preservation is computationally unaffordable and does not scale to big data; 2) the available category information of the data is not well explored to learn discriminative hash functions. To overcome these challenges, we propose a unified Semantic-Aware DIscrete Hashing (SADIH) framework, which aims to directly embed the transformed semantic information into the asymmetric similarity approximation and discriminative hashing function learning. Specifically, a semantic-aware latent embedding is introduced to asymmetrically preserve the full pairwise similarities while skillfully handling the cumbersome n × n pairwise similarity matrix. Meanwhile, a semantic-aware autoencoder is developed to jointly preserve the data structures in the discriminative latent semantic space and perform data reconstruction. Moreover, an efficient alternating optimization algorithm is proposed to solve the resulting discrete optimization problem. Extensive experimental results on multiple large-scale datasets demonstrate that SADIH clearly outperforms state-of-the-art baselines with the additional benefit of lower computational cost. Comment: Accepted by the Thirty-Third AAAI Conference on Artificial Intelligence (AAAI-19).
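    The point about handling the cumbersome n × n pairwise similarity matrix can be illustrated with a generic asymmetric factorization trick; the snippet below is a hedged sketch of that general idea, not the SADIH objective. The label matrix Y, binary codes B, real embedding V, the similarity S = Y Yᵀ, and the scale r are all assumptions made for the example.

```python
# Sketch of the generic asymmetric trick (not the SADIH loss): evaluate a loss
# defined over an n x n similarity matrix without ever materializing it.
import numpy as np

rng = np.random.default_rng(0)
n, c, r = 500, 10, 16
Y = rng.integers(0, 2, size=(n, c)).astype(float)  # assumed multi-label matrix
B = np.sign(rng.standard_normal((n, r)))            # binary hash codes in {-1, +1}
V = rng.standard_normal((n, r))                     # real-valued semantic embedding

# Naive evaluation: builds the n x n similarity matrix explicitly (O(n^2) memory).
S = Y @ Y.T                                          # assumed similarity definition
naive = np.linalg.norm(r * S - B @ V.T, "fro") ** 2

# Asymmetric evaluation: expand ||r*S - B V^T||_F^2 into traces of small matrices,
# so only c x c, r x c, and r x r products are ever formed.
term1 = r**2 * np.sum((Y.T @ Y) ** 2)                # tr((Y^T Y)^2)
term2 = -2 * r * np.trace((B.T @ Y) @ (Y.T @ V))     # cross term
term3 = np.trace((V.T @ V) @ (B.T @ B))              # tr(V^T V B^T B)
assert np.isclose(naive, term1 + term2 + term3)
```

    The three trace terms involve only c × c, r × c, and r × r products, so the n × n matrix is never stored.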