
    BTAN: Lightweight Super-Resolution Network with Target Transform and Attention

    In the realm of single-image super-resolution (SISR), generating high-resolution (HR) images from a low-resolution (LR) input remains a challenging task. While deep neural networks have shown promising results, they often require significant computational resources. To address this issue, we introduce a lightweight convolutional neural network, named BTAN, that leverages the connection between LR and HR images to enhance performance without increasing the number of parameters. Our approach includes a target transform module that adjusts output features to match the target distribution and improve reconstruction quality, as well as a spatial and channel-wise attention module that modulates feature maps based on visual attention at multiple layers. We demonstrate the effectiveness of our approach on four benchmark datasets, showcasing superior accuracy, efficiency, and visual quality when compared to state-of-the-art methods.    
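
    As a rough illustration of the attention mechanism described above, the following PyTorch sketch combines channel-wise and spatial gating on a feature map. The module name, layer choices, and hyperparameters are assumptions for illustration, not the authors' BTAN implementation.

```python
# Minimal sketch of combined channel and spatial attention for feature maps.
# Names and layer choices are illustrative, not the BTAN authors' code.
import torch
import torch.nn as nn


class ChannelSpatialAttention(nn.Module):
    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        # Channel attention: pool over space, then re-weight each channel.
        self.channel_gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1),
            nn.Sigmoid(),
        )
        # Spatial attention: a single-channel gate over the spatial grid.
        self.spatial_gate = nn.Sequential(
            nn.Conv2d(channels, 1, kernel_size=7, padding=3),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = x * self.channel_gate(x)   # modulate each channel
        x = x * self.spatial_gate(x)   # modulate each spatial location
        return x


if __name__ == "__main__":
    feat = torch.randn(1, 32, 48, 48)            # low-resolution feature map
    out = ChannelSpatialAttention(32)(feat)      # same shape, re-weighted
    print(out.shape)                             # torch.Size([1, 32, 48, 48])
```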

    Deep Lifelong Cross-modal Hashing

    Hashing methods have made significant progress in cross-modal retrieval tasks, offering fast query speed and low storage cost. Among them, deep learning-based hashing achieves better performance on large-scale data owing to its strong ability to extract and represent nonlinear heterogeneous features. However, two main challenges remain: catastrophic forgetting when data with new categories arrive continuously, and the time-consuming retraining that non-continuous hashing retrieval requires for updating. To this end, in this paper we propose a novel deep lifelong cross-modal hashing method that achieves lifelong hashing retrieval instead of repeatedly re-training the hash functions when new data arrive. Specifically, we design a lifelong learning strategy that updates the hash functions by training directly on the incremental data rather than retraining new hash functions on all the accumulated data, which significantly reduces training time. We then propose a lifelong hashing loss that allows the original hash codes to participate in lifelong learning while remaining invariant, and that further preserves the similarity and dissimilarity between original and incremental hash codes to maintain performance. Additionally, considering the distribution heterogeneity that arises when new data arrive continuously, we introduce a multi-label semantic similarity to supervise hash learning, and detailed analysis shows that this similarity improves performance. Experimental results on benchmark datasets show that the proposed method achieves performance comparable to recent state-of-the-art cross-modal hashing methods, yields average gains of over 20% in retrieval accuracy, and reduces training time by over 80% when new data arrive continuously.
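
    The sketch below illustrates, under stated assumptions, the core idea of training only on incremental data while the stored (original) hash codes stay fixed and supervision comes from multi-label semantic similarity. The loss form and all function names are hypothetical, not the paper's exact formulation.

```python
# Hypothetical sketch: incremental hash codes are trained against frozen
# original codes, supervised by multi-label semantic similarity.
import torch


def multilabel_similarity(labels_a: torch.Tensor, labels_b: torch.Tensor) -> torch.Tensor:
    """Soft similarity in [0, 1] from multi-label annotations (n_a x c, n_b x c)."""
    inter = labels_a @ labels_b.t()
    union = labels_a.sum(1, keepdim=True) + labels_b.sum(1) - inter
    return inter / union.clamp(min=1.0)


def lifelong_hash_loss(new_codes: torch.Tensor,
                       old_codes: torch.Tensor,
                       sim: torch.Tensor) -> torch.Tensor:
    """Align inner products of new (trainable) and old (frozen) codes with the
    semantic similarity; old codes are detached so they remain invariant."""
    k = new_codes.size(1)                             # hash code length
    inner = new_codes @ old_codes.detach().t() / k    # scaled to roughly [-1, 1]
    target = 2.0 * sim - 1.0                          # map similarity to [-1, 1]
    return ((inner - target) ** 2).mean()


if __name__ == "__main__":
    old = torch.sign(torch.randn(100, 64))                       # stored codes, kept fixed
    new = torch.tanh(torch.randn(16, 64, requires_grad=True))    # incremental batch
    sim = multilabel_similarity(torch.randint(0, 2, (16, 10)).float(),
                                torch.randint(0, 2, (100, 10)).float())
    print(lifelong_hash_loss(new, old, sim).item())
```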

    Research on the Machinability of Micro-Tapered Hole Group in Piezoelectric Atomizer and the Improvement Method

    A metal atomizing sheet with a group of micro-tapered holes is the core component of a piezoelectric atomizer. However, in industrial applications the diameters of the large ends and small ends of the micro-tapered holes deviate from the design values by 15.25% and 15.83%, respectively, which adversely affects atomizer performance. In this study, the two main factors influencing the machining quality of the tapered holes, external vibration disturbance and internal system errors inside the laser processor, were investigated; accordingly, a vibration model of the machining device and a laser drilling model were established. Based on the models and the experimental results, the diameter errors caused by these two factors were found to account for 20% and 67.87% of the total deviation, respectively. Finally, an improved method was proposed in which a damping system was added to the machining device and the diameter of the initial laser spot was corrected. Measurements of tapered holes machined by the improved method showed that the deviations of the large and small diameters from the design values declined to 4.85% and 4.83%, respectively. This study lays a foundation for high-precision, large-scale production of atomizing sheets and suggests a new research direction for enhancing atomizer performance.
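
    As a quick back-of-the-envelope check of the reported error budget and the improvement, the short calculation below uses only the percentages quoted in the abstract; the residual share and relative reductions are simple arithmetic on those figures, not values reported by the authors.

```python
# Error-budget arithmetic based solely on the percentages quoted in the abstract.
total_dev_large, total_dev_small = 15.25, 15.83   # % deviation before improvement
share_vibration, share_laser = 20.0, 67.87        # % of total deviation per cause

unexplained = 100.0 - share_vibration - share_laser
print(f"unattributed share of the deviation: {unexplained:.2f}%")   # 12.13%

improved_large, improved_small = 4.85, 4.83       # % deviation after improvement
print(f"large-end deviation reduced by {total_dev_large - improved_large:.2f} points "
      f"({100 * (1 - improved_large / total_dev_large):.1f}% relative)")
print(f"small-end deviation reduced by {total_dev_small - improved_small:.2f} points "
      f"({100 * (1 - improved_small / total_dev_small):.1f}% relative)")
```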