    Low-Temperature Sintered (ZnMg)2SiO4 Microwave Ceramics with TiO2 Addition and Calcium Borosilicate Glass

    Low-temperature sintered (ZnMg)2SiO4–TiO2 microwave ceramics using CaO–B2O3–SiO2 (CBS) glass as a sintering aid have been developed. The microwave properties of (Zn1-xMgx)2SiO4 base materials prepared via the sol-gel method were highly dependent on the Mg substitution content. The effects of the CBS and TiO2 additives on the crystal phases, microstructures, and microwave characteristics of (ZnMg)2SiO4 (ZMS) ceramics were further investigated. The results indicated that CBS glass effectively lowered the firing temperature of ZMS dielectrics from 1170 to 950°C through a liquid-phase effect, and significantly improved the sintering behavior and microwave properties of the ZMS ceramics. Moreover, ZMS–TiO2 ceramics showed a biphasic structure, and abnormal grain growth was suppressed by the pinning effect of the second-phase TiO2. A proper amount of TiO2 could tune the large negative temperature coefficient of resonant frequency (τf) of the ZMS system to a near-zero value. (Zn0.8Mg0.2)2SiO4 codoped with 10 wt.% TiO2 and 3 wt.% CBS and sintered at 950°C exhibited a dense microstructure and excellent microwave properties: εr = 9.5, Q·f = 16 600 GHz and τf = −9.6 ppm/°C.
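    The τf compensation described above is commonly estimated with a linear (volumetric) mixing rule; as a hedged illustration only, with typical literature values that are not taken from this abstract:

    ```latex
    % Hedged sketch: linear mixing rule for the composite temperature
    % coefficient of resonant frequency (values below are assumptions).
    \tau_f^{\mathrm{comp}} \approx v_1\,\tau_{f,1} + v_2\,\tau_{f,2}
    % e.g. assuming tau_f(ZMS) ~ -60 ppm/°C and tau_f(rutile TiO2) ~ +450 ppm/°C,
    % a TiO2 volume fraction of roughly 0.10-0.12 would shift tau_f toward zero,
    % qualitatively consistent with the near-zero value reported here.
    ```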

    Rethinking Batch Sample Relationships for Data Representation: A Batch-Graph Transformer based Approach

    Exploring sample relationships within each mini-batch has shown great potential for learning image representations. Existing works generally adopt the regular Transformer to model visual content relationships, ignoring the cues of semantic/label correlations between samples. They also generally adopt the "full" self-attention mechanism, which is redundant and sensitive to noisy samples. To overcome these issues, in this paper we design a simple yet flexible Batch-Graph Transformer (BGFormer) for mini-batch sample representations that deeply captures the relationships among image samples from both visual and semantic perspectives. BGFormer has three main aspects. (1) It employs a flexible graph model, termed Batch Graph, to jointly encode the visual and semantic relationships of samples within each mini-batch. (2) It explores the neighborhood relationships of samples by borrowing the idea of sparse graph representation, and thus performs robustly with respect to noisy samples. (3) It devises a novel Transformer architecture that mainly adopts dual structure-constrained self-attention (SSA), together with graph normalization, an FFN, etc., to carefully exploit the batch-graph information for sample token (node) representations. As an application, we apply BGFormer to metric learning tasks. Extensive experiments on four popular datasets demonstrate the effectiveness of the proposed model.
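    The batch-graph idea can be sketched in a few lines: restrict each sample's attention to its visual nearest neighbors, unioned with same-label samples. This is a simplified illustration of the general technique, not the paper's exact BGFormer architecture; the function name and parameters are hypothetical.

    ```python
    import numpy as np

    def batch_graph_attention(X, labels=None, k=2):
        """Sparse batch-graph self-attention over a mini-batch (sketch).
        X: (B, D) sample features; labels: optional (B,) semantic labels.
        Attention is masked to each sample's k nearest visual neighbors,
        optionally unioned with samples sharing the same label."""
        B, D = X.shape
        # Visual affinity: cosine similarity between samples.
        Xn = X / np.linalg.norm(X, axis=1, keepdims=True)
        sim = Xn @ Xn.T
        # Sparse visual graph: keep self + top-k neighbors per row.
        adj = np.zeros((B, B), dtype=bool)
        for i in range(B):
            nbrs = np.argsort(-sim[i])[: k + 1]  # includes i itself
            adj[i, nbrs] = True
        # Semantic graph: connect samples sharing a label.
        if labels is not None:
            adj |= labels[:, None] == labels[None, :]
        # Masked softmax attention restricted to the batch graph.
        scores = (X @ X.T) / np.sqrt(D)
        scores[~adj] = -np.inf
        weights = np.exp(scores - scores.max(axis=1, keepdims=True))
        weights /= weights.sum(axis=1, keepdims=True)
        return weights @ X  # aggregated node representations

    # Toy usage: 4 samples, 2 classes.
    X = np.array([[1.0, 0.0], [0.9, 0.1], [0.0, 1.0], [0.1, 0.9]])
    y = np.array([0, 0, 1, 1])
    out = batch_graph_attention(X, y, k=1)
    ```

    The sparse mask is what makes the scheme robust to noisy samples: a noisy sample can only influence its few graph neighbors rather than the whole batch.
    
    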

    Robust Transductive Few-shot Learning via Joint Message Passing and Prototype-based Soft-label Propagation

    Few-shot learning (FSL) aims to develop a learning model with the ability to generalize to new classes using a few support samples. For transductive FSL tasks, prototype learning and label propagation methods are commonly employed. Prototype methods generally first learn representative prototypes from the support set and then determine the labels of queries based on the metric between query samples and the prototypes. Label propagation methods try to propagate the labels of the support samples over a constructed graph encoding the relationships between support and query samples. This paper aims to integrate these two principles and develop an efficient and robust transductive FSL approach, termed Prototype-based Soft-label Propagation (PSLP). Specifically, we first estimate the soft-label representation for each query sample by leveraging prototypes. Then, we conduct soft-label propagation on our learned query-support graph. Both steps are conducted progressively to boost their respective performance. Moreover, to learn effective prototypes for soft-label estimation as well as a desirable query-support graph for soft-label propagation, we design a new joint message passing scheme to learn sample representations and the relational graph jointly. Our PSLP method is parameter-free and can be implemented very efficiently. On four popular datasets, our method achieves competitive results in both balanced and imbalanced settings compared to state-of-the-art methods. The code will be released upon acceptance.
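    The two principles being combined, prototype-based soft labels followed by graph propagation, can be sketched as below. This is a hedged illustration of the general recipe under standard label-propagation assumptions, not the paper's exact formulation; all names and hyperparameters are illustrative.

    ```python
    import numpy as np

    def pslp_sketch(support_x, support_y, query_x, n_cls, alpha=0.5, iters=10):
        """Prototype-based soft-label propagation (illustrative sketch).
        1) class prototypes from the support set;
        2) initial query soft labels from distances to prototypes;
        3) label propagation on a similarity graph over all samples."""
        # Step 1: prototype = mean support embedding per class.
        protos = np.stack([support_x[support_y == c].mean(0) for c in range(n_cls)])
        # Step 2: soft labels via softmax over negative squared distances.
        d = ((query_x[:, None, :] - protos[None, :, :]) ** 2).sum(-1)
        q_soft = np.exp(-d)
        q_soft /= q_soft.sum(1, keepdims=True)
        # Support labels stay one-hot.
        s_hot = np.eye(n_cls)[support_y]
        F0 = np.vstack([s_hot, q_soft])
        # Step 3: row-normalized Gaussian affinity graph over all samples.
        Z = np.vstack([support_x, query_x])
        W = np.exp(-((Z[:, None, :] - Z[None, :, :]) ** 2).sum(-1))
        np.fill_diagonal(W, 0.0)
        S = W / W.sum(1, keepdims=True)
        # Iterative propagation: F <- alpha * S @ F + (1 - alpha) * F0.
        F = F0.copy()
        for _ in range(iters):
            F = alpha * (S @ F) + (1 - alpha) * F0
        return F[len(support_x):].argmax(1)  # predicted query labels
    ```

    Keeping the anchor term (1 − α)·F0 in every iteration is what lets the prototype-derived soft labels and the graph structure reinforce each other instead of the graph smoothing everything away.
    
    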