
    Rumba: a Python framework for automating large-scale recursive internet experiments on GENI and FIRE+

    It is not easy to design and run Convolutional Neural Networks (CNNs) because: 1) given an architecture, finding the optimal number of filters (i.e., the width) at each layer is tricky; and 2) the computational intensity of CNNs impedes their deployment on computationally limited devices. Oracle Pruning removes unimportant filters from a well-trained CNN by ablating them in turn and re-evaluating the model; it delivers high accuracy but suffers from intolerable time complexity, and it requires the resulting width to be given rather than finding it automatically. To address these problems, we propose Approximated Oracle Filter Pruning (AOFP), which keeps searching for the least important filters in a binary-search manner, makes pruning attempts by randomly masking out filters, accumulates the resulting errors, and finetunes the model via a multi-path framework. As AOFP enables simultaneous pruning on multiple layers, we can prune an existing very deep CNN with acceptable time cost, negligible accuracy drop, and no heuristic knowledge, or re-design a model that achieves higher accuracy and faster inference.
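    A minimal sketch can make the oracle criterion above concrete: ablate each filter in turn, measure how much the loss rises on a held-out batch, and rank filters by that rise. The tiny network, the random stand-in data, and the score_filters helper below are illustrative assumptions written in PyTorch, not the authors' released AOFP code.

    import torch
    import torch.nn as nn

    class TinyCNN(nn.Module):
        def __init__(self, width=16, num_classes=10):
            super().__init__()
            self.conv = nn.Conv2d(3, width, kernel_size=3, padding=1)
            self.head = nn.Linear(width, num_classes)

        def forward(self, x):
            h = torch.relu(self.conv(x))
            h = h.mean(dim=(2, 3))                 # global average pooling
            return self.head(h)

    @torch.no_grad()
    def score_filters(model, x, y, loss_fn):
        """Oracle-style scoring: zero out each filter in turn and record the loss increase."""
        base = loss_fn(model(x), y).item()
        scores = []
        for i in range(model.conv.out_channels):
            w_saved = model.conv.weight[i].clone()
            b_saved = model.conv.bias[i].clone()
            model.conv.weight[i].zero_()           # mask out filter i ...
            model.conv.bias[i].zero_()             # ... including its bias
            scores.append(loss_fn(model(x), y).item() - base)
            model.conv.weight[i].copy_(w_saved)    # restore the filter
            model.conv.bias[i].copy_(b_saved)
        return scores                              # small increase => unimportant filter

    model = TinyCNN()
    x = torch.randn(8, 3, 32, 32)                  # stand-in for a held-out batch
    y = torch.randint(0, 10, (8,))
    scores = score_filters(model, x, y, nn.CrossEntropyLoss())
    print(sorted(range(len(scores)), key=scores.__getitem__)[:4])  # 4 least important filters

    AOFP's contribution, per the abstract, is to avoid this per-filter loop: it approximates the oracle by masking groups of filters, narrowing the candidates in a binary-search manner, and finetuning through a multi-path framework, which keeps the time cost acceptable for very deep networks.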

    Modality Adaption or Regularization? A Case Study on End-to-End Speech Translation

    Pre-training and fine-tuning is a paradigm for alleviating the data scarcity problem in end-to-end speech translation (E2E ST). The commonplace "modality gap" between speech and text data often leads to inconsistent inputs between pre-training and fine-tuning. However, we observe that this gap occurs in the early stages of fine-tuning and does not have a major impact on the final performance. On the other hand, we find another gap, which we call the "capacity gap": high-resource tasks (such as ASR and MT) always require a large model to fit, and when that model is reused for a low-resource task (E2E ST), it yields sub-optimal performance due to over-fitting. In a case study, we find that regularization plays a more important role than the well-designed modality adaption method, achieving 29.0 for en-de and 40.3 for en-fr on the MuST-C dataset. Code and models are available at https://github.com/hannlp/TAB. (ACL 2023 Main Conference)
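    To make the "regularization" side of the comparison concrete, the sketch below fine-tunes a stand-in pre-trained encoder with three common regularizers: dropout, label smoothing, and weight decay. The toy model, the random features standing in for speech, and every hyperparameter are illustrative assumptions, not the configuration released in the TAB repository.

    import torch
    import torch.nn as nn

    # "Pre-trained" encoder reused for the low-resource E2E ST task (toy stand-in).
    encoder = nn.Sequential(nn.Linear(80, 256), nn.ReLU(), nn.Dropout(p=0.3))
    head = nn.Linear(256, 1000)                                 # toy target vocabulary

    loss_fn = nn.CrossEntropyLoss(label_smoothing=0.1)          # label smoothing regularizes the small task
    opt = torch.optim.AdamW(list(encoder.parameters()) + list(head.parameters()),
                            lr=1e-4, weight_decay=0.01)         # weight decay as a further regularizer

    for step in range(3):                                       # toy fine-tuning loop
        feats = torch.randn(16, 80)                             # stand-in for speech features
        labels = torch.randint(0, 1000, (16,))                  # stand-in for target tokens
        loss = loss_fn(head(encoder(feats)), labels)
        opt.zero_grad()
        loss.backward()
        opt.step()
    print(f"final toy loss: {loss.item():.3f}")

    The abstract's claim is that such plain regularization mitigates the capacity gap (over-fitting of a large pre-trained model on the small E2E ST data) at least as well as a purpose-built modality adaption method.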

    The globally widespread genus Sulfurimonas: versatile energy metabolisms and adaptations to redox clines

    Sulfurimonas species are commonly isolated from sulfidic habitats, and numerous 16S rRNA sequences related to Sulfurimonas species have been identified in chemically distinct environments, such as hydrothermal deep-sea vents, marine sediments, the ocean's water column, and terrestrial habitats. In some of these habitats, Sulfurimonas have been demonstrated to play an important role in chemoautotrophic processes. Sulfurimonas species can grow with a variety of electron donors and acceptors, which may contribute to their widespread distribution. Multiple copies of one type of enzyme (e.g., sulfide:quinone reductases and hydrogenases) may play a pivotal role in Sulfurimonas' flexibility to colonize disparate environments. Many of these genes appear to have been acquired through horizontal gene transfer, which has promoted adaptation to these distinct habitats. Here we summarize Sulfurimonas' versatile energy metabolisms and link their physiological properties to their global distribution.