
    Few-shot classification in Named Entity Recognition Task

    For many natural language processing (NLP) tasks the amount of annotated data is limited, which motivates semi-supervised learning techniques such as transfer learning and meta-learning. In this work we tackle the Named Entity Recognition (NER) task using a Prototypical Network, a metric-learning technique. It learns intermediate representations of words that cluster well into named-entity classes. This property of the model allows it to classify words from an extremely limited number of training examples, and it can potentially be used as a zero-shot learning method. By coupling this technique with transfer learning we achieve well-performing classifiers trained on only 20 instances of a target class.
    Comment: In proceedings of the 34th ACM/SIGAPP Symposium on Applied Computing
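    The core of the prototypical approach described above can be sketched in a few lines: each class prototype is the mean of the support-set embeddings for that class, and a query word is assigned to the nearest prototype. This is a minimal illustrative sketch, not the authors' implementation; the function names and the Euclidean distance choice are assumptions.

```python
import numpy as np

def prototypes(embeddings, labels):
    """Mean embedding per class: the 'prototype' of each named-entity class."""
    classes = sorted(set(labels))
    protos = np.stack([embeddings[np.array(labels) == c].mean(axis=0)
                       for c in classes])
    return classes, protos

def classify(query, protos, classes):
    """Assign each query embedding to the class of its nearest prototype."""
    # Pairwise Euclidean distances, shape (n_queries, n_classes).
    d = np.linalg.norm(query[:, None, :] - protos[None, :, :], axis=-1)
    return [classes[i] for i in d.argmin(axis=1)]
```

With only a handful of support examples per class (20 in the paper), the same two functions apply unchanged, which is what makes the method attractive in the few-shot regime.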

    Multi-task Learning of Pairwise Sequence Classification Tasks Over Disparate Label Spaces

    We combine multi-task learning and semi-supervised learning by inducing a joint embedding space between disparate label spaces and learning transfer functions between label embeddings, enabling us to jointly leverage unlabelled data and auxiliary, annotated datasets. We evaluate our approach on a variety of sequence classification tasks with disparate label spaces. We outperform strong single-task and multi-task baselines and achieve a new state of the art for topic-based sentiment analysis.
    Comment: To appear at NAACL 2018 (long paper)
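    One way to picture the idea of transferring across disparate label spaces: once labels from both tasks live in a shared embedding space, predictions over an auxiliary task's labels can be projected onto the target task's labels via label-embedding similarity. The sketch below is an assumption-laden simplification (a fixed cosine-softmax transfer, not the learned transfer functions of the paper).

```python
import numpy as np

def transfer_predictions(aux_probs, aux_label_emb, tgt_label_emb):
    """Project a distribution over auxiliary labels onto target labels,
    weighting by cosine similarity of label embeddings in a shared space."""
    a = aux_label_emb / np.linalg.norm(aux_label_emb, axis=1, keepdims=True)
    t = tgt_label_emb / np.linalg.norm(tgt_label_emb, axis=1, keepdims=True)
    sim = a @ t.T                                        # (n_aux, n_tgt)
    w = np.exp(sim) / np.exp(sim).sum(axis=1, keepdims=True)  # row-stochastic
    return aux_probs @ w                                 # (n_examples, n_tgt)
```

Because the transfer matrix is row-stochastic, each projected row remains a valid probability distribution over the target labels.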

    Automatic Segmentation of Land Cover in Satellite Images

    Semantic segmentation problems such as land-cover segmentation rely on large amounts of annotated images to excel. Without such data for target regions, transfer learning methods are widely used to incorporate knowledge from other areas and domains to improve performance. In this study, we analyze the performance of land-cover segmentation models trained on low-resolution images with insufficient data for the targeted region or zoom level. To boost performance on target data, we experiment with models trained with unsupervised, semi-supervised, and supervised transfer learning approaches, including satellite images from public datasets and other unlabeled sources. According to experimental results, transfer learning improves segmentation performance by 3.4% MIoU (mean intersection over union) in rural regions and 12.9% MIoU in urban regions. We observed that transfer learning is more effective when the two datasets share a comparable zoom level and are labeled with identical rules; otherwise, semi-supervised learning using unlabeled data is more effective. A pseudo-labeling-based unsupervised domain adaptation method improved building-detection performance in urban cities. In addition, experiments showed that HRNet outperformed other building segmentation approaches in multi-class segmentation.
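    For readers unfamiliar with the MIoU metric quoted above, a minimal sketch of how mean intersection over union is computed per class (averaging only over classes present in either prediction or ground truth; exact conventions vary by benchmark):

```python
import numpy as np

def mean_iou(pred, target, num_classes):
    """Mean intersection-over-union over classes, skipping absent classes."""
    ious = []
    for c in range(num_classes):
        inter = np.logical_and(pred == c, target == c).sum()
        union = np.logical_or(pred == c, target == c).sum()
        if union:  # skip classes absent from both masks
            ious.append(inter / union)
    return float(np.mean(ious))
```

A reported gain of "3.4% MIoU" therefore means the class-averaged overlap between predicted and reference masks rose by 3.4 percentage points.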

    On Causal and Anticausal Learning

    We consider the problem of function estimation in the case where an underlying causal model can be inferred. This has implications for popular scenarios such as covariate shift, concept drift, transfer learning, and semi-supervised learning. We argue that causal knowledge may facilitate some approaches to a given problem and rule out others. In particular, we formulate a hypothesis for when semi-supervised learning can help, and corroborate it with empirical results.
    Comment: Appears in Proceedings of the 29th International Conference on Machine Learning (ICML 2012). arXiv admin note: substantial text overlap with arXiv:1112.273

    Transfer Learning with Semi-Supervised Dataset Annotation for Birdcall Classification

    We present working notes on transfer learning with semi-supervised dataset annotation for the BirdCLEF 2023 competition, focused on identifying African bird species in recorded soundscapes. Our approach utilizes existing off-the-shelf models, BirdNET and MixIT, to address representation and labeling challenges in the competition. We explore the embedding space learned by BirdNET and propose a process to derive an annotated dataset for supervised learning. Our experiments involve various models and feature engineering approaches to maximize performance on the competition leaderboard. The results demonstrate the effectiveness of our approach in classifying bird species and highlight the potential of transfer learning and semi-supervised dataset annotation in similar tasks.
    Comment: BirdCLEF working note submission to Multimedia Retrieval in Nature (LifeCLEF) for CLEF 2023
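    The general recipe of deriving labels from a pretrained embedding space can be sketched as follows: embed unlabeled clips, then assign a species pseudo-label only when the nearest class centroid is clearly closer than the runner-up. This is a hypothetical illustration of confidence-gated pseudo-labeling, not the authors' pipeline; the `margin` threshold and centroid representation are assumptions.

```python
import numpy as np

def pseudo_label(unlabeled_emb, centroids, names, margin=0.1):
    """Pseudo-label each embedding by its nearest class centroid,
    returning None when the top two centroids are too close to call."""
    # Distances from every embedding to every centroid: (n, n_classes).
    d = np.linalg.norm(unlabeled_emb[:, None] - centroids[None], axis=-1)
    order = np.argsort(d, axis=1)
    best, second = order[:, 0], order[:, 1]
    rows = np.arange(len(d))
    confident = (d[rows, second] - d[rows, best]) > margin
    return [names[b] if ok else None for b, ok in zip(best, confident)]
```

The confidently labeled subset can then be used as an ordinary supervised training set, which is the sense in which semi-supervised annotation feeds transfer learning here.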