
    Transfer learning of language-independent end-to-end ASR with language model fusion

    This work explores better adaptation methods for low-resource languages using an external language model (LM) under the framework of transfer learning. We first build a language-independent ASR system in a unified sequence-to-sequence (S2S) architecture with a shared vocabulary among all languages. During adaptation, we perform LM fusion transfer, where an external LM is integrated into the decoder network of the attention-based S2S model throughout the adaptation stage, to effectively incorporate linguistic context of the target language. We also investigate various seed models for transfer learning. Experimental evaluations using the IARPA BABEL data set show that LM fusion transfer improves performance on all five target languages compared with simple transfer learning when external text data is available. Our final system drastically reduces the performance gap from the hybrid systems. Comment: Accepted at ICASSP201
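    For intuition, the sketch below shows shallow fusion, one common way of combining an external LM with a S2S decoder at decoding time by interpolating log-probabilities. This is a minimal sketch of the general idea, not the adaptation-stage decoder integration described in the abstract; the function name, tensor shapes, and the lm_weight value are assumptions.

```python
import torch
import torch.nn.functional as F

def shallow_fusion_step(asr_logits, lm_logits, lm_weight=0.3):
    """Combine ASR decoder and external LM scores for one decoding step.

    asr_logits: (batch, vocab) scores from the S2S decoder
    lm_logits:  (batch, vocab) scores from the external LM
    lm_weight:  interpolation weight on the LM log-probabilities (assumed value)
    """
    asr_logp = F.log_softmax(asr_logits, dim=-1)
    lm_logp = F.log_softmax(lm_logits, dim=-1)
    return asr_logp + lm_weight * lm_logp

# Usage: pick the next token greedily from the fused scores.
asr_logits = torch.randn(1, 500)   # dummy decoder output over a 500-token shared vocabulary
lm_logits = torch.randn(1, 500)    # dummy external LM output
next_token = shallow_fusion_step(asr_logits, lm_logits).argmax(dim=-1)
```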

    Unifying Multiple Knowledge Domains Using the ARTMAP Information Fusion System

    Sensors working at different times, locations, and scales, and experts with different goals, languages, and situations, may produce apparently inconsistent image labels that are reconciled by their implicit underlying relationships. Even when such relationships are unknown to the user, an ARTMAP information fusion system discovers a hierarchical knowledge structure for a labeled dataset. The present paper addresses the problem of integrating two or more independent knowledge hierarchies based on the same low-level classes. The new system fuses independent domains into a unified knowledge structure, discovering cross-domain rules in this process. The system infers multi-level relationships among groups of output classes, without any supervised labeling of these relationships. To self-organize its expert system, the ARTMAP information fusion system features distributed code representations that exploit the neural network’s capacity for one-to-many learning. The fusion system software and testbed datasets are available from http://cns.bu.edu/techlab. National Science Foundation (SBE-0354378); National Geospatial-Intelligence Agency (NMA 201-01-1-2016)
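    For intuition only, the toy sketch below shows one way cross-domain rules between two label hierarchies defined over the same low-level classes could be read off as subset relations. The domains, labels, and subset test are invented for illustration and do not implement the ARTMAP learning mechanism itself.

```python
# Each hierarchy maps a high-level label to the set of low-level classes it
# covers; a rule "A implies B" is inferred when A's class set is contained
# in B's. Hypothetical image-labeling domains:
terrain = {
    "vegetation": {"grass", "trees"},
    "man-made":   {"road", "building"},
}
land_use = {
    "park":  {"grass", "trees"},
    "urban": {"road", "building", "grass"},
}

def cross_domain_rules(dom_a, dom_b):
    """Yield (label_a, label_b) pairs where label_a implies label_b."""
    for a, classes_a in dom_a.items():
        for b, classes_b in dom_b.items():
            if classes_a <= classes_b:   # subset => implication
                yield a, b

for a, b in cross_domain_rules(terrain, land_use):
    print(f"every '{a}' pixel is also '{b}'")
# -> every 'vegetation' pixel is also 'park'
#    every 'man-made' pixel is also 'urban'
```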

    Native Language Identification on Text and Speech

    This paper presents an ensemble system combining the outputs of multiple SVM classifiers for native language identification (NLI). The system was submitted to the NLI Shared Task 2017 fusion track, which featured student essays and spoken responses, in the form of audio transcriptions and iVectors, from non-native English speakers of eleven native languages. Our system competed in the challenge under the team name ZCD and was based on an ensemble of SVM classifiers trained on character n-grams, achieving 83.58% accuracy and ranking 3rd in the shared task. Comment: Proceedings of the Workshop on Innovative Use of NLP for Building Educational Applications (BEA
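    A minimal scikit-learn sketch of an ensemble of SVM classifiers over character n-grams is given below; the n-gram ranges, the hard-voting scheme, and the toy data are assumptions rather than the ZCD team's actual configuration.

```python
from sklearn.pipeline import make_pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC
from sklearn.ensemble import VotingClassifier

def char_ngram_svm(lo, hi):
    """One SVM classifier over character n-grams of lengths lo..hi."""
    return make_pipeline(
        TfidfVectorizer(analyzer="char_wb", ngram_range=(lo, hi)),
        LinearSVC(),
    )

# Ensemble of two members with different (assumed) n-gram ranges,
# combined by majority vote over their predictions.
ensemble = VotingClassifier(
    estimators=[("c2_3", char_ngram_svm(2, 3)), ("c4_5", char_ngram_svm(4, 5))],
    voting="hard",
)

# Toy data: essays (X) and native-language labels (y), invented for illustration.
X = ["this are a essay writed by a learner",
     "here is another text sample",
     "yet one more short essay"]
y = ["ITA", "DEU", "ITA"]
ensemble.fit(X, y)
print(ensemble.predict(["a new essay to classify"]))
```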

    The origin of the reflexive prefix in Rgyalrong languages

    In the Sino-Tibetan family, reflexivity is either not expressed on the verb, as in Chinese or Tibetan, or expressed by means of a ‘middle’ marker, as in Dulong or Kiranti languages. Among the morphologically rich languages of this family, only Rgyalrong languages have distinct and unambiguous reflexive and reciprocal markers on the verb. This paper shows that the reflexive prefix in Rgyalrong languages has two possible origins. It could come from a fusion of the third person singular marker and the root meaning ‘self’, or alternatively from the free third person pronoun. Both hypotheses are compatible with our understanding of Rgyalrong historical phonology.

    Large-Scale Neural Systems for Vision and Cognition

    Consideration of how people respond to the question What is this? has suggested new problem frontiers for pattern recognition and information fusion, as well as neural systems that embody the cognitive transformation of declarative information into relational knowledge. In contrast to traditional classification methods, which aim to find the single correct label for each exemplar (This is a car), the new approach discovers rules that embody coherent relationships among labels which would otherwise appear contradictory to a learning system (This is a car, that is a vehicle, over there is a sedan). This talk will describe how an individual who experiences exemplars in real time, with each exemplar trained on at most one category label, can autonomously discover a hierarchy of cognitive rules, thereby converting local information into global knowledge. Computational examples are based on the observation that sensors working at different times, locations, and spatial scales, and experts with different goals, languages, and situations, may produce apparently inconsistent image labels, which are reconciled by implicit underlying relationships that the network’s learning process discovers. The ARTMAP information fusion system can, moreover, integrate multiple separate knowledge hierarchies by fusing independent domains into a unified structure. In the process, the system discovers cross-domain rules, inferring multilevel relationships among groups of output classes, without any supervised labeling of these relationships. In order to self-organize its expert system, the ARTMAP information fusion network features distributed code representations which exploit the model’s intrinsic capacity for one-to-many learning (This is a car and a vehicle and a sedan) as well as many-to-one learning (Each of those vehicles is a car). Fusion system software, testbed datasets, and articles are available from http://cns.bu.edu/techlab. Defense Advanced Research Projects Agency (Hewlett-Packard Company, DARPA HR0011-09-3-0001; HRL Laboratories LLC subcontract 801881-BS under prime contract HR0011-09-C-0011); Science of Learning Centers program of the National Science Foundation (SBE-0354378)
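    To make the one-to-many rule-discovery idea concrete, the toy sketch below reads a label hierarchy off exemplars that carry several consistent labels at once; the exemplar data and the implication test are invented and stand in for, rather than implement, the ARTMAP network.

```python
from itertools import permutations

# Invented exemplars, each carrying a set of consistent labels
# (This is a car and a vehicle and a sedan).
exemplars = [
    {"sedan", "car", "vehicle"},
    {"car", "vehicle"},
    {"truck", "vehicle"},
    {"sedan", "car", "vehicle"},
]

def implied_rules(exemplar_labels):
    """Return rules a -> b that hold for every exemplar mentioning a."""
    labels = set().union(*exemplar_labels)
    rules = []
    for a, b in permutations(labels, 2):
        if all(b in ex for ex in exemplar_labels if a in ex):
            rules.append((a, b))
    return rules

for a, b in implied_rules(exemplars):
    print(f"every {a} is a {b}")
# Prints (in some order): every sedan is a car, every sedan is a vehicle,
# every car is a vehicle, every truck is a vehicle.
```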

    Sundanese Nasal Substitution: An Optimality Theoretic Analysis
