    Optimizing Artificial Neural Networks using Cat Swarm Optimization Algorithm

    RASR2: The RWTH ASR Toolkit for Generic Sequence-to-sequence Speech Recognition

    Modern public ASR toolkits usually provide rich support for training various sequence-to-sequence (S2S) models, but only rudimentary decoding support, and often for open-vocabulary scenarios only. For closed-vocabulary scenarios, public tools supporting lexically-constrained decoding usually cover only classical ASR, or do not support all S2S models. To remove this restriction on research possibilities such as the choice of modeling unit, we present RASR2 in this work, a research-oriented generic S2S decoder implemented in C++. It offers strong flexibility and compatibility across S2S models, language models, label units/topologies and neural network architectures. It provides efficient decoding for both open- and closed-vocabulary scenarios based on a generalized search framework with rich support for different search modes and settings. We evaluate RASR2 with a wide range of experiments on both the Switchboard and LibriSpeech corpora. Our source code is publicly available online. Comment: accepted at Interspeech 202
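The closed-vocabulary case the abstract contrasts with open-vocabulary decoding can be illustrated with a minimal sketch: hypotheses are only extended with labels that keep the current prefix inside a lexicon prefix tree. The function names and toy lexicon below are illustrative assumptions, not RASR2's actual API.

```python
def build_trie(lexicon):
    """Build a nested-dict prefix tree from a list of lexicon words."""
    root = {}
    for word in lexicon:
        node = root
        for ch in word:
            node = node.setdefault(ch, {})
        node["<eow>"] = {}  # mark a valid end of word
    return root

def allowed_next(trie, prefix):
    """Return the set of labels that may follow `prefix` in the lexicon."""
    node = trie
    for ch in prefix:
        if ch not in node:
            return set()  # prefix left the lexicon: prune this hypothesis
        node = node[ch]
    return set(node)

trie = build_trie(["cat", "car", "dog"])
print(sorted(allowed_next(trie, "ca")))  # labels that keep "ca" in-vocabulary
print(allowed_next(trie, "x"))           # empty set: "x" starts no lexicon word
```

In an actual decoder, a beam-search step would intersect the model's label scores with `allowed_next` before expanding hypotheses, which is what restricts decoding to the closed vocabulary.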

    Extreme Data Mining: Inference from Small Datasets

    Neural networks have been applied successfully in many fields. However, satisfactory results are typically obtained only with large samples; with small training sets, performance may degrade, or the learning task may not be accomplished at all. This deficiency severely limits the applications of neural networks. The main reason small datasets cannot provide enough information is that gaps exist between samples, and even the domain of the samples cannot be ensured. Several computational intelligence techniques have been proposed to overcome the limits of learning from small datasets. We have the following goals: i. to discuss the meaning of "small" in the context of inferring from small datasets; ii. to give an overview of computational intelligence solutions to this problem; iii. to illustrate the introduced concepts with a real-life application.
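One family of computational-intelligence remedies of the kind this survey covers is virtual sample generation: filling the gaps between the few observed samples with synthetic points. A minimal sketch, assuming simple linear interpolation between random sample pairs (the function name and toy data are illustrative, not from the paper):

```python
import random

def virtual_samples(data, n_new, seed=0):
    """Create n_new synthetic points by interpolating random sample pairs."""
    rng = random.Random(seed)
    synthetic = []
    for _ in range(n_new):
        a, b = rng.sample(data, 2)  # pick two distinct observed samples
        t = rng.random()            # interpolation weight in [0, 1)
        synthetic.append(tuple(ai + t * (bi - ai) for ai, bi in zip(a, b)))
    return synthetic

# Three observed (x, y) samples lying on the line y = 2x + 1
small_set = [(0.0, 1.0), (1.0, 3.0), (2.0, 5.0)]
augmented = small_set + virtual_samples(small_set, 10)
print(len(augmented))  # original 3 samples plus 10 virtual ones
```

Interpolated points stay inside the convex hull of the observed data, so this only narrows the gaps between samples; it cannot extend the domain beyond what was observed, which is why the abstract also flags the domain of the samples as a separate problem.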

    MINES: Mutual Information Neuro-Evolutionary System

    The mutual information neuro-evolutionary system (MINES) presents a novel self-governing approach to determining the optimal size and connectivity of the hidden layer of a three-layer feed-forward neural network on both theoretical and practical grounds. The system combines a feed-forward neural network, the back-propagation algorithm, a genetic algorithm, mutual information and clustering. Back-propagation is used for parameter learning to reduce the system's error, while mutual information aids back-propagation in following an effective path through the weight space. A genetic algorithm changes the incoming synaptic connections of the hidden nodes, based on the fitness provided by the mutual information from the error space to the hidden layer, to perform structural learning. Mutual information determines the appropriate synapses connecting the hidden nodes to the input layer; in effect, it also links back-propagation to the genetic algorithm. Weight clustering is applied to merge hidden nodes with similar functionality, i.e. those possessing the same connectivity patterns and a close Euclidean angle in the weight space. Finally, the performance of the system is assessed on two theoretical problems and one empirical problem. A nonlinear polynomial regression problem and the well-known two-spiral classification task are used to evaluate the theoretical performance of the system, and forecasting daily crude oil prices is used to observe the performance of MINES in a real-world application.
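The weight-clustering criterion described above (same connectivity pattern, close Euclidean angle between incoming weight vectors) can be sketched as follows. The helper names `angle` and `redundant` and the tolerance value are illustrative assumptions, not the authors' code:

```python
import math

def angle(u, v):
    """Euclidean angle (radians) between two weight vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    # Clamp to [-1, 1] to guard against floating-point drift before acos.
    return math.acos(max(-1.0, min(1.0, dot / (nu * nv))))

def redundant(u, v, tol=0.1):
    """Two hidden nodes are candidates for merging when their incoming
    weights share the same zero/non-zero connectivity pattern and are
    nearly parallel (angle below tol radians)."""
    same_pattern = all((a == 0) == (b == 0) for a, b in zip(u, v))
    return same_pattern and angle(u, v) < tol

w1 = (0.5, 0.0, 1.0)
w2 = (0.55, 0.0, 1.1)  # a scaled copy of w1: same pattern, angle ~0
w3 = (0.5, 1.0, 0.0)   # different connectivity pattern
print(redundant(w1, w2), redundant(w1, w3))
```

Merging such near-parallel nodes removes hidden units that compute essentially the same feature, which is the pruning role clustering plays in MINES.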