    On Multilingual Training of Neural Dependency Parsers

    We show that a recently proposed neural dependency parser can be improved by joint training on multiple languages from the same family. The parser is implemented as a deep neural network whose only input is orthographic representations of words. In order to parse successfully, the network has to discover how linguistically relevant concepts can be inferred from word spellings. We analyze the representations of characters and words that are learned by the network to establish which properties of the languages were accounted for. In particular, we show that the parser has approximately learned to associate Latin characters with their Cyrillic counterparts, and that it can group Polish and Russian words that have a similar grammatical function. Finally, we evaluate the parser on selected languages from the Universal Dependencies dataset and show that it is competitive with other recently proposed state-of-the-art methods, while having a simple structure.
    Comment: preprint accepted into the TSD201
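    The abstract's key architectural point is that the parser sees only character-level (orthographic) input, so any cross-lingual regularities, such as Latin/Cyrillic correspondences, must be learned from spellings alone. Below is a minimal, hypothetical sketch of such an orthography-only word encoder in PyTorch; the class name, layer sizes, and the char-BiLSTM choice are illustrative assumptions, not the authors' actual architecture.

```python
import torch
import torch.nn as nn

class CharWordEncoder(nn.Module):
    """Encode a word from its characters alone (no word embeddings).

    Illustrative sketch: a character embedding table feeds a
    bidirectional LSTM; the final hidden states of both directions
    are concatenated into a single word vector that a downstream
    dependency parser could consume.
    """

    def __init__(self, n_chars: int, char_dim: int = 32, word_dim: int = 128):
        super().__init__()
        self.char_emb = nn.Embedding(n_chars, char_dim, padding_idx=0)
        self.rnn = nn.LSTM(char_dim, word_dim // 2,
                           batch_first=True, bidirectional=True)

    def forward(self, char_ids: torch.Tensor) -> torch.Tensor:
        # char_ids: (batch, max_word_len) integer character indices
        emb = self.char_emb(char_ids)               # (batch, len, char_dim)
        _, (h_n, _) = self.rnn(emb)                 # h_n: (2, batch, word_dim // 2)
        return torch.cat([h_n[0], h_n[1]], dim=-1)  # (batch, word_dim)

# Words from every language in the family share one character
# vocabulary, so a single encoder can be trained jointly across them.
encoder = CharWordEncoder(n_chars=200)
word_vecs = encoder(torch.randint(1, 200, (4, 10)))  # 4 words, 10 chars each
print(word_vecs.shape)  # torch.Size([4, 128])
```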

    Memetic cooperative coevolution of Elman recurrent neural networks

    Cooperative coevolution decomposes an optimisation problem into subcomponents and collectively solves them using evolutionary algorithms. Memetic algorithms enhance evolutionary algorithms with local search. Recently, the incorporation of local search into a memetic cooperative coevolution method has been shown to be efficient for training feedforward networks on pattern classification problems. This paper applies the memetic cooperative coevolution method to training recurrent neural networks on grammatical inference problems. The results show that the proposed method achieves better performance in terms of optimisation time and robustness.
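    As a rough illustration of the scheme this abstract describes, the sketch below decomposes a weight vector into subcomponents, evolves each with its own subpopulation, and refines the shared context vector with a greedy local-search (memetic) step. Everything here is a toy assumption: the quadratic loss stands in for an Elman RNN's training error, and the operator choices are not those of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def loss(weights: np.ndarray) -> float:
    # Stand-in objective; in the paper this would be the error of an
    # Elman recurrent network on a grammatical-inference task.
    return float(np.sum((weights - 1.0) ** 2))

def local_search(w, span, steps=10, sigma=0.05):
    """Memetic step: greedy hill climbing on one subcomponent."""
    best, best_f = w.copy(), loss(w)
    for _ in range(steps):
        cand = best.copy()
        cand[span] += rng.normal(0.0, sigma, size=len(span))
        if loss(cand) < best_f:
            best, best_f = cand, loss(cand)
    return best

def cooperative_coevolution(dim=12, n_sub=3, pop=8, gens=50):
    spans = np.array_split(np.arange(dim), n_sub)   # problem decomposition
    context = rng.normal(size=dim)                  # best-so-far full vector
    pops = [rng.normal(size=(pop, len(s))) for s in spans]
    for _ in range(gens):
        for s, P in zip(spans, pops):
            # Evaluate each individual inside the shared context vector.
            fits = [loss(np.where(np.isin(np.arange(dim), s),
                                  np.put_along_axis(context.copy(),
                                                    s, ind, 0) is None or 0,
                                  context)) for ind in P]  # (see note below)
            order = np.argsort(fits)
            # Keep the better half, refill by mutating survivors.
            P[:] = np.concatenate([P[order[:pop // 2]],
                                   P[order[:pop // 2]] +
                                   rng.normal(0.0, 0.1, (pop // 2, len(s)))])
            context[s] = P[0]                       # commit the best individual
        # Memetic refinement of a randomly chosen subcomponent.
        context = local_search(context, spans[rng.integers(n_sub)])
    return context, loss(context)

w, f = cooperative_coevolution()
print(f"final loss: {f:.4f}")
```

    In words: each generation, every subpopulation is evaluated against the current context vector, selection keeps the better half, mutation refills the rest, and local search sharpens the composed solution, which is the defining memetic addition over plain cooperative coevolution.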