
    Sign language recognition with transformer networks

    Sign languages are complex languages. Research into them is ongoing, supported by large video corpora of which only small parts are annotated. Sign language recognition can be used to speed up the annotation process of these corpora, in order to aid research into sign languages and sign language recognition. Previous research has approached sign language recognition in various ways, using feature extraction techniques or end-to-end deep learning. In this work, we apply a combination of feature extraction using OpenPose for human keypoint estimation and end-to-end feature learning with Convolutional Neural Networks. The proven multi-head attention mechanism used in transformers is applied to recognize isolated signs in the Flemish Sign Language corpus. Our proposed method significantly outperforms the previous state of the art in sign language recognition on the Flemish Sign Language corpus: we obtain an accuracy of 74.7% on a vocabulary of 100 classes. Our results will be implemented as a suggestion system for sign language corpus annotation.
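    The abstract does not include code; as a rough illustration of the multi-head attention it mentions, the following minimal NumPy sketch runs scaled dot-product attention with several heads over a sequence of per-frame pose features. The shapes, head count, and random weights are all illustrative stand-ins, not the paper's actual model.

    ```python
    import numpy as np

    def multi_head_attention(x, num_heads, rng):
        # x: (seq_len, d_model) sequence of per-frame keypoint features
        seq_len, d_model = x.shape
        d_head = d_model // num_heads
        outputs = []
        for _ in range(num_heads):
            # random projections stand in for learned Q/K/V parameters
            wq = rng.standard_normal((d_model, d_head)) / np.sqrt(d_model)
            wk = rng.standard_normal((d_model, d_head)) / np.sqrt(d_model)
            wv = rng.standard_normal((d_model, d_head)) / np.sqrt(d_model)
            q, k, v = x @ wq, x @ wk, x @ wv
            scores = q @ k.T / np.sqrt(d_head)          # (seq_len, seq_len)
            weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
            weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
            outputs.append(weights @ v)
        # concatenating the heads restores the model dimension
        return np.concatenate(outputs, axis=-1)         # (seq_len, d_model)

    rng = np.random.default_rng(0)
    frames = rng.standard_normal((16, 64))  # e.g. 16 frames, 64-dim features
    out = multi_head_attention(frames, num_heads=4, rng=rng)
    print(out.shape)  # (16, 64)
    ```

    A real recognizer would pool the attended sequence and feed it to a classifier over the sign vocabulary; this sketch only shows the attention step itself.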

    Adjustment of model parameters to estimate distribution transformers' remaining lifespan

    Currently, the electrical system in Argentina is working at its maximum capacity: the margin between installed power and demanded consumption has shrunk, drastically reducing the service life of transformer substations due to overload (since the margin for summer peaks is small). The advent of Smart Grids allows electricity distribution companies to apply data analysis techniques to manage resources more efficiently at different levels (avoiding damage, better contingency management, maintenance planning, etc.). Smart Grid adoption in Argentina progresses slowly due to the high costs involved. In this context, estimating the lifespan reduction of distribution transformers is a key tool for efficiently managing human and material resources, maximizing the lifetime of this equipment. Despite the current state of the smart grids, electricity distribution companies can implement such estimation using the data already available. Thermal models provide guidelines for lifespan estimation, but adjusting them to particular conditions, brands, or material quality is done by tuning parameters. In this work we propose a method to adjust the parameters of a thermal model using Genetic Algorithms, comparing the estimated top-oil temperature with measurements from 315 kVA distribution transformers located in the province of Tucumán, Argentina. The results show that, despite limited data availability, the adjusted model is suitable for implementing a transformer monitoring system.
    Fil: Jimenez, Victor Adrian. Universidad Tecnológica Nacional. Facultad Regional Tucumán. Centro de Investigación en Tecnologías Avanzadas de Tucumán; Argentina
    Fil: Will, Adrian L. E.. Universidad Tecnológica Nacional. Facultad Regional Tucumán. Centro de Investigación en Tecnologías Avanzadas de Tucumán; Argentina
    Fil: Gotay Sardiñas, Jorge. Universidad Tecnológica Nacional. Facultad Regional Tucumán. Centro de Investigación en Tecnologías Avanzadas de Tucumán; Argentina
    Fil: Rodriguez, Sebastian Alberto. Universidad Tecnológica Nacional. Facultad Regional Tucumán. Centro de Investigación en Tecnologías Avanzadas de Tucumán; Argentina. Consejo Nacional de Investigaciones Científicas y Técnicas. Centro Científico Tecnológico Conicet - Tucumán; Argentina
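    The abstract gives no details of the thermal model or the Genetic Algorithm; the sketch below fits the parameters of a deliberately simplified top-oil temperature model to synthetic "measurements" with a basic GA (truncation selection, blend crossover, Gaussian mutation). The model form, parameter ranges, and data are all invented for illustration and are not the authors' method.

    ```python
    import random

    def top_oil_model(load, theta_amb, a, b):
        # simplified steady-state model: oil temperature rise grows as a
        # power of the load factor (placeholder for a real thermal model)
        return theta_amb + a * load ** b

    def fitness(params, data):
        # mean squared error between model and measured top-oil temperature
        a, b = params
        err = [(top_oil_model(k, amb, a, b) - t) ** 2 for k, amb, t in data]
        return sum(err) / len(err)

    def genetic_fit(data, pop_size=40, generations=60, seed=1):
        rng = random.Random(seed)
        pop = [(rng.uniform(10, 80), rng.uniform(0.5, 3.0))
               for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=lambda p: fitness(p, data))
            survivors = pop[: pop_size // 2]     # truncation selection
            children = []
            while len(survivors) + len(children) < pop_size:
                pa, pb = rng.sample(survivors, 2)
                w = rng.random()                  # blend crossover
                child = (w * pa[0] + (1 - w) * pb[0],
                         w * pa[1] + (1 - w) * pb[1])
                # Gaussian mutation keeps the search exploring
                child = (child[0] + rng.gauss(0, 1.0),
                         child[1] + rng.gauss(0, 0.05))
                children.append(child)
            pop = survivors + children
        return min(pop, key=lambda p: fitness(p, data))

    # synthetic "measurements" generated with a=50, b=2 plus noise,
    # standing in for field data from the transformers
    noise = random.Random(0)
    data = [(k, 25.0, 25.0 + 50.0 * k ** 2 + noise.gauss(0, 0.5))
            for k in [0.2 * i for i in range(1, 8)]]
    a, b = genetic_fit(data)
    print(round(a, 1), round(b, 2))
    ```

    The recovered `(a, b)` should land close to the generating values, with the residual error dominated by the injected measurement noise.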

    The role of intelligent systems in delivering the smart grid

    The development of "smart" or "intelligent" energy networks has been proposed by both EPRI's IntelliGrid initiative and the European SmartGrids Technology Platform as a key step in meeting our future energy needs. A central challenge in delivering the energy networks of the future is the judicious selection and development of an appropriate set of technologies and techniques which will form "a toolbox of proven technical solutions". This paper considers the functionality required to deliver key parts of the Smart Grid vision of future energy networks, and discusses the role of intelligent systems in providing these networks with the requisite decision-making functionality. In addition, the paper considers the role of intelligent systems, in particular multi-agent systems, in providing flexible and extensible architectures for deploying intelligence within the Smart Grid. Beyond exploiting intelligent systems as architectural elements of the Smart Grid to meet a set of engineering requirements, their role as a tool for understanding what those requirements are in the first instance is also briefly discussed.

    Language Modeling with Deep Transformers

    We explore deep autoregressive Transformer models in language modeling for speech recognition. We focus on two aspects. First, we revisit Transformer model configurations specifically for language modeling. We show that well-configured Transformer models outperform our baseline models based on a shallow stack of LSTM recurrent neural network layers. We carry out experiments on the open-source LibriSpeech 960hr task, for both 200K-vocabulary word-level and 10K byte-pair encoding subword-level language modeling. We apply our word-level models to conventional hybrid speech recognition by lattice rescoring, and the subword-level models to attention-based encoder-decoder models by shallow fusion. Second, we show that deep Transformer language models do not require positional encoding. The positional encoding is an essential augmentation for the self-attention mechanism, which is invariant to sequence ordering. However, in an autoregressive setup, as is the case for language modeling, the amount of information increases along the position dimension, which is a positional signal in its own right. The analysis of attention weights shows that deep autoregressive self-attention models can automatically make use of such positional information. We find that removing the positional encoding even slightly improves the performance of these models.
    Comment: To appear in the proceedings of INTERSPEECH 201
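    To make the second point concrete, the following NumPy sketch (not from the paper) runs single-head causal self-attention with no positional encoding. Because of the causal mask, position i can only attend to the i+1 earliest frames, so the attention pattern itself differs by position, which is the kind of implicit positional signal the abstract describes.

    ```python
    import numpy as np

    def causal_self_attention(x):
        # single-head self-attention with a causal (lower-triangular) mask,
        # and deliberately NO positional encoding added to x
        seq_len, d = x.shape
        scores = x @ x.T / np.sqrt(d)   # queries = keys = values = x here
        mask = np.triu(np.ones((seq_len, seq_len), dtype=bool), k=1)
        scores = np.where(mask, -np.inf, scores)  # block attention to the future
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
        return weights @ x, weights

    rng = np.random.default_rng(0)
    x = rng.standard_normal((5, 8))
    out, w = causal_self_attention(x)
    # each row attends only to itself and earlier positions, so the number
    # of nonzero attention weights grows with position: a positional signal
    print([int((row > 0).sum()) for row in w])  # [1, 2, 3, 4, 5]
    ```

    In a bidirectional (non-causal) setup every position would see the same full set of tokens, which is why positional encoding remains essential there.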