
    Human action recognition using deep rule-based classifier

    In recent years, numerous techniques have been proposed for human activity recognition (HAR) from images and videos. These techniques can be divided into two major categories: handcrafted and deep learning. Deep learning-based models have produced remarkable results for HAR. However, these models have several shortcomings, such as the requirement for a massive amount of training data, a lack of transparency, an offline nature, and poor interpretability of their internal parameters. In this paper, a new approach for HAR is proposed, which consists of an interpretable, self-evolving, and self-organizing set of zero-order IF...THEN rules. This approach is entirely data-driven and non-parametric, and prototypes are identified automatically during the training process. To demonstrate the effectiveness of the proposed method, a set of high-level features is obtained using a pre-trained deep convolutional neural network, and a recently introduced deep rule-based (DRB) classifier is applied for classification. Experiments are performed on the challenging UCF50 benchmark dataset; the results confirm that the proposed approach outperforms state-of-the-art methods. In addition, an ablation study is conducted to demonstrate the efficacy of the proposed approach by comparing the performance of the DRB classifier with four state-of-the-art classifiers. This analysis reveals that the DRB classifier can outperform state-of-the-art classifiers even with limited training samples.
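    To make the described pipeline concrete, the sketch below illustrates, in simplified form, the two stages the abstract refers to: high-level features assumed to come from a pre-trained CNN (replaced here by synthetic vectors) and a prototype-based classifier built from zero-order IF (sample is similar to a prototype) THEN (class) rules. The ZeroOrderRuleClassifier class, the cosine-similarity novelty_threshold, and the example action labels are illustrative assumptions, not the authors' actual DRB implementation, which identifies prototypes from data density rather than a fixed similarity cutoff.

import numpy as np

class ZeroOrderRuleClassifier:
    """Simplified prototype-based classifier: each class holds a set of
    zero-order IF (feature vector ~ prototype) THEN (class) rules.
    A new prototype is added online whenever a training sample is not
    well covered by the existing prototypes of its class (a stand-in
    for the density-based prototype identification used by DRB)."""

    def __init__(self, novelty_threshold=0.8):
        self.novelty_threshold = novelty_threshold  # assumed similarity cutoff
        self.prototypes = {}  # class label -> list of prototype vectors

    @staticmethod
    def _cosine(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

    def partial_fit(self, x, label):
        x = np.asarray(x, dtype=float)
        protos = self.prototypes.setdefault(label, [])
        if not protos or max(self._cosine(x, p) for p in protos) < self.novelty_threshold:
            protos.append(x)  # sample becomes a new prototype (new rule)

    def predict(self, x):
        # Winner-takes-all: fire each class's rules and pick the class whose
        # best-matching prototype is most similar to the query sample.
        scores = {
            label: max(self._cosine(x, p) for p in protos)
            for label, protos in self.prototypes.items()
        }
        return max(scores, key=scores.get)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Stand-in for per-frame features from a pre-trained CNN
    # (e.g. fully-connected-layer activations); here: two synthetic clusters.
    walk = rng.normal(size=(20, 64)) + 3.0
    jump = rng.normal(size=(20, 64)) - 3.0
    clf = ZeroOrderRuleClassifier()
    for x in walk:
        clf.partial_fit(x, "walking")
    for x in jump:
        clf.partial_fit(x, "jumping")
    print(clf.predict(rng.normal(size=64) + 3.0))  # expected: "walking"

    The winner-takes-all step above mirrors the transparent decision rule the abstract emphasises: each prediction can be traced back to the single prototype (and hence rule) that fired most strongly.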

    Review of trends in automatic human activity recognition using synthetic audio-visual data

    An in-depth study was conducted of the knowledge and technologies related to the various scientific, technical, and industrial domains required to acquire the skills and capabilities needed to design and develop a multisensory fusion system for vehicle cockpits. After an extensive literature review, it was possible to determine the baselines of the solution to be developed and to obtain a pipeline prototype. This work has been supported by FCT – Fundação para a Ciência e Tecnologia within the R&D Units Project Scope: UIDB/00319/2020. Human and material resources have also been supported by the European Structural and Investment Funds in the FEDER component, through the Operational Competitiveness and Internationalization Programme (COMPETE 2020) [Project number 039334; Funding Reference: POCI-01-0247-FEDER-039334].