Hierarchical human activity recognition with fusion of audio and multiple inertial sensor modalities

Abstract

In everyday life, individuals engage in a multitude of activities, and recent technological advances have enabled Artificial Intelligence (AI) systems that analyse these activities from data collected in various contexts. Human activity recognition, an essential AI application in healthcare, detects deviations from normal activity, such as falls, that may indicate health issues. The widespread adoption of mobile and wearable technology enables personalised activity recognition solutions. Given the critical role of these AI applications in healthcare, high recognition accuracy is imperative. Moreover, such systems must be lightweight enough to run seamlessly on end-user devices like smartphones without compromising their primary functions. Our research introduces novel input representations and advanced methods in data fusion and multimodal learning that surpass previous approaches in recognition accuracy while reducing computational and memory demands. We propose a streamlined neural network model that integrates inertial and audio sensor data into colour-coded image representations. The resulting AI system was evaluated on a complex, publicly available, hierarchically organised activity recognition dataset. Compared to earlier studies, our solution delivers significant performance gains, achieving 91% balanced accuracy in activity recognition alongside substantial improvements in Central Processing Unit (CPU) and memory efficiency.
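The abstract describes fusing inertial and audio sensor data into colour-coded image representations for a neural network. A minimal sketch of this general idea (not the authors' exact pipeline) might map each modality to one image channel, assuming fixed-length sensor windows and simple min-max normalisation; the window sizes, sampling rates, and channel assignments below are illustrative assumptions.

```python
import numpy as np

def to_channel(signal, size):
    """Resample a 1-D signal to `size` points and scale it to 0..255."""
    x = np.interp(np.linspace(0, len(signal) - 1, size),
                  np.arange(len(signal)), signal)
    lo, hi = x.min(), x.max()
    x = (x - lo) / (hi - lo) if hi > lo else np.zeros_like(x)
    return (x * 255).astype(np.uint8)

def fuse_to_image(accel_xyz, audio, side=32):
    """Fuse one sensor window into a side x side RGB image.

    Illustrative channel assignment (an assumption, not the paper's scheme):
    R/G encode two inertial axes, B encodes the audio signal.
    """
    n = side * side
    r = to_channel(accel_xyz[:, 0], n)   # accelerometer x-axis
    g = to_channel(accel_xyz[:, 1], n)   # accelerometer y-axis
    b = to_channel(audio, n)             # audio waveform
    return np.stack([r, g, b], axis=-1).reshape(side, side, 3)

# Example: synthetic 1-second windows (100 Hz inertial, 8 kHz audio)
accel = np.random.randn(100, 3)
audio = np.random.randn(8000)
img = fuse_to_image(accel, audio)
print(img.shape)  # (32, 32, 3)
```

Such an image can then be fed to a compact convolutional classifier, which is one common way lightweight models exploit image-encoded sensor data on resource-constrained devices.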


This paper was published in Queen's University Belfast Research Portal.


Licence: CC BY-NC-ND 4.0 (http://creativecommons.org/licenses/by-nc-nd/4.0/)