    A survey of user-centred approaches for smart home transfer learning and new user home automation adaptation

    Recent smart home applications enhance the quality of people's home experiences by detecting their daily activities and providing them with services that make their daily life more comfortable and safer. Human activity recognition is one of the fundamental tasks that a smart home should accomplish. However, there are still several challenges for such recognition in smart homes, with the target home adaptation process being one of the most critical, since new home environments do not have sufficient data to initiate the necessary activity recognition process. Transfer learning is considered a solution to this challenge, due to its ability to improve the adaptation process. This paper endeavours to provide a concrete review of user-centred smart homes along with recent advancements in transfer learning for activity recognition. Furthermore, the paper proposes an integrated, personalised system able to create a dataset for target homes using both survey and transfer learning approaches, providing a personalised dataset based on user preferences and feedback.
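
    As a rough illustration of the target-home adaptation idea described above (and not the system proposed in the paper), the sketch below adapts an activity classifier to a data-poor target home by instance weighting: abundant source-home data is combined with a small, more heavily weighted set of labelled target-home examples. The function name, weighting scheme and choice of classifier are assumptions made purely for illustration.

```python
# A minimal instance-weighting transfer sketch, assuming feature vectors have
# already been derived from binary sensor events in the source and target homes.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def adapt_to_target_home(X_source, y_source, X_target, y_target, target_weight=5.0):
    """Train one classifier on abundant source-home data plus a small,
    heavily weighted set of labelled target-home examples (hypothetical setup)."""
    X = np.vstack([X_source, X_target])
    y = np.concatenate([y_source, y_target])
    # Up-weight the scarce target-home samples so they dominate the adaptation.
    weights = np.concatenate([np.ones(len(y_source)),
                              np.full(len(y_target), target_weight)])
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X, y, sample_weight=weights)
    return clf
```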

    A hierarchal framework for recognising activities of daily life

    In today's working world, elderly people who are dependent can sometimes be neglected by society. Statistically, after toddlers, the elderly are observed to have the highest accident rates while performing everyday activities. Alzheimer's disease is one of the major impairments that elderly people suffer from, and the forgetfulness it causes can prevent an elderly person from living an independent life. One way to support elderly people who aspire to live an independent life and remain safe in their home is to find out what activities the person is carrying out at a given time and provide appropriate assistance or institute safeguards. The aim of this research is to create improved methods to identify tasks related to activities of daily life, determine a person's current intentions, and so reason about that person's future intentions. A novel hierarchal framework has been developed, which recognises sensor events and maps them to significant activities and intentions. As privacy is a growing concern, the monitoring of an individual's behaviour can be seen as intrusive. Hence, the monitoring is based around simple non-intrusive sensors and tags on everyday objects that are used to perform daily activities around the home. Specifically, no cameras or visual surveillance equipment are used, though the techniques developed are still relevant in such a situation. Models for task recognition and plan recognition have been developed and tested on scenarios where the plans can be interwoven. Potential target users are people in the first stages of Alzheimer's disease, and typical routines used to sustain meaningful activity have been used in structuring the library of kernel plan sequences. Evaluations have been carried out using volunteers conducting activities of daily life in an experimental home environment. The results generated from the sensors have been interpreted and the developed algorithms analysed. The outcomes and findings of these experiments demonstrate that the developed hierarchal framework is capable of carrying out activity recognition as well as intention analysis, e.g. predicting what activity a person is most likely to carry out next.
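
    A toy sketch of the hierarchical idea, assuming tagged everyday objects generate discrete sensor events: events are mapped to tasks, sets of tasks to activities, and the activity history is matched against a small library of kernel plan sequences to predict the next likely activity. The object names, mappings and plans below are invented for illustration and are not the thesis's actual models.

```python
# A minimal hierarchical recognition sketch: sensor events -> tasks -> activities,
# plus next-activity prediction from a small plan library (all values hypothetical).

OBJECT_TO_TASK = {
    "kettle": "boil_water",
    "tea_bag": "prepare_tea_bag",
    "mug": "fetch_mug",
    "toothbrush": "brush_teeth",
}

TASKS_TO_ACTIVITY = {
    frozenset({"boil_water", "prepare_tea_bag", "fetch_mug"}): "make_tea",
    frozenset({"brush_teeth"}): "oral_hygiene",
}

PLAN_LIBRARY = [
    ["make_tea", "have_breakfast", "oral_hygiene"],
    ["oral_hygiene", "make_tea"],
]

def recognise_activity(sensor_events):
    """Map a window of tagged-object events to the best-matching activity."""
    observed_tasks = {OBJECT_TO_TASK[o] for o in sensor_events if o in OBJECT_TO_TASK}
    best, best_overlap = None, 0
    for tasks, activity in TASKS_TO_ACTIVITY.items():
        overlap = len(tasks & observed_tasks)
        if overlap > best_overlap:
            best, best_overlap = activity, overlap
    return best

def predict_next_activity(history):
    """Predict the next activity by matching the history against known plans."""
    for plan in PLAN_LIBRARY:
        n = len(history)
        if plan[:n] == history and n < len(plan):
            return plan[n]
    return None

print(recognise_activity(["kettle", "mug", "tea_bag"]))       # make_tea
print(predict_next_activity(["make_tea", "have_breakfast"]))  # oral_hygiene
```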

    Inferring Complex Activities for Context-aware Systems within Smart Environments

    The rising ageing population worldwide and the prevalence of age-related conditions such as physical fragility, mental impairments and chronic diseases have significantly impacted the quality of life and caused a shortage of health and care services. Over-stretched healthcare providers are leading to a paradigm shift in public healthcare provisioning. Thus, Ambient Assisted Living (AAL) using Smart Home (SH) technologies has been rigorously investigated to help address the aforementioned problems. Human Activity Recognition (HAR) is a critical component of AAL systems, enabling applications such as just-in-time assistance, behaviour analysis, anomaly detection and emergency notifications. This thesis is aimed at investigating the challenges faced in accurately recognising Activities of Daily Living (ADLs) performed by single or multiple inhabitants within smart environments. Specifically, this thesis explores five complementary research challenges in HAR. The first study contributes to knowledge by developing a semantic-enabled data segmentation approach with user preferences. The second study takes the segmented sensor data and investigates recognising human ADLs at multiple action granularities: the coarse- and fine-grained action levels. At the coarse-grained action level, semantic relationships between sensors, objects and ADLs are deduced, whereas at the fine-grained action level, object usage above a satisfactory threshold, with evidence fused from multimodal sensor data, is leveraged to verify the intended actions. Moreover, owing to imprecise or vague interpretations of multimodal sensors and data fusion challenges, fuzzy set theory and the fuzzy web ontology language (fuzzy-OWL) are leveraged. The third study focuses on incorporating uncertainties in HAR caused by factors such as technological failure, object malfunction and human error. Hence, uncertainty theories and approaches from existing studies are analysed and, based on the findings, a probabilistic ontology (PR-OWL) based HAR approach is proposed. The fourth study extends the first three to distinguish activities conducted by more than one inhabitant in a shared smart environment, using discriminative sensor-based techniques and time-series pattern analysis. The final study investigates a suitable system architecture for a real-time smart environment tailored to AAL systems and proposes a microservices architecture with off-the-shelf and bespoke sensor-based sensing methods. The initial semantic-enabled data segmentation study achieved 100% and 97.8% accuracy in segmenting sensor events under single and mixed activity scenarios, respectively. However, the average classification time taken to segment each sensor event was 3971 ms and 62183 ms for the single and mixed activity scenarios, respectively. The second study, detecting fine-grained user actions, was evaluated with 30 and 153 fuzzy rules to detect two fine-grained movements on a dataset pre-collected from the real-time smart environment. The results of the second study indicate good average accuracies of 83.33% and 100%, but with high average durations of 24648 ms and 105318 ms, posing further challenges for the scalability of fusion rule creation. The third study was evaluated by incorporating the PR-OWL ontology with ADL ontologies and the Semantic Sensor Network (SSN) ontology to define four types of uncertainty present in a kitchen-based activity. The fourth study illustrated a case study extending single-user AR to multi-user AR by combining discriminative sensors (RFID tags and fingerprint sensors) to identify users and associate user actions with the aid of time-series analysis. The last study responds to the computational and performance requirements of the four studies by analysing and proposing a microservices-based system architecture for the AAL system. Future research towards adopting fog/edge computing paradigms from cloud computing is discussed, aiming at higher availability, reduced network traffic, energy and cost, and a decentralised system. As a result of the five studies, this thesis develops a knowledge-driven framework to estimate and recognise multi-user activities at the level of fine-grained user actions. This framework integrates three complementary ontologies to conceptualise factual, fuzzy and uncertain knowledge about the environment and ADLs, together with time-series analysis and a discriminative sensing environment. Moreover, a distributed software architecture, multimodal sensor-based hardware prototypes, and supportive utility tools such as a simulator and a synthetic ADL data generator were developed to support the evaluation of the proposed approaches. The distributed system is platform-independent and is currently supported by an Android mobile application and web-browser-based client interfaces for retrieving information such as live sensor events and HAR results.
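
    As a rough, self-contained illustration of the fine-grained verification step (and not the thesis's actual fuzzy-OWL rules), the sketch below fuses two pieces of multimodal evidence with simple fuzzy memberships and accepts an action only above a satisfactory threshold; the sensor names, membership shapes and threshold values are assumptions.

```python
# A minimal fuzzy-verification sketch for a fine-grained action, with
# illustrative (hypothetical) sensors, membership ranges and threshold.

def rising_membership(value, low, high):
    """Degree of membership rising linearly from 0 at `low` to 1 at `high`."""
    if value <= low:
        return 0.0
    if value >= high:
        return 1.0
    return (value - low) / (high - low)

def verify_pouring_action(cup_pressure_delta, wrist_tilt_deg, threshold=0.7):
    """Fuse two pieces of evidence (cup weight change and wrist tilt) with a
    fuzzy AND (minimum) and accept the action above a satisfactory threshold."""
    weight_evidence = rising_membership(cup_pressure_delta, low=5.0, high=30.0)  # grams
    tilt_evidence = rising_membership(wrist_tilt_deg, low=20.0, high=60.0)       # degrees
    confidence = min(weight_evidence, tilt_evidence)
    return confidence >= threshold, confidence

print(verify_pouring_action(cup_pressure_delta=25.0, wrist_tilt_deg=55.0))  # (True, 0.8)
```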

    From Activity Recognition to Intention Recognition for Assisted Living Within Smart Homes

    The global population is aging; projections show that by 2050, more than 20% of the population will be aged over 64. This will lead to an increase in aging-related illness, a decrease in informal support, and ultimately issues with providing care for these individuals. Assistive smart homes provide a promising solution to some of these issues. Nevertheless, they currently have issues hindering their adoption. To help address some of these issues, this study introduces a novel approach to implementing assistive smart homes. The devised approach is based upon an intention recognition mechanism incorporated into an intelligent agent architecture. This approach is detailed and evaluated. Evaluation was performed across three scenarios. Scenario 1 involved a web interface, focusing on testing the intention recognition mechanism. Scenarios 2 and 3 involved retrofitting a home with sensors and providing assistance with activities over a period of 3 months. The average accuracy for these three scenarios was 100%, 64.4%, and 83.3%, respectively. Future work will extend and further evaluate this approach by implementing advanced sensor-filtering rules and evaluating more complex activities.
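
    A minimal sketch of one way an intention recognition mechanism can work, assuming intentions are represented as plans and matched against the observed action sequence; the plans, action names and assistance message are illustrative and are not the mechanism evaluated in the study.

```python
# A minimal intention-recognition sketch via plan-prefix matching
# (plans and actions below are hypothetical examples).

PLANS = {
    "make_breakfast": ["enter_kitchen", "open_fridge", "use_toaster", "use_kettle"],
    "leave_house":    ["enter_hallway", "pick_up_keys", "open_front_door"],
}

def recognise_intention(observed):
    """Return the intentions whose plans start with the observed actions."""
    return [name for name, plan in PLANS.items()
            if plan[:len(observed)] == observed]

def suggest_assistance(observed):
    """Offer help with the next step once a single intention matches."""
    candidates = recognise_intention(observed)
    if len(candidates) == 1:
        plan = PLANS[candidates[0]]
        if len(observed) < len(plan):
            return f"Detected intention '{candidates[0]}'; next step: {plan[len(observed)]}"
    return "Intention not yet unambiguous"

print(suggest_assistance(["enter_kitchen", "open_fridge"]))
```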

    Goal Lifecycles and Ontological Models for Intention Based Assistive Living within Smart Environments

    Current ambient assistive living solutions have adopted a traditional sensor-centric approach, involving data analysis and activity recognition to provide assistance to individuals. The reliance on sensors and activity recognition in this approach introduces issues with scalability and with the ability to model activity variations. This study introduces a novel approach to assistive living which intends to address these issues via a paradigm shift from a sensor-centric approach to a goal-oriented one. The goal-oriented approach focuses on the identification of user goals in order to proactively offer assistance through either pre-defined or dynamically constructed instructions. This paper introduces the architecture of this goal-oriented approach and describes an ontological goal model to serve as its basis. The use of this approach is illustrated in a case study which focuses on assisting a user with activities of daily living.
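
    A minimal sketch of a goal lifecycle as a small state machine, assuming states such as identified, activated, in progress, achieved and abandoned; the state names and transitions are illustrative and are not taken from the paper's ontological goal model.

```python
# A minimal goal-lifecycle sketch with hypothetical states and transitions.
from enum import Enum, auto

class GoalState(Enum):
    IDENTIFIED = auto()
    ACTIVATED = auto()
    IN_PROGRESS = auto()
    ACHIEVED = auto()
    ABANDONED = auto()

ALLOWED = {
    GoalState.IDENTIFIED:  {GoalState.ACTIVATED, GoalState.ABANDONED},
    GoalState.ACTIVATED:   {GoalState.IN_PROGRESS, GoalState.ABANDONED},
    GoalState.IN_PROGRESS: {GoalState.ACHIEVED, GoalState.ABANDONED},
    GoalState.ACHIEVED:    set(),
    GoalState.ABANDONED:   set(),
}

class Goal:
    def __init__(self, name):
        self.name = name
        self.state = GoalState.IDENTIFIED

    def transition(self, new_state):
        # Reject transitions not permitted by the lifecycle.
        if new_state not in ALLOWED[self.state]:
            raise ValueError(f"Illegal transition {self.state} -> {new_state}")
        self.state = new_state

goal = Goal("prepare_evening_meal")
goal.transition(GoalState.ACTIVATED)
goal.transition(GoalState.IN_PROGRESS)
goal.transition(GoalState.ACHIEVED)
print(goal.name, goal.state)
```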

    Ontology-based Activity Recognition Framework and Services

    This paper introduces an ontology-based integrated framework for activity modeling, activity recognition and activity model evolution. Central to the framework are ontological activity modeling and semantic-based activity recognition, supported by an iterative process that incrementally improves the completeness and accuracy of activity models. In addition, the paper presents a service-oriented architecture for the realization of the proposed framework, which can provide activity context-aware services in a scalable, distributed manner. The paper further describes and discusses the implementation and testing experience of the framework and services in the context of smart home-based assistive living.
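
    As a rough stand-in for the ontological reasoning (and not the framework's actual OWL-based implementation), the sketch below recognises an activity when the observed sensor objects satisfy all necessary conditions of a simplified activity description; the definitions are invented for illustration.

```python
# A minimal semantic-matching sketch: an activity is recognised when the
# observed sensor objects satisfy all necessary conditions of its simplified
# description (hypothetical activity definitions).

ACTIVITY_DEFINITIONS = {
    "MakeTea": {"required": {"kettle", "tea_bag", "cup"}, "location": "kitchen"},
    "WatchTV": {"required": {"tv_remote", "sofa"},        "location": "lounge"},
}

def recognise(observed_objects, location):
    """Return activities whose necessary conditions are all satisfied."""
    matches = []
    for activity, definition in ACTIVITY_DEFINITIONS.items():
        if definition["location"] == location and definition["required"] <= observed_objects:
            matches.append(activity)
    return matches

print(recognise({"kettle", "tea_bag", "cup", "spoon"}, "kitchen"))  # ['MakeTea']
```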