
    Stratified Transfer Learning for Cross-domain Activity Recognition

    In activity recognition, it is often expensive and time-consuming to acquire sufficient activity labels. To solve this problem, transfer learning leverages the labeled samples from a source domain to annotate a target domain that has few or no labels. Existing approaches typically learn a global domain shift while ignoring the intra-affinity between classes, which hinders the performance of the algorithms. In this paper, we propose a novel and general cross-domain learning framework that exploits the intra-affinity of classes to perform intra-class knowledge transfer. The proposed framework, referred to as Stratified Transfer Learning (STL), can dramatically improve the classification accuracy for cross-domain activity recognition. Specifically, STL first obtains pseudo labels for the target domain via a majority voting technique. Then, it performs intra-class knowledge transfer iteratively to transform both domains into the same subspaces. Finally, the labels of the target domain are obtained via a second annotation. To evaluate the performance of STL, we conduct comprehensive experiments on three large public activity recognition datasets (i.e., OPPORTUNITY, PAMAP2, and UCI DSADS), which demonstrate that STL significantly outperforms other state-of-the-art methods w.r.t. classification accuracy (an improvement of 7.68%). Furthermore, we extensively investigate the performance of STL across different degrees of similarity and activity levels between domains. We also discuss the potential of STL in other pervasive computing applications to provide empirical experience for future research.
    Comment: 10 pages; accepted by IEEE PerCom 2018; full paper (camera-ready version).
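    The abstract outlines three steps: majority-voting pseudo labels, intra-class transfer, and a second annotation. Below is a minimal, hypothetical sketch of that pipeline, assuming NumPy and scikit-learn; the per-class mean alignment is only a simple stand-in for the paper's iterative kernel-based intra-class subspace transform, and all function and parameter names are illustrative, not the authors' code.

```python
# Hypothetical sketch of the STL pipeline described in the abstract;
# not the authors' implementation.
import numpy as np
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

def stl_predict(Xs, ys, Xt):
    """Label the unlabeled target domain Xt using the labeled source (Xs, ys)."""
    # Step 1: pseudo labels for the target domain via majority voting.
    voter = VotingClassifier(
        estimators=[("knn", KNeighborsClassifier()),
                    ("svm", SVC()),
                    ("rf", RandomForestClassifier())],
        voting="hard")
    voter.fit(Xs, ys)
    yt_pseudo = voter.predict(Xt)

    # Step 2: intra-class knowledge transfer. A per-class mean shift stands
    # in here for the paper's iterative subspace transformation.
    Xs_aligned = Xs.astype(float)
    for c in np.unique(ys):
        src, tgt = (ys == c), (yt_pseudo == c)
        if tgt.any():
            Xs_aligned[src] += Xt[tgt].mean(axis=0) - Xs[src].mean(axis=0)

    # Step 3: second annotation -- retrain on the aligned source samples
    # and relabel the target domain.
    clf = KNeighborsClassifier(n_neighbors=1).fit(Xs_aligned, ys)
    return clf.predict(Xt)
```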

    Cross-position Activity Recognition with Stratified Transfer Learning

    Human activity recognition aims to recognize the activities of daily living by utilizing the sensors on different body parts. However, when the labeled data from a certain body position (i.e., the target domain) are missing, how can we leverage the data from other positions (i.e., the source domains) to help learn the activity labels of this position? When several source domains are available, it is often difficult to select the source domain most similar to the target domain. With the selected source domain, we then need to perform accurate knowledge transfer between domains. Existing methods only learn the global distance between domains while ignoring their local properties. In this paper, we propose a Stratified Transfer Learning (STL) framework to perform both source domain selection and knowledge transfer. STL is based on our proposed Stratified distance, which captures the local property of domains. STL consists of two components: Stratified Domain Selection (STL-SDS) selects the source domain most similar to the target domain, and Stratified Activity Transfer (STL-SAT) performs accurate knowledge transfer. Extensive experiments on three public activity recognition datasets demonstrate the superiority of STL. Furthermore, we extensively investigate the performance of transfer learning across different degrees of similarity and activity levels between domains. We also discuss the potential applications of STL in other fields of pervasive computing for future research.
    Comment: Submitted to Pervasive and Mobile Computing as an extension to the PerCom 18 paper; first revision. arXiv admin note: substantial text overlap with arXiv:1801.0082
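    As a rough illustration of the source-selection step (STL-SDS) described above, a stratified, class-wise comparison of candidate source domains might look like the sketch below, assuming NumPy and scikit-learn. The class-mean distance is a simplified stand-in for the paper's Stratified distance, and every name here is hypothetical rather than the authors' API.

```python
# Hypothetical sketch of stratified source-domain selection (STL-SDS style);
# not the authors' implementation.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def stratified_distance(Xs, ys, Xt):
    """Class-wise distance between a labeled source and an unlabeled target."""
    # Pseudo-label the target with a classifier trained on this source,
    # then compare source and target class means class by class.
    yt_pseudo = KNeighborsClassifier().fit(Xs, ys).predict(Xt)
    dists = [np.linalg.norm(Xs[ys == c].mean(axis=0) -
                            Xt[yt_pseudo == c].mean(axis=0))
             for c in np.unique(ys) if (yt_pseudo == c).any()]
    return float(np.mean(dists))

def select_source(source_domains, Xt):
    """Pick the (Xs, ys) pair with the smallest stratified distance to Xt."""
    return min(source_domains, key=lambda d: stratified_distance(d[0], d[1], Xt))
```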