    Heart rate monitoring, activity recognition, and recommendation for e-coaching

    Equipped with hardware such as accelerometers and heart rate sensors, wearables enable the measurement of physical activity and heart rate. However, the accuracy of these heart rate measurements is still unclear, and the coupling with activity recognition is often missing in health apps. This study evaluates heart rate monitoring with four different device types: a specialized sports device with a chest strap, a fitness tracker, a smart watch, and a smartphone using photoplethysmography. In a state of rest, similar measurement results are obtained with the four devices. During physical activities, the fitness tracker, smart watch, and smartphone measure sudden variations in heart rate with a delay, due to movements of the wrist. Moreover, this study showed that physical activities, such as squats and dumbbell curls, can be recognized with fitness trackers. By combining heart rate monitoring and activity recognition, personal suggestions for physical activities are generated using a tag-based recommender and a rule-based filter.
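
    The sketch below illustrates the general idea of coupling a heart-rate-derived intensity zone with a recognized activity before filtering tag-matched suggestions. It is not the authors' implementation; the function names, heart-rate zone thresholds, and suggestion catalog are illustrative assumptions.

```python
# Minimal sketch: combine a heart-rate zone with a recognized activity to
# filter and rank candidate suggestions. Thresholds and items are illustrative.

def hr_zone(heart_rate_bpm: float, max_hr: float = 190.0) -> str:
    """Map a heart-rate reading to a coarse intensity zone (assumed thresholds)."""
    ratio = heart_rate_bpm / max_hr
    if ratio < 0.5:
        return "rest"
    if ratio < 0.7:
        return "moderate"
    return "vigorous"

# Candidate suggestions annotated with tags, as a tag-based recommender might store them.
SUGGESTIONS = [
    {"name": "brisk walk", "tags": {"cardio", "low-impact"}},
    {"name": "squats", "tags": {"strength", "legs"}},
    {"name": "dumbbell curls", "tags": {"strength", "arms"}},
]

def recommend(recognized_activity: str, heart_rate_bpm: float, preferred_tags: set) -> list:
    """Rule-based filter: skip strength suggestions while the user is already in a
    vigorous zone, then rank the remaining items by tag overlap."""
    zone = hr_zone(heart_rate_bpm)
    candidates = []
    for item in SUGGESTIONS:
        if zone == "vigorous" and "strength" in item["tags"]:
            continue  # avoid adding load on top of vigorous cardio
        score = len(item["tags"] & preferred_tags)
        candidates.append((score, item["name"]))
    return [name for score, name in sorted(candidates, reverse=True) if score > 0]

print(recommend("walking", heart_rate_bpm=120, preferred_tags={"cardio"}))
```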

    Machine Learning Based Physical Activity Extraction for Unannotated Acceleration Data

    Sensor-based human activity recognition (HAR) is an emerging and challenging research area. The physical activity of people has been associated with many health benefits and even with a reduced risk of different diseases. It is possible to collect sensor data related to people's physical activities with wearable devices and embedded sensors, for example in smartphones and smart environments. HAR has been successful in recognizing physical activities with machine learning methods. However, annotating sensor data is a critical challenge in HAR. Most existing approaches use supervised machine learning methods, which means that true labels need to be given to the data when training a machine learning model. Supervised deep learning methods have outperformed traditional machine learning methods in HAR, but they require an even more extensive amount of data and true labels. In this thesis, machine learning methods are used to develop a solution that can recognize physical activity (e.g., walking and sedentary time) from unannotated acceleration data collected using a wearable accelerometer device. It is shown to be sufficient to collect and annotate data from the physical activity of only one person. Supervised classifiers can be trained with this small, labeled acceleration dataset, and more training data can be obtained in a semi-supervised setting by leveraging knowledge from available unannotated data. The semi-supervised En-Co-Training method is used with the traditional supervised machine learning methods K-Nearest Neighbor (KNN) and Random Forest (RF). In addition, activity intensities produced by the cut-point analysis of the OMGUI software are used as reference information to increase the confidence of correctly selecting the pseudo-labels that are added to the training data. A new metric is suggested to help evaluate reliability when no true labels are available: the fraction of all predictions whose intensity is correct according to the cut-point analysis of the OMGUI software. The reliability of the supervised KNN and RF classifiers reaches 88% accuracy and a C-index value of 0.93, while the accuracy of K-means clustering remains at 72% when testing the models on labeled acceleration data. The initial supervised classifiers and the classifiers retrained in a semi-supervised setting are tested on unlabeled data collected from 12 people and measured with the new metric. The overall results improve from 96-98% to 98-99%. For activities that were more challenging for the initial classifiers, the results improve from 55-81% to 67-81% for taking a walk and from 0-95% to 95-98% for jogging. It is shown that the results of the KNN and RF classifiers consistently increase in the semi-supervised setting when tested on unannotated, real-life data from 12 people.
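
    The following sketch shows the general shape of one pseudo-labeling round in the spirit of the description above: two supervised classifiers (KNN and RF) are trained on a small labeled set, and unlabeled windows are accepted as pseudo-labels only when both classifiers agree and the predicted activity's intensity matches a cut-point reference. The cut-point function, its thresholds, the activity-to-intensity mapping, and the feature extraction are placeholders, not the OMGUI values or the exact En-Co-Training procedure used in the thesis.

```python
# One semi-supervised pseudo-labeling round (illustrative, not the thesis pipeline).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier

# Assumed mapping from activity label to intensity class.
ACTIVITY_INTENSITY = {"sedentary": "light", "walking": "moderate", "jogging": "vigorous"}

def cutpoint_intensity(window: np.ndarray) -> str:
    """Placeholder for cut-point analysis: intensity from mean acceleration magnitude."""
    mag = np.mean(np.linalg.norm(window, axis=1))
    if mag < 1.1:
        return "light"
    return "moderate" if mag < 1.8 else "vigorous"

def features(window: np.ndarray) -> np.ndarray:
    """Simple per-window features: per-axis mean and standard deviation."""
    return np.concatenate([window.mean(axis=0), window.std(axis=0)])

def pseudo_label_round(X_lab, y_lab, unlabeled_windows):
    """Train KNN and RF on the labeled set, then add unlabeled windows whose two
    predictions agree and whose predicted intensity matches the cut-point reference."""
    X_u = np.array([features(w) for w in unlabeled_windows])
    knn = KNeighborsClassifier(n_neighbors=5).fit(X_lab, y_lab)
    rf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_lab, y_lab)
    new_X, new_y = [], []
    for w, x, p_knn, p_rf in zip(unlabeled_windows, X_u, knn.predict(X_u), rf.predict(X_u)):
        if p_knn == p_rf and ACTIVITY_INTENSITY[p_knn] == cutpoint_intensity(w):
            new_X.append(x)
            new_y.append(p_knn)
    if not new_X:
        return X_lab, y_lab
    return np.vstack([X_lab, np.array(new_X)]), np.concatenate([y_lab, np.array(new_y)])
```

    The same intensity check also suggests the reliability metric mentioned above: on unlabeled data, count the fraction of predictions whose mapped intensity agrees with the cut-point analysis.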

    A fast and robust deep convolutional neural networks for complex human activity recognition using smartphone

    Playing a significant role in healthcare and sports applications, human activity recognition (HAR) techniques are capable of monitoring humans' daily behavior. This has spurred the demand for intelligent sensors and has given rise to the explosive growth of wearable and mobile devices, which provide an abundance of human activity data (big data). Powerful algorithms are required to analyze these heterogeneous and high-dimensional streaming data efficiently. This paper proposes a novel fast and robust deep convolutional neural network structure (FR-DCNN) for human activity recognition (HAR) using a smartphone. It enhances the effectiveness and extends the information of the raw data collected from the inertial measurement unit (IMU) sensors by integrating a series of signal processing algorithms and a signal selection module. It enables a fast computational method for building the DCNN classifier by adding a data compression module. Experimental results on the sampled 12-complex-activity dataset show that the proposed FR-DCNN model is the best method for fast computation and high-accuracy recognition. The FR-DCNN model needs only 0.0029 s to predict an activity online with 95.27% accuracy. Meanwhile, it takes only 88 s on average to establish the DCNN classifier on the compressed dataset with little precision loss (94.18%).
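
    As a rough illustration of what a DCNN classifier over windowed IMU data looks like, the sketch below defines a small 1D convolutional network in Keras. It is a generic baseline under assumed window length, channel count, and layer sizes; it does not reproduce the FR-DCNN architecture or its signal-selection and data-compression modules.

```python
# A minimal 1D convolutional classifier for windowed IMU data (illustrative).
import tensorflow as tf

WINDOW_LEN = 128   # samples per window (assumption)
N_CHANNELS = 6     # e.g., 3-axis accelerometer + 3-axis gyroscope (assumption)
N_CLASSES = 12     # the 12 complex activities mentioned above

model = tf.keras.Sequential([
    tf.keras.Input(shape=(WINDOW_LEN, N_CHANNELS)),
    tf.keras.layers.Conv1D(64, kernel_size=5, activation="relu"),
    tf.keras.layers.MaxPooling1D(2),
    tf.keras.layers.Conv1D(128, kernel_size=5, activation="relu"),
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(N_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```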

    Data processing of physiological sensor data and alarm determination utilising activity recognition

    Current physiological sensors are passive and transmit sensed data to a monitoring centre (MC) through a wireless body area network (WBAN) without processing the data intelligently. We propose a solution that discerns data requestors in order to prioritise and infer data, reducing transactions and conserving battery power, which are important requirements of mobile health (mHealth). However, alarms cannot be determined reliably without knowing the activity of the user. For example, a heart rate of 170 beats per minute can be normal during exercise, but an alarm should be raised if the same value is sensed during sleep. To solve this problem, we suggest utilising existing activity recognition (AR) applications. Most health-related wearable devices include accelerometers along with physiological sensors. This paper presents a novel approach to combining physiological data with AR so that they provide not only improved and efficient services, such as alarm determination, but also richer health information, which may provide content for new markets as well as additional application services such as mobile health converged with aged care services. This has been verified by experimental tests using vital signs such as heart rate, respiration rate, and body temperature, with AR accelerometer sensors integrated into an Android app.
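
    A minimal sketch of activity-aware alarm determination follows, mirroring the example above: 170 bpm may be normal during exercise but should trigger an alarm during sleep. The per-activity ranges are illustrative assumptions, not values from the paper.

```python
# Activity-aware heart-rate alarm check (illustrative thresholds).

NORMAL_HR_RANGE = {            # (min_bpm, max_bpm) per recognized activity (assumed)
    "sleeping": (40, 100),
    "sedentary": (50, 110),
    "walking": (60, 140),
    "exercising": (80, 185),
}

def heart_rate_alarm(heart_rate_bpm: float, activity: str) -> bool:
    """Return True if the reading is abnormal for the current activity context."""
    low, high = NORMAL_HR_RANGE.get(activity, (50, 120))  # conservative default range
    return not (low <= heart_rate_bpm <= high)

print(heart_rate_alarm(170, "exercising"))  # False: plausible during exercise
print(heart_rate_alarm(170, "sleeping"))    # True: raise an alarm during sleep
```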

    PARCIV: Recognizing physical activities having complex interclass variations using semantic data of smartphone

    Smartphones are equipped with precise hardware sensors, including an accelerometer, a gyroscope, and a magnetometer. These devices provide real-time semantic data that can be used to recognize daily-life physical activities for personalized smart health assessment. Existing studies focus on the recognition of simple physical activities but fall short in accurately recognizing physical activities with complex interclass variations. Therefore, this research focuses on the accurate recognition of physical activities with complex interclass variations. We propose a two-layered approach called PARCIV that first clusters similar activities based on semantic data and then recognizes them using a machine learning classifier. Our two-layered approach first groups the highly indistinguishable activities into clusters to avoid misclassification with other, distinguishable activities, and thereafter recognizes them at a fine-grained level within each cluster. To evaluate our approach, we built an Android application that collects labeled smartphone sensor data from 10 participants while they perform activities. PARCIV recognizes distinguishable as well as indistinguishable activities with a high accuracy of 99% on the self-collected dataset. Furthermore, PARCIV achieves 95% accuracy on the publicly available dataset used by state-of-the-art studies. PARCIV outperforms various state-of-the-art studies by 8%-17% for both simple and complex activities.
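
    The sketch below shows one way a two-layered recognizer of this kind can be structured: activity classes are first grouped so that easily confused classes end up together, a first-layer classifier predicts the group, and a per-group classifier resolves the fine-grained label. The grouping via k-means on per-class mean feature vectors and the choice of Random Forest are assumptions for illustration, not PARCIV's actual design.

```python
# Two-layered activity recognizer: group prediction, then per-group classification.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestClassifier

def train_two_layer(X: np.ndarray, y: np.ndarray, n_groups: int = 3):
    """X: feature matrix, y: activity labels. Assumes each resulting group is non-empty."""
    classes = np.unique(y)
    class_means = np.array([X[y == c].mean(axis=0) for c in classes])
    grouping = KMeans(n_clusters=n_groups, n_init=10, random_state=0).fit(class_means)
    class_to_group = dict(zip(classes, grouping.labels_))

    # Layer 1: predict which group of similar activities a sample belongs to.
    group_labels = np.array([class_to_group[c] for c in y])
    layer1 = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, group_labels)

    # Layer 2: one fine-grained classifier per group.
    layer2 = {}
    for g in range(n_groups):
        mask = group_labels == g
        layer2[g] = RandomForestClassifier(n_estimators=100, random_state=0).fit(X[mask], y[mask])
    return layer1, layer2

def predict_two_layer(layer1, layer2, X: np.ndarray) -> np.ndarray:
    """First pick the group, then resolve the activity within that group."""
    groups = layer1.predict(X)
    return np.array([layer2[g].predict(x.reshape(1, -1))[0] for g, x in zip(groups, X)])
```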

    A collaborative healthcare framework for shared healthcare plan with ambient intelligence

    The rapid proliferation of Internet of Things (IoT) devices has driven the development of collaborative healthcare frameworks to support the next-generation healthcare industry in delivering quality medical care. This paper presents a generalized collaborative framework named Collaborative Shared Healthcare Plan (CSHCP) for the cognitive health and fitness assessment of people using ambient intelligence and machine learning techniques. CSHCP supports daily physical activity recognition, monitoring, and assessment, and generates a shared healthcare plan based on collaboration among different stakeholders: doctors, patient guardians, and close community circles. The proposed framework shows promising outcomes compared to existing studies. Furthermore, it enhances team communication, coordination, and long-term management of healthcare information to provide more efficient and reliable shared healthcare plans.

    Automatic detection of faults in race walking. A comparative analysis of machine-learning algorithms fed with inertial sensor data

    The validity of results in race walking is often questioned due to subjective decisions in the detection of faults. This study aims to compare machine-learning algorithms fed with data gathered from inertial sensors placed on lower-limb segments to define the best-performing classifiers for the automatic detection of illegal steps. Eight race walkers were enrolled, and linear accelerations and angular velocities of the pelvis, thighs, shanks, and feet were acquired by seven inertial sensors. The experimental protocol consisted of two repetitions of three laps of 250 m, one performed with regular race walking, one with loss-of-contact faults, and one with knee-bent faults. The performance of 108 classifiers was evaluated in terms of accuracy, recall, precision, F1-score, and goodness index. Overall, linear accelerations proved more informative than angular velocities. Among the classifiers, those based on the support vector machine (SVM) were the most accurate. In particular, the quadratic SVM fed with shank linear accelerations was the best-performing classifier, with an F1-score and a goodness index of 0.89 and 0.11, respectively. The results open the possibility of using a wearable device for the automatic detection of faults in race walking competitions.
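
    As a rough illustration of the best-performing configuration reported above, the sketch below trains a quadratic SVM (polynomial kernel of degree 2) on features extracted from shank linear accelerations. The windowing, feature set, and hyperparameters are illustrative assumptions and do not reproduce the paper's exact pipeline or evaluation.

```python
# Quadratic SVM on shank acceleration features for race-walking fault detection (sketch).
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def step_features(acc_window: np.ndarray) -> np.ndarray:
    """Simple per-step features from a (samples x 3) shank linear-acceleration window."""
    mag = np.linalg.norm(acc_window, axis=1)
    return np.array([mag.mean(), mag.std(), mag.max(), mag.min()])

def train_fault_detector(windows, labels):
    """labels: 'regular', 'loss_of_contact', or 'knee_bent' per step/window."""
    X = np.array([step_features(w) for w in windows])
    clf = make_pipeline(StandardScaler(), SVC(kernel="poly", degree=2, C=1.0))
    return clf.fit(X, np.array(labels))
```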