164 research outputs found

    Analysis of the hands in egocentric vision: A survey

    Egocentric vision (a.k.a. first-person vision - FPV) applications have thrived over the past few years, thanks to the availability of affordable wearable cameras and large annotated datasets. The position of the wearable camera (usually mounted on the head) allows recording exactly what the camera wearers have in front of them, in particular hands and manipulated objects. This intrinsic advantage enables the study of the hands from multiple perspectives: localizing hands and their parts within the images; understanding what actions and activities the hands are involved in; and developing human-computer interfaces that rely on hand gestures. In this survey, we review the literature that focuses on the hands using egocentric vision, categorizing the existing approaches into: localization (where are the hands or parts of them?); interpretation (what are the hands doing?); and application (e.g., systems that use egocentric hand cues to solve a specific problem). Moreover, a list of the most prominent datasets with hand-based annotations is provided.

    An Effective and Efficient Method for Detecting Hands in Egocentric Videos for Rehabilitation Applications

    Objective: Individuals with spinal cord injury (SCI) report upper limb function as their top recovery priority. To accurately represent the true impact of new interventions on patient function and independence, evaluation should occur in a natural setting. Wearable cameras can be used to monitor hand function at home, using computer vision to automatically analyze the resulting (egocentric) videos. A key step in this process, hand detection, is difficult to do robustly and reliably, hindering deployment of a complete monitoring system in the home and community. We propose an accurate and efficient hand detection method that uses a simple combination of existing detection and tracking algorithms. Methods: Detection, tracking, and combination methods were evaluated on a new hand detection dataset, consisting of 167,622 frames of egocentric videos collected from 17 individuals with SCI performing activities of daily living in a home simulation laboratory. Results: The F1-scores for the best detector and tracker alone (SSD and Median Flow) were 0.90 ± 0.07 and 0.42 ± 0.18, respectively. The best combination method, in which a detector was used to initialize and reset a tracker, resulted in an F1-score of 0.87 ± 0.07 while being two times faster than the fastest detector alone. Conclusion: The combination of the fastest detector and best tracker improved the accuracy over online trackers while improving the speed of detectors. Significance: The method proposed here, in combination with wearable cameras, will help clinicians directly measure hand function in a patient's daily life at home, enabling independence after SCI.
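    The detector-initializes-and-resets-tracker scheme described in this abstract can be sketched as follows. The Detector and Tracker classes below are hypothetical stand-ins (e.g., for an SSD model and a Median Flow tracker); the control loop, not the models, is the point of the sketch.

    ```python
    class Detector:
        """Stand-in for a slow but accurate hand detector (e.g., SSD)."""
        def detect(self, frame):
            # Return a bounding box (x, y, w, h), or None if no hand is found.
            return frame.get("hand_box")

    class Tracker:
        """Stand-in for a fast online tracker (e.g., Median Flow)."""
        def __init__(self):
            self.box = None
        def init(self, frame, box):
            self.box = box
        def update(self, frame):
            # Return (success, box); fail when the hand leaves the frame.
            if frame.get("hand_box") is None:
                return False, None
            return True, frame["hand_box"]

    def detect_and_track(frames, detect_every=10):
        """Run the detector periodically and whenever the tracker fails;
        otherwise rely on the cheaper tracker for intermediate frames."""
        detector, tracker = Detector(), Tracker()
        boxes, tracking = [], False
        for i, frame in enumerate(frames):
            box = None
            if tracking and i % detect_every != 0:
                ok, box = tracker.update(frame)
                if not ok:
                    tracking = False
            if not tracking or i % detect_every == 0:
                box = detector.detect(frame)
                if box is not None:
                    tracker.init(frame, box)
                    tracking = True
                else:
                    tracking = False
            boxes.append(box)
        return boxes
    ```

    The speed gain comes from calling the expensive detector only on a fraction of the frames, while detector resets keep the tracker from drifting.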

    Detecting Hands in Egocentric Videos: Towards Action Recognition

    Recently, there has been a growing interest in analyzing human daily activities from data collected by wearable cameras. Since the hands are involved in a vast set of daily tasks, detecting hands in egocentric images is an important step towards the recognition of a variety of egocentric actions. However, hand detection is not a trivial task, owing both to extreme illumination changes in egocentric images and to the large intrinsic variability of hand appearance. We propose a hand detector that exploits skin modeling for fast hand proposal generation and Convolutional Neural Networks for hand recognition. We tested our method on the UNIGE-HANDS dataset and we showed that the proposed approach achieves competitive hand detection results.
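    The skin-modeling proposal stage can be illustrated with a minimal sketch. This uses the widely cited fixed Cr/Cb box (133 ≤ Cr ≤ 173, 77 ≤ Cb ≤ 127) as the skin model, which is an assumption here, not the paper's own model, and it omits the CNN verification stage entirely.

    ```python
    import numpy as np

    def skin_mask(rgb):
        """Boolean mask of skin-coloured pixels for an HxWx3 uint8 RGB image,
        using fixed thresholds in the YCrCb colour space (an assumed model)."""
        rgb = rgb.astype(np.float64)
        r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
        # standard RGB -> YCrCb conversion
        y = 0.299 * r + 0.587 * g + 0.114 * b
        cr = (r - y) * 0.713 + 128.0
        cb = (b - y) * 0.564 + 128.0
        # classic fixed skin-colour box in the Cr/Cb plane
        return (cr >= 133) & (cr <= 173) & (cb >= 77) & (cb <= 127)
    ```

    Connected regions of the resulting mask would then serve as cheap hand proposals for a CNN to accept or reject.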

    Designing an Egocentric Video-Based Dashboard to Report Hand Performance Measures for Outpatient Rehabilitation of Cervical Spinal Cord Injury

    Background: Functional use of the upper extremities (UEs) is a top recovery priority for individuals with cervical spinal cord injury (cSCI), but the inability to monitor recovery at home and limitations in hand function outcome measures impede optimal recovery. Objectives: We developed a framework using wearable cameras to monitor hand use at home and aimed to identify the best way to report information to clinicians. Methods: A dashboard was iteratively developed with clinician (n = 7) input through focus groups and interviews, creating low-fidelity prototypes based on recurring feedback until no new information emerged. Affinity diagramming was used to identify themes and subthemes from interview data. User stories were developed and mapped to specific features to create a high-fidelity prototype. Results: Useful elements identified for a dashboard reporting hand performance included summaries to interpret graphs, a breakdown of hand posture and activity to provide context, video snippets to qualitatively view hand use at home, patient notes to understand patient satisfaction or struggles, and time series graphing of metrics to measure trends over time. Conclusion: Involving end-users in the design process and breaking down user requirements into user stories helped identify necessary interface elements for reporting hand performance metrics to clinicians. Clinicians recognized the dashboard's potential to monitor rehabilitation progress, provide feedback on hand use, and track progress over time. Concerns were raised about implementation into clinical practice; further inquiry is therefore needed to determine the tool's feasibility and usefulness for individuals with UE impairments.

    Are You "Tilting at Windmills" or Undertaking a Valid Clinical Trial?

    In this review, several aspects surrounding the choice of a therapeutic intervention and the conduct of clinical trials are discussed. Some of the background for why human studies have evolved to their current state is also included. Specifically, the following questions have been addressed: 1) What criteria should be used to determine whether a scientific discovery or invention is worthy of translation to human application? 2) What recent scientific advance warrants a deeper understanding of clinical trials by everyone? 3) What are the different types and phases of a clinical trial? 4) What characteristics of a human disorder should be noted, tracked, or stratified for a clinical trial, and what inclusion/exclusion criteria are important to enrolling appropriate trial subjects? 5) What are the different study designs that can be used in a clinical trial program? 6) What confounding factors can alter the accurate interpretation of clinical trial outcomes? 7) What are the success rates of clinical trials and what can we learn from previous clinical trials? 8) What are the essential principles for the conduct of valid clinical trials?

    Hand contour detection in wearable camera video using an adaptive histogram region of interest

    BACKGROUND: Monitoring hand function at home is needed to better evaluate the effectiveness of rehabilitation interventions. Our objective is to develop wearable computer vision systems for hand function monitoring. The specific aim of this study is to develop an algorithm that can identify hand contours in video from a wearable camera that records the user’s point of view, without the need for markers. METHODS: The two-step image processing approach for each frame consists of: (1) Detecting a hand in the image, and choosing one seed point that lies within the hand. This step is based on a priori models of skin colour. (2) Identifying the contour of the region containing the seed point. This is accomplished by adaptively determining, for each frame, the region within a colour histogram that corresponds to hand colours, and backprojecting the image using the reduced histogram. RESULTS: In four test videos relevant to activities of daily living, the hand detector classification accuracy was 88.3%. The contour detection results were compared to manually traced contours in 97 test frames, and the median F-score was 0.86. CONCLUSION: This algorithm will form the basis for a wearable computer-vision system that can monitor and log the interactions of the hand with its environment.
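    Step (2) above, adaptive histogram reduction followed by backprojection, can be sketched in a simplified form. The sketch uses a single colour channel and a `keep_frac` pruning threshold, both simplifying assumptions; a real system would typically use a 2-D hue/saturation histogram.

    ```python
    import numpy as np

    def backproject(channel, seed_region, bins=16, keep_frac=0.1):
        """Per-pixel likelihood that a pixel's colour matches the seed region.

        channel:     HxW array of colour values in [0, 256)
        seed_region: (row_slice, col_slice) believed to lie inside the hand
        keep_frac:   bins with mass below keep_frac * max are zeroed,
                     adaptively reducing the histogram to hand colours
        """
        hist, edges = np.histogram(channel[seed_region], bins=bins, range=(0, 256))
        hist = hist.astype(np.float64)
        hist[hist < keep_frac * hist.max()] = 0.0   # reduce the histogram
        hist /= hist.max()                          # normalise to [0, 1]
        # map every pixel to its histogram bin and look up its likelihood
        idx = np.clip(np.digitize(channel, edges) - 1, 0, bins - 1)
        return hist[idx]
    ```

    Thresholding the returned map and extracting the contour around the seed point would complete the per-frame pipeline.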

    Tutorial: A guide to techniques for analysing recordings from the peripheral nervous system

    The nervous system, through a combination of conscious and automatic processes, enables the regulation of the body and its interactions with the environment. The peripheral nervous system is an excellent target for technologies that seek to modulate, restore or enhance these abilities, as it carries sensory and motor information that most directly relates to a target organ or function. However, many applications require a combination of both an effective peripheral nerve interface and effective signal processing techniques to provide selective and stable recordings. While there are many reviews on the design of peripheral nerve interfaces, reviews of data analysis techniques and translational considerations are limited. Thus, this tutorial aims to support new and existing researchers in understanding the general guiding principles, and introduces a taxonomy for electrode configurations, techniques, and translational models to consider.

    A Fast EEG Forecasting Algorithm for Phase-Locked Transcranial Electrical Stimulation of the Human Brain

    A growing body of research suggests that non-invasive electrical brain stimulation can more effectively modulate neural activity when phase-locked to the underlying brain rhythms. Transcranial alternating current stimulation (tACS) can potentially stimulate the brain in phase with its natural oscillations as recorded by electroencephalography (EEG), but matching these oscillations is a challenging problem due to the complex and time-varying nature of the EEG signals. Here we address this challenge by developing and testing a novel approach intended to deliver tACS phase-locked to the activity of the underlying brain region in real time. The approach extracts phase and frequency from a segment of EEG, then forecasts the signal to control the stimulation. A careful tuning of the EEG segment length and prediction horizon is required and has been investigated here for different EEG frequency bands. The algorithm was tested on EEG data from 5 healthy volunteers. Algorithm performance was quantified in terms of phase-locking values across a variety of EEG frequency bands. Phase-locking performance was found to be consistent across individuals and recording locations. With current parameters, the algorithm performs best when tracking oscillations in the alpha band (8–13 Hz), with a phase-locking value of 0.77 ± 0.08. Performance was maximized when the frequency band of interest had a dominant frequency that was stable over time. The algorithm performs faster, and provides better phase-locked stimulation, compared to other recently published algorithms devised for this purpose. The algorithm is suitable for use in future studies of phase-locked tACS in preclinical and clinical applications.
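    The core forecasting idea, estimate the dominant frequency and phase from a short segment, then extrapolate a sinusoid over the prediction horizon, can be sketched with a plain FFT. This is a generic illustration under the assumption of a single stable oscillation landing near an FFT bin, not the paper's exact algorithm.

    ```python
    import numpy as np

    def forecast_oscillation(segment, fs, horizon_s):
        """Forecast `horizon_s` seconds of signal past the end of `segment`,
        assuming one dominant, stable oscillation (e.g., alpha-band EEG)."""
        n = len(segment)
        seg = segment - segment.mean()
        spectrum = np.fft.rfft(seg)
        freqs = np.fft.rfftfreq(n, d=1.0 / fs)
        k = np.argmax(np.abs(spectrum[1:])) + 1   # dominant non-DC bin
        f0 = freqs[k]
        amp = 2.0 * np.abs(spectrum[k]) / n       # amplitude of that component
        phase0 = np.angle(spectrum[k])            # phase at segment start
        # advance the phase to the end of the segment, then extrapolate
        t = np.arange(1, int(horizon_s * fs) + 1) / fs
        return amp * np.cos(2 * np.pi * f0 * (n / fs + t) + phase0)
    ```

    The segment length trades frequency resolution against responsiveness, and the horizon must cover the processing and stimulation latency, which is the tuning problem the abstract describes.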

    Measuring hand use in the home after cervical spinal cord injury using egocentric video

    Background: Egocentric video has recently emerged as a potential solution for monitoring hand function in individuals living with tetraplegia in the community, especially for its ability to detect functional use in the home environment. Objective: To develop and validate a wearable vision-based system for measuring hand use in the home among individuals living with tetraplegia. Methods: Several deep learning algorithms for detecting functional hand-object interactions were developed and compared. The most accurate algorithm was used to extract measures of hand function from 65 hours of unscripted video recorded at home by 20 participants with tetraplegia. These measures were: the percentage of interaction time over total recording time (Perc); the average duration of individual interactions (Dur); and the number of interactions per hour (Num). To demonstrate the clinical validity of the technology, egocentric measures were correlated with validated clinical assessments of hand function and independence (Graded Redefined Assessment of Strength, Sensibility and Prehension - GRASSP, Upper Extremity Motor Score - UEMS, and Spinal Cord Independence Measure - SCIM). Results: Hand-object interactions were automatically detected with a median F1-score of 0.80 (0.67-0.87). Our results demonstrated that higher UEMS and better prehension were related to greater time spent interacting, whereas higher SCIM and better hand sensation were associated with a higher number of interactions performed during the egocentric video recordings. Conclusions: For the first time, measures of hand function automatically estimated in an unconstrained environment in individuals with tetraplegia have been validated against internationally accepted measures of hand function. Future work will necessitate a formal evaluation of the reliability and responsiveness of the egocentric-based performance measures for hand use.
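    Given a per-frame binary sequence of detected hand-object interactions, the three measures named above (Perc, Dur, Num) reduce to run-length bookkeeping. The sketch below assumes such a 0/1 sequence and a known frame rate; how interactions are detected and segmented is the paper's contribution and is not reproduced here.

    ```python
    import numpy as np

    def interaction_measures(interacting, fps):
        """Return (Perc in %, mean Dur in s, Num per hour) from a 0/1
        per-frame interaction sequence recorded at `fps` frames/second."""
        x = np.asarray(interacting, dtype=int)
        total_s = len(x) / fps
        # locate runs of consecutive 1s: each run is one interaction
        padded = np.concatenate(([0], x, [0]))
        diff = np.diff(padded)
        starts = np.where(diff == 1)[0]
        ends = np.where(diff == -1)[0]
        durations_s = (ends - starts) / fps
        perc = 100.0 * x.sum() / len(x)                       # Perc
        dur = durations_s.mean() if len(durations_s) else 0.0  # Dur
        num = len(durations_s) / (total_s / 3600.0)            # Num
        return perc, dur, num
    ```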