Detecting Hands in Egocentric Videos: Towards Action Recognition
Recently, there has been a growing interest in analyzing human daily
activities from data collected by wearable cameras. Since the hands are
involved in a vast set of daily tasks, detecting hands in egocentric images is
an important step towards the recognition of a variety of egocentric actions.
However, besides extreme illumination changes in egocentric images, hand
detection is not a trivial task because of the large intrinsic variability of
hand appearance. We propose a hand detector that exploits skin modeling for
fast hand proposal generation and Convolutional Neural Networks for hand
recognition. We tested our method on the UNIGE-HANDS dataset and showed that
the proposed approach achieves competitive hand detection results.
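A minimal sketch of this two-stage idea (skin-colour region proposals followed by a CNN hand/no-hand classifier), using OpenCV. The HSV thresholds, crop size, and classifier interface are illustrative assumptions, not the values or model used in the paper.

```python
# Sketch: skin-colour hand-proposal generation followed by CNN scoring.
import cv2
import numpy as np

def skin_proposals(frame_bgr, min_area=500):
    """Return bounding boxes of skin-coloured regions as hand proposals."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # Loose HSV skin range (assumed for illustration).
    mask = cv2.inRange(hsv, (0, 40, 60), (25, 180, 255))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) >= min_area]

def classify_proposal(frame_bgr, box, cnn=None):
    """Score a proposal with a CNN; a constant stub is returned if no model is given."""
    x, y, w, h = box
    crop = cv2.resize(frame_bgr[y:y + h, x:x + w], (64, 64))
    if cnn is None:  # placeholder: plug in any hand/no-hand classifier here
        return 0.5
    return float(cnn.predict(crop[None].astype(np.float32) / 255.0)[0, 0])
```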
Analysis of the hands in egocentric vision: A survey
Egocentric vision (a.k.a. first-person vision - FPV) applications have
thrived over the past few years, thanks to the availability of affordable
wearable cameras and large annotated datasets. The position of the wearable
camera (usually mounted on the head) allows recording exactly what the camera
wearers have in front of them, in particular hands and manipulated objects.
This intrinsic advantage enables the study of the hands from multiple
perspectives: localizing hands and their parts within the images; understanding
what actions and activities the hands are involved in; and developing
human-computer interfaces that rely on hand gestures. In this survey, we review
the literature that focuses on the hands using egocentric vision, categorizing
the existing approaches into: localization (where are the hands or parts of
them?); interpretation (what are the hands doing?); and application (e.g.,
systems that used egocentric hand cues for solving a specific problem).
Moreover, a list of the most prominent datasets with hand-based annotations is
provided.
An Effective and Efficient Method for Detecting Hands in Egocentric Videos for Rehabilitation Applications
Objective: Individuals with spinal cord injury (SCI) report upper limb
function as their top recovery priority. To accurately represent the true
impact of new interventions on patient function and independence, evaluation
should occur in a natural setting. Wearable cameras can be used to monitor hand
function at home, using computer vision to automatically analyze the resulting
videos (egocentric video). A key step in this process, hand detection, is
difficult to do robustly and reliably, hindering deployment of a complete
monitoring system in the home and community. We propose an accurate and
efficient hand detection method that uses a simple combination of existing
detection and tracking algorithms. Methods: Detection, tracking, and
combination methods were evaluated on a new hand detection dataset, consisting
of 167,622 frames of egocentric videos collected on 17 individuals with SCI
performing activities of daily living in a home simulation laboratory. Results:
The F1-scores for the best detector and tracker alone (SSD and Median Flow)
were 0.90 ± 0.07 and 0.42 ± 0.18, respectively. The best combination
method, in which a detector was used to initialize and reset a tracker,
resulted in an F1-score of 0.87 ± 0.07 while being two times faster than the
fastest detector alone. Conclusion: The combination of the fastest detector and
best tracker improved the accuracy over online trackers while improving the
speed of detectors. Significance: The method proposed here, in combination with
wearable cameras, will help clinicians directly measure hand function in a
patient's daily life at home, enabling independence after SCI.
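A rough sketch of the detector-plus-tracker pattern described above: a detector initializes a Median Flow tracker and is re-run whenever tracking fails or a refresh interval elapses. It assumes opencv-contrib-python (for cv2.legacy.TrackerMedianFlow_create); the detect callable and the refresh interval are stand-in assumptions for the paper's SSD detector and reset policy.

```python
# Sketch: run a cheap tracker between detections, re-detect on failure or refresh.
import cv2

def track_hand(video_path, detect, refresh_every=30):
    """Yield (frame_index, box) where box is (x, y, w, h) or None."""
    cap = cv2.VideoCapture(video_path)
    tracker, frame_idx = None, 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        box = None
        if tracker is not None and frame_idx % refresh_every != 0:
            ok_trk, box = tracker.update(frame)      # cheap: track between detections
            box = tuple(map(int, box)) if ok_trk else None
        if box is None:                              # expensive: detect and re-initialize
            box = detect(frame)                      # assumed to return (x, y, w, h) or None
            if box is not None:
                tracker = cv2.legacy.TrackerMedianFlow_create()
                tracker.init(frame, box)
        yield frame_idx, box
        frame_idx += 1
    cap.release()
```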
Recognizing Hand Use and Hand Role at Home After Stroke from Egocentric Video
Introduction: Hand function is a central determinant of independence after
stroke. Measuring hand use in the home environment is necessary to evaluate the
impact of new interventions, and calls for novel wearable technologies.
Egocentric video can capture hand-object interactions in context, as well as
show how more-affected hands are used during bilateral tasks (for stabilization
or manipulation). Automated methods are required to extract this information.
Objective: To use artificial intelligence-based computer vision to classify
hand use and hand role from egocentric videos recorded at home after stroke.
Methods: Twenty-one stroke survivors participated in the study. A random forest
classifier, a SlowFast neural network, and the Hand Object Detector neural
network were applied to identify hand use and hand role at home.
Leave-One-Subject-Out-Cross-Validation (LOSOCV) was used to evaluate the
performance of the three models. Differences between the models were
calculated based on the Matthews correlation coefficient (MCC). Results: For
hand use detection, the Hand Object Detector had significantly higher
performance than the other models. The macro average MCCs using this model in
the LOSOCV were 0.50 ± 0.23 for the more-affected hands and 0.58 ± 0.18 for
the less-affected hands. Hand role classification had macro average MCCs in the
LOSOCV that were close to zero for all models. Conclusion: Using egocentric
video to capture the hand use of stroke survivors at home is feasible. Pose
estimation to track finger movements may be beneficial for classifying hand
roles in the future.
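A brief sketch of the leave-one-subject-out evaluation scored with the Matthews correlation coefficient, using scikit-learn. The random-forest baseline and the placeholders X (features), y (labels), and subjects (subject IDs) are illustrative; feature extraction from the egocentric video is assumed to happen upstream.

```python
# Sketch: leave-one-subject-out cross-validation scored with MCC.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import matthews_corrcoef
from sklearn.model_selection import LeaveOneGroupOut

def losocv_mcc(X, y, subjects):
    """Return one MCC per held-out subject for a random-forest baseline."""
    scores = []
    for train_idx, test_idx in LeaveOneGroupOut().split(X, y, groups=subjects):
        clf = RandomForestClassifier(n_estimators=200, random_state=0)
        clf.fit(X[train_idx], y[train_idx])
        scores.append(matthews_corrcoef(y[test_idx], clf.predict(X[test_idx])))
    return np.array(scores)  # report mean ± SD across subjects
```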
Hierarchical Compliance Control of a Soft Ankle Rehabilitation Robot Actuated by Pneumatic Muscles
Traditional compliance control of a rehabilitation robot is implemented in task space using impedance or admittance control algorithms. Soft robots actuated by pneumatic muscle actuators (PMAs) are becoming prominent for patients because they allow compliance to be adjusted at each active link, a capability that has not yet been reported in the literature. This paper proposes a new compliance control method for a soft ankle rehabilitation robot driven by four PMAs configured in parallel to enable three-degree-of-freedom movement of the ankle joint. A new hierarchical compliance control structure is designed, comprising a low-level compliance adjustment controller in joint space and a high-level admittance controller in task space. An adaptive compliance control paradigm is further developed that takes into account the patient's active contribution and movement ability over a preceding period of time, so that robot assistance is provided only when it is genuinely required. Experiments on healthy and impaired human subjects were conducted to verify the adaptive hierarchical compliance control scheme. The results show that the robot's hierarchical compliance can be adjusted online according to the participant's assessment: the robot reduces its assistance output when participants contribute more, and vice versa, thus providing a potentially feasible solution for patient-in-the-loop cooperative training.
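As an illustration of the high-level layer only, here is a minimal discrete-time sketch of a task-space admittance law of the form M x_dd + B x_d + K (x - x_ref) = F_ext, integrated with explicit Euler. The gains, time step, and single-axis simplification are illustrative assumptions, not the paper's controller.

```python
# Sketch: single-axis discrete-time admittance update.
class Admittance:
    def __init__(self, M=1.0, B=8.0, K=50.0, dt=0.002):
        self.M, self.B, self.K, self.dt = M, B, K, dt
        self.x = 0.0   # compliant task-space position command
        self.xd = 0.0  # its velocity

    def step(self, f_ext, x_ref=0.0):
        """Update the compliant reference given the measured interaction force."""
        xdd = (f_ext - self.B * self.xd - self.K * (self.x - x_ref)) / self.M
        self.xd += xdd * self.dt
        self.x += self.xd * self.dt
        return self.x  # passed on to the low-level (joint-space) compliance layer
```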
Hand contour detection in wearable camera video using an adaptive histogram region of interest
BACKGROUND: Monitoring hand function at home is needed to better evaluate the effectiveness of rehabilitation interventions. Our objective is to develop wearable computer vision systems for hand function monitoring. The specific aim of this study is to develop an algorithm that can identify hand contours in video from a wearable camera that records the user's point of view, without the need for markers. METHODS: The two-step image processing approach for each frame consists of: (1) Detecting a hand in the image, and choosing one seed point that lies within the hand. This step is based on a priori models of skin colour. (2) Identifying the contour of the region containing the seed point. This is accomplished by adaptively determining, for each frame, the region within a colour histogram that corresponds to hand colours, and back-projecting the image using the reduced histogram. RESULTS: In four test videos relevant to activities of daily living, the hand detector classification accuracy was 88.3%. The contour detection results were compared to manually traced contours in 97 test frames, and the median F-score was 0.86. CONCLUSION: This algorithm will form the basis for a wearable computer-vision system that can monitor and log the interactions of the hand with its environment.
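A compact sketch of the second step (adaptive histogram back-projection around a skin-coloured seed point) using OpenCV. The window size, histogram bins, and threshold are assumptions for illustration, not the study's tuned values.

```python
# Sketch: hand contour via hue-saturation histogram back-projection around a seed.
import cv2
import numpy as np

def hand_contour(frame_bgr, seed, win=15, thresh=50):
    """Return the contour containing the seed point, or None."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    x, y = seed
    patch = hsv[max(y - win, 0):y + win, max(x - win, 0):x + win]
    hist = cv2.calcHist([patch], [0, 1], None, [30, 32], [0, 180, 0, 256])
    cv2.normalize(hist, hist, 0, 255, cv2.NORM_MINMAX)
    back = cv2.calcBackProject([hsv], [0, 1], hist, [0, 180, 0, 256], scale=1)
    _, mask = cv2.threshold(back, thresh, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    # Keep only the contour that actually contains the seed point.
    for c in contours:
        if cv2.pointPolygonTest(c, (float(x), float(y)), False) >= 0:
            return c
    return None
```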
Designing an Egocentric Video-Based Dashboard to Report Hand Performance Measures for Outpatient Rehabilitation of Cervical Spinal Cord Injury
Background: Functional use of the upper extremities (UEs) is a top recovery priority for individuals with cervical spinal cord injury (cSCI), but the inability to monitor recovery at home and limitations in hand function outcome measures impede optimal recovery. Objectives: We developed a framework using wearable cameras to monitor hand use at home and aimed to identify the best way to report this information to clinicians. Methods: A dashboard was iteratively developed with clinician (n = 7) input through focus groups and interviews, creating low-fidelity prototypes based on recurring feedback until no new information emerged. Affinity diagramming was used to identify themes and subthemes from the interview data. User stories were developed and mapped to specific features to create a high-fidelity prototype. Results: Useful elements identified for a dashboard reporting hand performance included summaries to interpret graphs, a breakdown of hand posture and activity to provide context, video snippets to qualitatively view hand use at home, patient notes to understand patient satisfaction or struggles, and time-series graphs of metrics to show trends over time. Conclusion: Involving end-users in the design process and breaking down user requirements into user stories helped identify the interface elements necessary for reporting hand performance metrics to clinicians. Clinicians recognized the dashboard's potential to monitor rehabilitation progress, provide feedback on hand use, and track progress over time. Concerns were raised about implementation in clinical practice; therefore, further inquiry is needed to determine the tool's feasibility and usefulness for individuals with UE impairments.
Quantitative assessment based on kinematic measures of functional impairments during upper extremity movements: a review
Quantitative measures of human movement quality are important for discriminating between healthy and pathological conditions and for expressing the outcomes and clinically important changes in subjects' functional state. However, the most frequently used instruments for upper extremity functional assessment are clinical scales, which, although previously standardized and validated, have a strong subjective component that depends on the observer scoring the test. They are also insufficient to assess the motor strategies used during movements, so they need to be combined with other, more objective measures. The objective of the present review is to provide an overview of the objective metrics found in the literature that aim to quantify upper extremity performance during functional tasks, regardless of the equipment or system used to register the kinematic data.
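As one concrete example of the kind of objective kinematic metric covered by such reviews, the sketch below computes a speed-based log dimensionless jerk, a widely used movement-smoothness measure. The formulation follows common usage in the smoothness literature and is an illustration, not a metric drawn specifically from this review; the input is assumed to be a sampled end-effector speed profile.

```python
# Sketch: log dimensionless jerk (LDLJ) from an end-effector speed profile.
import numpy as np

def log_dimensionless_jerk(speed, fs):
    """speed: 1-D array of movement speed (m/s); fs: sampling rate (Hz).
    Higher (less negative) values indicate smoother movement."""
    speed = np.asarray(speed, dtype=float)
    dt = 1.0 / fs
    duration = len(speed) * dt
    peak = speed.max()
    jerk = np.gradient(np.gradient(speed, dt), dt)  # second derivative of speed
    dimensionless_jerk = (duration ** 3 / peak ** 2) * np.sum(jerk ** 2) * dt
    return -np.log(dimensionless_jerk)
```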
Are You "Tilting at Windmills" or Undertaking a Valid Clinical Trial?
In this review, several aspects surrounding the choice of a therapeutic intervention and the conduct of clinical trials are discussed. Some of the background for why human studies have evolved to their current state is also included. Specifically, the following questions have been addressed: 1) What criteria should be used to determine whether a scientific discovery or invention is worthy of translation to human application? 2) What recent scientific advance warrants a deeper understanding of clinical trials by everyone? 3) What are the different types and phases of a clinical trial? 4) What characteristics of a human disorder should be noted, tracked, or stratified for a clinical trial, and what inclusion/exclusion criteria are important for enrolling appropriate trial subjects? 5) What are the different study designs that can be used in a clinical trial program? 6) What confounding factors can alter the accurate interpretation of clinical trial outcomes? 7) What are the success rates of clinical trials, and what can we learn from previous clinical trials? 8) What are the essential principles for the conduct of valid clinical trials?
A comparison of selective recording approaches in the peripheral nervous system using extraneural electrodes
- …
