
    The Evolution of First Person Vision Methods: A Survey

    The emergence of new wearable technologies such as action cameras and smart glasses has increased the interest of computer vision scientists in the first-person perspective. The field is now attracting attention and investment from companies aiming to develop commercial devices with First Person Vision recording capabilities. This interest is expected to drive an increasing demand for methods to process these videos, possibly in real time. Current approaches combine particular image features with quantitative methods to accomplish specific objectives such as object detection, activity recognition, and user-machine interaction. This paper summarizes the evolution of the state of the art in First Person Vision video analysis between 1997 and 2014, highlighting, among other things, the most commonly used features, methods, challenges, and opportunities within the field.
    Comment: First Person Vision, Egocentric Vision, Wearable Devices, Smart Glasses, Computer Vision, Video Analytics, Human-machine Interaction

    Human-activity-centered measurement system: challenges from laboratory to the real environment in assistive gait wearable robotics

    Assistive gait wearable robots (AGWR) have shown great advances as intelligent devices that assist humans in their activities of daily living (ADLs). Rapid technological advancement in sensing, actuators, materials, and computational intelligence has sped up this development towards more practical and smart AGWR. However, most assistive gait wearable robots are still confined to control and assessment indoors, within laboratory environments, limiting their potential to provide the real assistance and rehabilitation that humans require in real environments. Gait assessment parameters play an important role not only in evaluating patient progress and assistive-device performance but also in controlling smart, self-adaptable AGWR in real time. Self-adaptable wearable robots must interactively conform to changing environments and to different users to provide optimal functionality and comfort. This paper discusses the performance parameters, such as comfort, safety, adaptability, and energy consumption, required for the development of an intelligent AGWR for outdoor environments. The challenges of measuring these parameters with current systems for data collection and analysis, using vision capture and wearable sensors, are presented and discussed.

    Smart Computing and Sensing Technologies for Animal Welfare: A Systematic Review

    Animals play a profoundly important and intricate role in our lives today. Dogs have been human companions for thousands of years, but they now work closely with us to assist the disabled, and in combat and search-and-rescue situations. Farm animals are a critical part of the global food supply chain, and there is increasing consumer interest in organically fed and humanely raised livestock and in how it impacts our health and environmental footprint. Wild animals are threatened with extinction by human-induced factors and by shrinking and compromised habitat. This review sets out to systematically survey the existing literature on smart computing and sensing technologies for domestic, farm, and wild animal welfare. We use the notion of "animal welfare" in broad terms, reviewing technologies for assessing whether animals are healthy, free of pain and suffering, and positively stimulated in their environment. The notion of "smart computing and sensing" is likewise used broadly, referring to computing and sensing systems that are not isolated but interconnected via communication networks, and capable of remote data collection, processing, exchange, and analysis. We review smart technologies for domestic animals, for indoor and outdoor animal farming, and for animals in the wild and in zoos. The findings of this review are expected to motivate future research and to contribute to data, information, and communication management, as well as policy, for animal welfare.

    Designing Auditory Feedback from Wearable Weightlifting Devices

    While wearable fitness devices have gained broad popularity, most focus on tracking general activity types rather than correcting exercise form, which is extremely important for weightlifters. We interviewed 7 frequent gym-goers about their opinions and expectations for feedback from wearable weightlifting devices. We describe the feedback they desire, and how their expectations and concerns could be balanced in future wearable fitness technologies.

    Can smartwatches replace smartphones for posture tracking?

    This paper introduces a smartwatch-based human posture tracking platform that identifies whether the wearer is sitting, standing, or lying down. The system is developed as a proof-of-concept study to investigate a smartwatch's suitability for future remote health-monitoring systems and applications. The work validates the smartwatch's ability to track user posture accurately in a laboratory setting while reducing the sampling rate to potentially improve battery life, a first step in verifying that such a system would work in future clinical settings. The algorithm classifies the transitions between the three posture states of sitting, standing, and lying down by identifying these transition movements, as well as other movements that might be mistaken for them. The system was trained and developed on a Samsung Galaxy Gear smartwatch, and the algorithm was validated through leave-one-subject-out cross-validation over 20 subjects. The system identifies the relevant transitions at only 10 Hz with an F-score of 0.930, indicating its ability to effectively replace smartphones, if needed.
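    The evaluation pipeline described above — classifying windows of low-rate accelerometer data as posture transitions and scoring the detector with leave-one-subject-out cross-validation — can be sketched as follows. This is a minimal illustration, not the paper's actual algorithm: the window features, the variance threshold, and the data layout are all assumptions made for the example.

    ```python
    # Hypothetical sketch of posture-transition detection from 10 Hz
    # wrist-accelerometer windows, scored with leave-one-subject-out
    # (LOSO) evaluation. Features and thresholds are illustrative only.
    from statistics import mean, variance

    def extract_features(window):
        """Mean and variance of acceleration magnitude over a window."""
        return (mean(window), variance(window))

    def is_transition(features, var_threshold=0.5):
        """Flag a window as a posture transition when its movement
        energy (variance) exceeds a threshold; static postures such
        as sitting or lying produce low-variance windows."""
        _, var = features
        return var > var_threshold

    def f_score(tp, fp, fn):
        """F1 score from true-positive, false-positive, false-negative counts."""
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        if precision + recall == 0:
            return 0.0
        return 2 * precision * recall / (precision + recall)

    def loso_evaluate(subjects):
        """Score the detector on each held-out subject in turn and
        average the per-subject F-scores. `subjects` maps a subject id
        to a list of (window, label) pairs, label True for a transition.
        A fixed-threshold detector needs no training, so the held-out
        split here only partitions the scoring."""
        scores = []
        for _subject, data in subjects.items():
            tp = fp = fn = 0
            for window, label in data:
                pred = is_transition(extract_features(window))
                if pred and label:
                    tp += 1
                elif pred and not label:
                    fp += 1
                elif label:
                    fn += 1
            scores.append(f_score(tp, fp, fn))
        return sum(scores) / len(scores)
    ```

    In a learned variant, the LOSO loop would retrain the classifier on the remaining subjects before scoring each held-out one, which is what guards against per-subject overfitting in the reported F-score.
    
    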

    dWatch: a Personal Wrist Watch for Smart Environments

    Intelligent environments, such as smart homes or domotic systems, have the potential to support people in many of their ordinary activities by allowing complex control strategies for managing the various capabilities of a house or building: lights, doors, temperature, power and energy, music, etc. Such environments typically provide these control strategies by means of computers, touch-screen panels, mobile phones, tablets, or in-house displays. An unobtrusive, wearable device, like a bracelet or wrist watch, that lets users perform various operations in their homes and receive notifications from the environment could strengthen the interaction with such systems, particularly for people not accustomed to computer systems (e.g., the elderly) or in contexts where they are not in front of a screen. Moreover, such wearable devices reduce the technological gap introduced in the environment by home automation systems, permitting a higher level of acceptance in daily activities and improving the interaction between the environment and its inhabitants. In this paper, we introduce the dWatch, an off-the-shelf personal wearable notification and control device, integrated into an intelligent platform for domotic systems, designed to optimize the way people use the environment, and built as a wrist watch so that it is easily accessible, worn on a regular basis, and unobtrusive.

    360 Quantified Self

    Wearable devices with a wide range of sensors have contributed to the rise of the Quantified Self movement, in which individuals log everything from the number of steps they have taken, to their heart rate, to their sleeping patterns. Sensors do not, however, typically capture the social and ambient environment of their users, such as general lifestyle attributes or information about their social network. This means that the users themselves, and the medical practitioners privy to the wearable sensor data, have only a narrow view of the individual, limited mainly to certain aspects of their physical condition. In this paper we describe a number of use cases for how social media can complement check-up data and sensor data to give a more holistic view of an individual's health, a perspective we call the 360 Quantified Self. Health-related information can be obtained from sources as diverse as food photo sharing, location check-ins, or profile pictures. Additionally, information from a person's ego network can shed light on the social dimension of wellbeing, which is widely acknowledged to be of utmost importance even though it is currently rarely used for medical diagnosis. We articulate a long-term vision describing the desirable technical advances and variety of data needed to achieve an integrated system encompassing Electronic Health Records (EHR), data from wearable devices, and information derived from social media data.
    Comment: QCRI Technical Report