
    Fall prevention intervention technologies: A conceptual framework and survey of the state of the art

    In recent years, an ever-increasing range of technology-based applications has been developed with the goal of assisting in the delivery of more effective and efficient fall prevention interventions. Whilst a number of studies have surveyed technologies for particular sub-domains of fall prevention, no existing research surveys the full spectrum of fall prevention interventions and characterises the range of technologies that have augmented this landscape. This study presents a conceptual framework and survey of the state of the art of technology-based fall prevention systems, derived from a systematic template analysis of studies presented in contemporary research literature. The framework proposes four broad categories of fall prevention intervention system: pre-fall prevention, post-fall prevention, fall injury prevention, and cross-fall prevention. Other categories include application type, technology deployment platform, information sources, deployment environment, user interface type, and collaborative function. After presenting the conceptual framework, a detailed survey of the state of the art is presented as a function of the proposed framework. A number of research challenges emerge from this survey, including the need for: new systems that focus on overcoming extrinsic falls risk factors; systems that support the environmental risk assessment process; and systems that enable patients and practitioners to develop more collaborative relationships and engage in shared decision making during falls risk assessment and prevention activities. Recommendations and future research directions are proposed to address each of these challenges.
    Funding: The Royal Society, grant ref. RG13082.

    Integrated Smart Glove for Hand Motion Monitoring


    Wearable Wireless Devices

    No abstract available

    MOCA: A Low-Power, Low-Cost Motion Capture System Based on Integrated Accelerometers

    Human-computer interaction (HCI) and virtual reality applications pose the challenge of enabling real-time interfaces for natural interaction. Gesture recognition based on body-mounted accelerometers has been proposed as a viable solution for translating patterns of movement into user commands, substituting point-and-click methods or other cumbersome input devices. On the other hand, cost and power constraints make implementing a natural and efficient interface suitable for consumer applications a critical task. Even though several gesture recognition solutions exist, their use in the HCI context has been poorly characterized. For this reason, in this paper we consider a low-cost, low-power wearable motion tracking system based on integrated accelerometers, called motion capture with accelerometers (MOCA), which we evaluated for navigation in virtual spaces. Recognition is based on a geometric algorithm that enables efficient and robust detection of rotational movements. Our objective is to demonstrate that such a low-cost, low-power implementation is suitable for HCI applications. To this purpose, we characterized the system from both quantitative and qualitative points of view. First, we performed static and dynamic assessment of movement recognition accuracy. Second, we evaluated the effectiveness of the user experience using a 3D game application as a test bed.
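
    The abstract describes a geometric algorithm for detecting rotational movements from body-mounted accelerometers but does not give its details. The following sketch is only an illustration of the general idea, not the MOCA algorithm itself: it assumes a single 3-axis accelerometer whose reading is dominated by gravity, estimates roll and pitch tilt angles, and maps a tilt beyond a threshold to a navigation command. The axis conventions, threshold and command names are assumptions.

        import math

        # Illustrative only: map quasi-static accelerometer tilt to navigation
        # commands. Not the published MOCA algorithm; conventions are assumed.
        TILT_THRESHOLD_DEG = 30.0  # assumed gesture threshold

        def tilt_angles(ax, ay, az):
            """Return (roll, pitch) in degrees from a gravity-dominated reading in g."""
            roll = math.degrees(math.atan2(ay, az))
            pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
            return roll, pitch

        def classify(ax, ay, az):
            roll, pitch = tilt_angles(ax, ay, az)
            if pitch > TILT_THRESHOLD_DEG:
                return "rotate_forward"
            if pitch < -TILT_THRESHOLD_DEG:
                return "rotate_backward"
            if roll > TILT_THRESHOLD_DEG:
                return "rotate_right"
            if roll < -TILT_THRESHOLD_DEG:
                return "rotate_left"
            return "neutral"

        # Example: the sensor tilted forward, so the x axis picks up part of gravity.
        print(classify(-0.6, 0.0, 0.8))  # -> rotate_forward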

    Novel smart glove technology as a biomechanical monitoring tool

    Developments in Virtual Reality (VR) technology and its overall market have been occurring since the 1960s, when Ivan Sutherland created the world’s first tracked head-mounted display (HMD), a goggle-type head gear. In today’s society, consumers expect a more immersive experience and associated tools to bridge the cyber-physical divide. This paper presents the development of a next-generation smart glove microsystem to facilitate human-computer interaction through the integration of sensors, processors and wireless technology. The objective of the glove is to measure the range of hand joint movements in real time, empirically and quantitatively. This includes accurate measurement, in degrees, of flexion, extension, adduction and abduction of the metacarpophalangeal (MCP), proximal interphalangeal (PIP) and distal interphalangeal (DIP) joints of the fingers and thumb, together with thumb-index web space movement. This system enables full real-time monitoring of complex hand movements. Commercially available gloves are not fitted with sufficient sensors for full data capture, and require calibration for each glove wearer. Unlike these current state-of-the-art data gloves, the UU / Tyndall Inertial Measurement Unit (IMU) glove uses a combination of a novel stretchable substrate material and 9 degree-of-freedom (DOF) inertial sensors, in conjunction with complex data analytics, to detect joint movement. Our novel IMU data glove requires minimal calibration and is therefore particularly suited to multiple application domains such as human-computer interfacing, virtual reality and the healthcare environment.
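
    The abstract states that the glove combines 9-DOF inertial sensors with data analytics to detect joint movement, but the analytics themselves are not described. A minimal sketch of one plausible step, under the assumption that each phalanx-mounted IMU reports an orientation quaternion in a common reference frame, is to take the relative rotation between adjacent segments as the joint angle. The quaternion convention and example values below are assumptions for illustration only.

        import math

        # Illustrative only: joint angle as the relative rotation between two
        # segment orientations given as (w, x, y, z) quaternions.
        def q_conj(q):
            w, x, y, z = q
            return (w, -x, -y, -z)

        def q_mul(a, b):
            aw, ax, ay, az = a
            bw, bx, by, bz = b
            return (aw*bw - ax*bx - ay*by - az*bz,
                    aw*bx + ax*bw + ay*bz - az*by,
                    aw*by - ax*bz + ay*bw + az*bx,
                    aw*bz + ax*by - ay*bx + az*bw)

        def joint_angle_deg(q_proximal, q_distal):
            rel = q_mul(q_conj(q_proximal), q_distal)
            w = max(-1.0, min(1.0, abs(rel[0])))
            return math.degrees(2.0 * math.acos(w))

        # Example: distal segment rotated 45 degrees about x relative to the proximal one.
        q_prox = (1.0, 0.0, 0.0, 0.0)
        q_dist = (math.cos(math.radians(22.5)), math.sin(math.radians(22.5)), 0.0, 0.0)
        print(round(joint_angle_deg(q_prox, q_dist)))  # -> 45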

    A study on virtual reality and developing the experience in a gaming simulation

    A thesis submitted to the University of Bedfordshire in partial fulfilment of the requirements for the degree of Master by Research.
    Virtual Reality (VR) is an experience in which a person is given the freedom to view and move within a virtual world [1]. The experience is not constrained to a limited set of controls; instead, it is triggered interactively by the user’s physical movement [1] [2], so the user feels as if they are seeing the real world. In addition, 3D technologies allow the viewer to experience the volume of an object and its perspective in the virtual world [1]; the human brain generates depth perception when each eye receives an image from its own point of view. Using the university’s facilities, some of the core parts of the research were accomplished, such as designing the VR motion controller and VR HMD (head-mounted display) using an open-source microcontroller. The VR HMD together with the VR controller gives an immersive feel and a complete VR system [2]. The motive was to demonstrate a working model that creates a VR experience on a mobile platform. In particular, the VR system uses a micro-electro-mechanical system (MEMS) to track motion without a tracking camera. The VR experience has also been developed in a gaming simulation. To produce this, Maya, Unity, Motion Analysis System, MotionBuilder, Arduino and programming were used. The lessons and code taken or adapted from [33] [44] [25] and [45] were studied and implemented.
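
    The thesis abstract mentions an Arduino-class microcontroller and MEMS sensors for camera-free motion tracking, but not how the tracking data reaches the application. The sketch below is a hypothetical illustration of one common arrangement, in which the firmware streams comma-separated yaw, pitch and roll values over a serial port; the port name, baud rate and line format are all assumptions rather than details from the thesis.

        import serial  # pip install pyserial

        # Hypothetical host-side reader for a microcontroller head tracker that
        # prints "yaw,pitch,roll" lines in degrees. Port and format are assumed.
        PORT = "/dev/ttyUSB0"
        BAUD = 115200

        def read_orientation(conn):
            line = conn.readline().decode("ascii", errors="ignore").strip()
            try:
                yaw, pitch, roll = (float(v) for v in line.split(","))
                return yaw, pitch, roll
            except ValueError:
                return None  # malformed or partial line

        if __name__ == "__main__":
            with serial.Serial(PORT, BAUD, timeout=1.0) as conn:
                while True:
                    pose = read_orientation(conn)
                    if pose is not None:
                        print("yaw=%.1f pitch=%.1f roll=%.1f" % pose)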

    PhysioVR: a novel mobile virtual reality framework for physiological computing

    Virtual Reality (VR) is morphing into a ubiquitous technology by leveraging smartphones and screenless cases to provide highly immersive experiences at a low price point. The result of this shift in paradigm is now known as mobile VR (mVR). Although mVR offers numerous advantages over conventional immersive VR methods, one of its biggest limitations relates to the interaction pathways available for mVR experiences. Using physiological computing principles, we created the PhysioVR framework, an open-source software tool developed to facilitate the integration of physiological signals measured through wearable devices into mVR applications. PhysioVR includes heart rate (HR) signals from Android wearables, electroencephalography (EEG) signals from a low-cost brain-computer interface, and electromyography (EMG) signals from a wireless armband. The physiological sensors are connected to a smartphone via Bluetooth, and PhysioVR facilitates streaming of the data using the UDP communication protocol, thus allowing multicast transmission to a third-party application such as the Unity3D game engine. Furthermore, the framework provides bidirectional communication with the VR content, allowing external event triggering through real-time control, as well as data recording options. We developed a demo game project called EmoCat Rescue, which encourages players to modulate their HR levels in order to successfully complete the in-game mission. EmoCat Rescue is included in the PhysioVR project, which can be freely downloaded. This framework simplifies the acquisition, streaming and recording of multiple physiological signals and parameters from wearable consumer devices, providing a single and efficient interface to create novel physiologically-responsive mVR applications.
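
    The abstract states that PhysioVR streams sensor data from the smartphone over UDP so that a third-party application such as Unity3D can consume it, but it does not specify the message format. The sketch below illustrates that kind of UDP hand-off using a hypothetical "SIGNAL:value" text payload and an assumed port number; it is not PhysioVR's actual schema.

        import socket

        PORT = 5005  # assumed port; PhysioVR's real port and schema are not given here

        def send_sample(signal, value, host="127.0.0.1"):
            # e.g. send_sample("HR", 72.0) from the wearable-facing process
            msg = ("%s:%s" % (signal, value)).encode("utf-8")
            with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
                s.sendto(msg, (host, PORT))

        def receive_samples():
            # run in the consumer (e.g. game-side) process
            with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
                s.bind(("0.0.0.0", PORT))
                while True:
                    data, _addr = s.recvfrom(1024)
                    signal, value = data.decode("utf-8").split(":", 1)
                    print(signal, "=", float(value))  # e.g. HR = 72.0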