
    From ‘hands up’ to ‘hands on’: harnessing the kinaesthetic potential of educational gaming

    Traditional approaches to distance learning and the student learning journey have focused on closing the gap between the experience of off-campus students and their on-campus peers. While many initiatives have sought to embed a sense of community, create virtual learning environments and even build collaborative spaces for team-based assessment and presentations, they are limited by technological innovation in terms of the types of learning styles they support and develop. Mainstream gaming development – such as the Xbox Kinect and Nintendo Wii – has a strong element of kinaesthetic learning, from early attempts to simulate impact, recoil, velocity and other environmental factors to more sophisticated movement-based games which create a sense of almost total immersion and allow untethered (in a technical sense) interaction with the games’ objects, characters and other players. Likewise, gamification of learning has become a critical focus for learner engagement and commercialisation, especially through products such as the Wii Fit. As this technology matures, there are strong opportunities for universities to use gaming consoles to embed kinaesthetic learning in the student experience – a learning style which has been largely neglected in the distance education sector. This paper explores the potential impact of these technologies and broadly imagines the possibilities for future innovation in higher education.

    Effects of sensory cueing in virtual motor rehabilitation. A review.

    Objectives: To critically identify studies that evaluate the effects of cueing in virtual motor rehabilitation in patients with different neurological disorders, and to make recommendations for future studies. Methods: MEDLINE®, IEEE Xplore, ScienceDirect, the Cochrane Library and Web of Science were searched up to February 2015. We included studies that investigated the effects of cueing in virtual motor rehabilitation on motor performance, in interventions for the upper or lower extremities using auditory, visual or tactile cues in non-immersive, semi-immersive or fully immersive virtual environments. These studies compared virtual cueing with an alternative intervention or no intervention. Results: Ten studies with a total of 153 patients were included in the review. All of them address the impact of cueing in virtual motor rehabilitation, regardless of the pathological condition. After selecting the articles, the following variables were extracted: year of publication, sample size, study design, type of cueing, intervention procedures, outcome measures, and main findings. Outcomes were evaluated at baseline and at the end of treatment in most of the studies. All studies except one showed improvements in some or all outcomes after the intervention or, in some cases, in favor of the virtual rehabilitation group compared to the control group. Conclusions: Virtual cueing seems to be a promising approach to improving motor learning, providing a channel for non-pharmacological therapeutic intervention in different neurological disorders. However, further studies using larger and more homogeneous groups of patients are required to confirm these findings.

    Natural Walking in Virtual Reality: A Review


    A novel approach to user controlled ambulation of lower extremity exoskeletons using admittance control paradigm

    Robotic lower extremity exoskeletons address the ambulatory problems confronting individuals with paraplegia. Paraplegia due to spinal cord injury (SCI) can cause motor deficits in the lower extremities, leading to an inability to walk. Though wheelchairs provide mobility, they do not support all the activities of everyday living for individuals with paraplegia. Current research addresses ambulation through the use of wearable exoskeletons that are pre-programmed. There are currently four exoskeletons on the U.S. market: Ekso, ReWalk, REX and Indego. All of the currently available exoskeletons have 2 active degrees of freedom (DOF), except for REX, which has 5 active DOF. All of them have pre-programmed gait, giving the user the ability to initiate a gait but not to control stride amplitude (height), stride frequency or stride length, restricting users’ ability to navigate the different surfaces and obstacles commonly encountered in the community. Most current exoskeletons also lack motors for abduction or adduction, which would allow movement in the coronal plane, further restricting effective use. This work seeks to overcome these limitations of pre-programmed exoskeletons with an intuitive, real-time, user-controlled mechanism that employs admittance control, using hand trajectory as a surrogate for foot trajectory. In a preliminary study, subjects controlled the trajectory of the foot in a virtual environment using the contralateral hand; the study showed that the hands can produce trajectories similar to human foot trajectories when provided with haptic and visual feedback. A 10 DOF, half-scale biped robot was built to test the control paradigm. The robot has 5 DOF on each leg: 2 DOF at the hip for flexion/extension and abduction/adduction, 1 DOF at the knee for flexion, and 2 DOF at the ankle for flexion/extension and inversion/eversion. The control mechanism translates the trajectory of each hand into the trajectory of the ipsilateral foot in real time, giving the user the ability to control each leg in both the sagittal and coronal planes under the admittance control paradigm. The efficiency of the control mechanism was evaluated in a study in which healthy subjects controlled the robot on a treadmill. A trekking pole was attached to each foot of the biped; subjects controlled the trajectory of each foot by applying small forces, through a force sensor, to the trekking pole in the direction of the required movement. The algorithm converted the forces to the Cartesian position of the foot in real time using admittance control, and the Cartesian position was converted to hip and knee joint angles using inverse kinematics. The kinematics, synchrony and smoothness of the trajectories produced by the biped robot were evaluated at different speeds, with and without obstacles, and compared with typical walking by human subjects on the treadmill. Further, the cognitive load required to control the biped on the treadmill was evaluated, and the effect of speed and obstacles, together with cognitive load, on kinematics, synchrony and smoothness was analyzed.
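    The pipeline described in this abstract (hand forces mapped through admittance dynamics to a foot position, then through inverse kinematics to hip and knee angles) can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the virtual mass and damping values, link lengths, and function names are assumptions chosen for the example.

    ```python
    import math

    def admittance_step(f, v, m=2.0, b=8.0, dt=0.01):
        """One admittance-control step: map an applied force (N) to motion
        of a virtual mass m (kg) with damping b (N·s/m) over timestep dt.
        Returns the new velocity and the resulting displacement."""
        a = (f - b * v) / m          # virtual dynamics: f = m·a + b·v
        v_new = v + a * dt           # integrate acceleration -> velocity
        return v_new, v_new * dt     # velocity and displacement this step

    def leg_ik(x, y, l1=0.4, l2=0.4):
        """Planar 2-link inverse kinematics for a leg in the sagittal plane:
        hip at the origin, foot at (x, y), thigh/shank lengths l1, l2 (m).
        Returns (hip, knee) flexion angles in radians."""
        d2 = x * x + y * y
        c = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
        c = max(-1.0, min(1.0, c))   # clamp against numerical error
        knee = math.acos(c)
        hip = math.atan2(y, x) - math.atan2(l2 * math.sin(knee),
                                            l1 + l2 * math.cos(knee))
        return hip, knee
    ```

    In a real controller, the displacement from each `admittance_step` would update the commanded foot position, and `leg_ik` would run once per control cycle to produce joint setpoints for the hip and knee motors.
    
    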

    Enabling audio-haptics

    This thesis deals with possible solutions to facilitate orientation, navigation and overview of non-visual interfaces and virtual environments with the help of sound in combination with force-feedback haptics.

    Designing for Ballet Classes: Identifying and Mitigating Communication Challenges Between Dancers and Teachers

    Dancer-teacher communication in a ballet class can be challenging: ballet is one of the most complex forms of movement, and learning happens through multi-faceted interactions with studio tools (mirror, barre, and floor) and the teacher. We conducted an interview-based qualitative study with seven ballet teachers and six dancers, followed by an open-coded analysis, to explore the communication challenges that arise while teaching and learning in the ballet studio. We identified key communication issues, including adapting to multi-level dancer expertise, transmitting and realigning development goals, providing personalized corrections and feedback, maintaining the state of flow, and communicating how to properly use tools in the environment. We discuss design implications for crafting technological interventions aimed at mitigating these communication challenges.

    Haptic induced motor learning and the extension of its benefits to stroke patients

    In this research, the Haptic Master robotic arm and virtual environments are used to induce motor learning in subjects with no known musculoskeletal or neurological disorders. The research found that both the perception and the performance of the subject improved through the haptic and visual feedback delivered by the Haptic Master. These benefits may be extended to enhance therapies for patients with loss of motor skills due to neurological disease or brain injury. Force and visual feedback were manipulated within virtual environment scenarios to facilitate learning. In one force-feedback condition, the subject was required to maneuver a sphere through a haptic maze or linear channel. In the second condition, the subject's movement was stopped when the sphere came into contact with the haptic walls; to resume movement, the force vector had to be redirected toward the optimal trajectory. To analyze the efficiency of the various scenarios, the area between the optimal and actual trajectories was used as a measure of learning. The results demonstrated that within more complex environments one type of force feedback was more successful in facilitating motor learning; in a simpler environment, two out of three subjects experienced a higher degree of motor learning with the same type of force feedback. Learning was not enhanced by the presence of visual feedback. Also, in nearly all studied cases, the primary limitation to learning was shoulder and attention fatigue brought on by the experimentation.
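    The learning measure described in this abstract, the area between the optimal and actual trajectories, can be computed for sampled paths by trapezoidal integration of the absolute deviation. A minimal sketch, assuming uniformly sampled one-dimensional trajectories and an illustrative function name:

    ```python
    def trajectory_error_area(optimal, actual, dt=0.01):
        """Area between an optimal and an actual trajectory, both sampled
        at interval dt, via trapezoidal integration of |actual - optimal|.
        A smaller area indicates closer tracking of the optimal path."""
        devs = [abs(a - o) for a, o in zip(actual, optimal)]
        return sum((devs[i] + devs[i + 1]) * dt / 2
                   for i in range(len(devs) - 1))
    ```

    Comparing this area across sessions gives a simple scalar trend: a decreasing area over repeated trials would indicate motor learning under this measure.
    
    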

    Interactive form creation: exploring the creation and manipulation of free form through the use of interactive multiple input interface

    Most current CAD systems support only the two most common input devices, a mouse and a keyboard, which limits the degree of interaction a user can have with the system. However, it is not uncommon for users to work together on the same computer during a collaborative task. Besides this, people tend to use both hands to manipulate 3D objects: one hand orients the object while the other performs some operation on it. The same approach could be applied to computer modelling in the conceptual phase of the design process: a designer can rotate and position an object with one hand, and manipulate the shape (deform it) with the other. Accordingly, the 3D object can be easily and intuitively changed through interactive manipulation with both hands. The research investigates the manipulation and creation of free-form geometries through interactive interfaces with multiple input devices. First, the creation of the 3D model is discussed and several different types of models are illustrated. Furthermore, different tools that allow the user to control the 3D model interactively are presented. Three experiments were conducted using different interactive interfaces; two bi-manual techniques were compared with the conventional one-handed approach. Finally, it is demonstrated that the use of new and multiple input devices can offer many opportunities for form creation. The problem is that few, if any, systems make it easy for the user or the programmer to use new input devices.

    From presence to consciousness through virtual reality

    Immersive virtual environments can break the deep, everyday connection between where our senses tell us we are and where we are actually located and whom we are with. The concept of 'presence' refers to the phenomenon of behaving and feeling as if we are in the virtual world created by computer displays. In this article, we argue that presence is worthy of study by neuroscientists, and that it might aid the study of perception and consciousness.