652 research outputs found

    How a Diverse Research Ecosystem Has Generated New Rehabilitation Technologies: Review of NIDILRR’s Rehabilitation Engineering Research Centers

    Over 50 million United States citizens (1 in 6 people in the US) have a developmental, acquired, or degenerative disability. The average US citizen can expect to live 20% of his or her life with a disability. Rehabilitation technologies play a major role in improving the quality of life for people with a disability, yet widespread and highly challenging needs remain. Within the US, a major effort aimed at the creation and evaluation of rehabilitation technology has been the Rehabilitation Engineering Research Centers (RERCs) sponsored by the National Institute on Disability, Independent Living, and Rehabilitation Research. As envisioned at their conception by a panel of the National Academy of Sciences in 1970, these centers were intended to take a “total approach to rehabilitation”, combining medicine, engineering, and related science to improve the quality of life of individuals with a disability. Here, we review the scope, achievements, and ongoing projects of an unbiased sample of 19 currently active or recently terminated RERCs. Specifically, for each center, we briefly explain the needs it targets, summarize key historical advances, identify emerging innovations, and consider future directions. Our assessment from this review is that the RERC program indeed involves a multidisciplinary approach, with 36 professional fields involved, although 70% of research and development staff are in engineering fields, 23% in clinical fields, and only 7% in basic science fields; significantly, 11% of the professional staff have a disability related to their research. We observe that the RERC program has substantially diversified the scope of its work since the 1970s, addressing more types of disabilities using more technologies and, in particular, now often focusing on information technologies. RERC work also now often views users as integrated into an interdependent society through technologies that people both with and without disabilities co-use (such as the internet, wireless communication, and architecture). In addition, RERC research has evolved to view users as able to improve outcomes through learning, exercise, and plasticity (rather than as static), processes that can be optimally timed. We provide examples of rehabilitation technology innovation produced by the RERCs that illustrate this increasingly diversifying scope and evolving perspective. We conclude by discussing growth opportunities and possible future directions of the RERC program.

    Validation of the GUESS-18: A Short Version of the Game User Experience Satisfaction Scale (GUESS)

    The Game User Experience Satisfaction Scale (GUESS) is a 55-item tool assessing nine constructs describing video game satisfaction. While the development of the GUESS followed best practices and resulted in a versatile, comprehensive tool for assessing video game user experience, responding to 55 items can be cumbersome in situations where repeated assessments are necessary. The aim of this research was to develop a shorter version of the scale for use in iterative game design, testing, and research. Two studies were conducted: the first to create a configural model of the GUESS, truncate it to an 18-item short scale, and establish an initial level of validity; the second, with a new sample, to demonstrate cross-sample validity of the 18-item GUESS. Results from a confirmatory factor analysis of the 18-item scale demonstrated excellent fit and construct validity relative to the original nine-construct instrument. Use of the GUESS-18 is encouraged as a brief, practical, yet comprehensive measure of video game satisfaction for practitioners and researchers.
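    For readers who want to see what this kind of check looks like in practice, the sketch below fits a nine-factor confirmatory factor analysis of an 18-item short form (two indicators per factor) in Python using the semopy package. The factor labels, item names, and input file are placeholders for illustration, not the actual GUESS-18 item assignments or data, and the printed statistics are generic fit indices rather than the values reported in the studies.

```python
import pandas as pd
import semopy

# Hypothetical nine-factor measurement model with two indicators per factor.
# Factor and item names are placeholders, not the real GUESS-18 structure.
MODEL_DESC = """
Usability =~ i1 + i2
Narratives =~ i3 + i4
PlayEngrossment =~ i5 + i6
Enjoyment =~ i7 + i8
CreativeFreedom =~ i9 + i10
AudioAesthetics =~ i11 + i12
PersonalGratification =~ i13 + i14
SocialConnectivity =~ i15 + i16
VisualAesthetics =~ i17 + i18
"""

# One row per respondent, one column per item (i1..i18); file name is illustrative.
data = pd.read_csv("guess18_responses.csv")

model = semopy.Model(MODEL_DESC)
model.fit(data)

# Print common fit indices (CFI, TLI, RMSEA, etc.) used to judge model fit.
print(semopy.calc_stats(model).T)
```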

    Almost Human: The Study of Physical Processes and the Performance of a Prosthetic Digital Spine

    Almost Human is an investigation of interdisciplinary performance through music that looks to the self to try to further understand subjective performance practices in expression, gesture and sonic output. This text presents experimental methods of examining and creating music through kinaesthetic and electronic-assisted means within instrumental, dance and interactive works. The extraction of affective, performative and sonic properties from these works aids in unlocking the relationship between the choreographic, physical and conceptual object. The first part of the text explores and illustrates multimodal approaches to analysing, capturing, measuring and archiving the moving musician and dancer in an assortment of performative settings. It focuses on a series of works for solo cello, as well as interdisciplinary pieces which position movement and embodied expressivity at the forefront of the discussion. The second part is dedicated to the aesthetic, conceptual and utilitarian content of a new interactive work for cellist/mover and a prosthetic digital spine. Here, relationships are combined to showcase the permeability of the body, as well as its expressive content. The conceptual object, The Spine, serves as a generator to help expand musical and artistic possibilities. Its inclusion in the work aids in refocusing my relationship to movement and sound for creation and performance; aesthetically, it also adds to the growing canon of experimental ventures in conceptualising expressivity. Beyond the text, the portfolio of Almost Human includes an auditory and visual chronicle of the process between 2012 and 2014, which is used to assist the reader in further understanding the performative practice and findings.

    Inferring Static Hand Poses from a Low-Cost Non-Intrusive sEMG Sensor

    Every year, a significant number of people lose a body part in an accident, through illness or in high-risk manual jobs. Several studies and research works have tried to reduce the constraints and risks in their lives through the use of technology. This work proposes a learning-based approach that performs gesture recognition using a surface electromyography-based device, the Myo Armband released by Thalmic Labs, a commercial device with eight non-intrusive, low-cost sensors. With 35 able-bodied subjects, and using the Myo Armband, which records data at about 200 Hz, we collected a dataset that includes six distinct hand gestures. We used a gated recurrent unit network to train a system that takes as input the raw signals extracted from the surface electromyography sensors. The proposed approach obtained 99.90% training accuracy and 99.75% validation accuracy. We also evaluated the proposed system on a test set (new subjects), obtaining an accuracy of 77.85%. In addition, we showed the test prediction results for each gesture separately and analyzed which gestures are difficult for the suggested network to distinguish accurately with the Myo Armband. Moreover, we studied for the first time the capability of gated recurrent unit networks in gesture recognition approaches. Finally, we integrated our method into a system that is able to classify live hand gestures. This work was supported by the Spanish Government TIN2016-76515R grant, supported with FEDER funds. It has also been funded by the University of Alicante project GRE16-19, by the Valencian Government project GV/2018/022, and by a Spanish grant for PhD studies ACIF/2017/243.
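    As an illustration of the kind of model described here, the following is a minimal PyTorch sketch of a gated recurrent unit classifier that maps raw 8-channel sEMG windows to logits over six gestures. The window length, hidden size, and other hyper-parameters are assumptions for demonstration, not the authors' exact architecture or training setup.

```python
import torch
import torch.nn as nn

class EMGGestureGRU(nn.Module):
    """Minimal GRU classifier over raw 8-channel sEMG windows (illustrative)."""

    def __init__(self, n_channels: int = 8, hidden_size: int = 64, n_gestures: int = 6):
        super().__init__()
        self.gru = nn.GRU(input_size=n_channels, hidden_size=hidden_size,
                          batch_first=True)
        self.classifier = nn.Linear(hidden_size, n_gestures)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time_steps, n_channels), e.g. a 1 s window sampled at 200 Hz
        _, h_n = self.gru(x)              # h_n: (1, batch, hidden_size)
        return self.classifier(h_n[-1])   # logits over the six gestures

# Usage example with random data standing in for recorded sEMG windows.
model = EMGGestureGRU()
windows = torch.randn(32, 200, 8)         # batch of 32 one-second windows
logits = model(windows)                   # shape (32, 6)
predicted_gesture = logits.argmax(dim=1)
```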

    The cyber-guitar system: a study in technologically enabled performance practice

    A thesis submitted to the Faculty of Humanities, University of the Witwatersrand, in fulfilment of the requirements for the degree of Doctor of Philosophy, March 2017. This thesis documents the development and realisation of an augmented instrument, expressed through the processes of artistic practice as research. The research project set out to extend my own creative practice on the guitar by technologically enabling and extending the instrument. This process was supported by a number of creative outcomes (performances, compositions and recordings), running parallel to the interrogation of theoretical areas emerging out of the research. In the introduction I present a timeline for the project and situate the work in the field of artistic practice as research, explaining the relationship between the traditional and creative practices. Following on from this, chapter one, Notation, Improvisation and the Cyber-Guitar System, discusses the impact of notation on my own education as a musician, unpacking how the nature of notation impacted on improvisation both historically and within my own creative work. Analysis of fields such as graphic notation led to the creation of the composition Hymnus Caesus Obcessiones, a central work in this research. In chapter two, Noise, Music and the Creative Boundary, I consider the boundary and relationship between noise and music, beginning with the futurist composer Luigi Russolo. The construction of the augmented instrument was informed by this boundary and aimed to bring the lens onto it in my own practice, recognising what I have termed the ephemeral noise boundary. I argue that the boundary line between them yields the most fertile place of sonic and technological engagement. Chapter three focuses on the instrumental development and a new understanding of organology. It locates an understanding of the position of the musical instrument historically with reference to the values emerging from the studies of notation and noise. It also considers the impacts of technology and gestural interfacing. Chapter four documents the physical process of designing and building the guitar. Included in the Appendix are three CDs and a live DVD of the various performances undertaken across the years of research.

    Haptic Media Scenes

    The aim of this thesis is to apply new media phenomenological and enactive embodied cognition approaches to explain the role of haptic sensitivity and communication in personal computer environments for productivity. Prior theory has given little attention to the role of the haptic senses in influencing cognitive processes, and does not frame the richness of haptic communication in interaction design, as haptic interactivity in HCI has historically tended to be designed and analyzed from a perspective on communication as transmissions, the sending and receiving of haptic signals. The haptic sense may mediate not only contact confirmation and affirmation but also rich semiotic and affective messages; yet there is a strong contrast between this inherent ability of haptic perception and current-day support for such haptic communication interfaces. I therefore ask: How do the haptic senses (touch and proprioception) impact our cognitive faculty when mediated through digital and sensor technologies? How may these insights be employed in interface design to facilitate rich haptic communication? To answer these questions, I use theoretical close readings that embrace two research fields, new media phenomenology and enactive embodied cognition. The theoretical discussion is supported by neuroscientific evidence and tested empirically through case studies centered on digital art. I use these insights to develop the concept of the haptic figura, an analytical tool to frame the communicative qualities of haptic media. The concept gauges rich machine-mediated haptic interactivity and communication in systems with a material solution supporting active haptic perception, and the mediation of semiotic and affective messages that are understood and felt. As such, the concept may function as a design tool for developers, but also for media critics evaluating haptic media. The tool is used to frame a discussion of the opportunities and shortcomings of haptic interfaces for productivity, differentiating between media systems for the hand and the full body. The significance of this investigation lies in demonstrating that haptic communication is an underutilized element in personal computer environments for productivity, and in providing an analytical framework for a more nuanced understanding of haptic communication as enabling the mediation of a range of semiotic and affective messages, beyond notification and confirmation interactivity.

    A musculoskeletal model of the human hand to improve human-device interaction

    Multi-touch tablets and smart phones are now widely used in both workplace and consumer settings. Interacting with these devices requires hand and arm movements that are potentially complex and poorly understood. Experimental studies have revealed differences in performance that could potentially be associated with injury risk. However, underlying causes for performance differences are often difficult to identify. For example, many patterns of muscle activity can potentially result in similar behavioral output. Muscle activity is one factor contributing to forces in tissues that could contribute to injury. However, experimental measurements of muscle activity and force in humans are extremely challenging. Models of the musculoskeletal system can be used to make specific estimates of neuromuscular coordination and musculoskeletal forces. However, existing models cannot easily be used to describe complex, multi-finger gestures such as those used for multi-touch human computer interaction (HCI) tasks. We therefore seek to develop a dynamic musculoskeletal simulation capable of estimating internal musculoskeletal loading during multi-touch tasks involving multiple digits of the hand, and to use the simulation to better understand complex multi-touch and gestural movements and potentially guide the design of technologies that reduce injury risk. To accomplish this, we focused on three specific tasks. First, we aimed to determine the optimal index finger muscle attachment points within the context of the established, validated OpenSim arm model, using measured moment arm data taken from the literature. Second, we aimed to derive moment arm values from experimentally measured muscle attachments and to use these values to determine muscle-tendon paths for both extrinsic and intrinsic muscles of the middle, ring and little fingers. Finally, we aimed to explore differences in hand muscle activation patterns during zooming and rotating tasks on a tablet computer in twelve subjects. Towards this end, our musculoskeletal hand model will help better address neuromuscular coordination, safe gesture performance and internal loading for multi-touch applications. Doctoral dissertation, Mechanical Engineering.
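    To make the moment-arm step concrete, the snippet below sketches the tendon-excursion method, in which a muscle's moment arm about a joint is estimated as the negative derivative of musculotendon length with respect to joint angle. The length-angle values are made-up illustrative numbers, not output from the OpenSim arm model or the attachment data used in the dissertation.

```python
import numpy as np

def moment_arm(musculotendon_length: np.ndarray, joint_angle_rad: np.ndarray) -> np.ndarray:
    """Estimate moment arm (m) via the tendon-excursion method: r = -dL/dtheta."""
    return -np.gradient(musculotendon_length, joint_angle_rad)

# Toy length-angle curve for a finger flexor over 0-90 degrees of MCP flexion;
# the coefficients are illustrative assumptions only.
theta = np.deg2rad(np.linspace(0.0, 90.0, 10))
l_mt = 0.25 - 0.011 * theta                    # musculotendon length in metres

print(moment_arm(l_mt, theta))                 # constant ~0.011 m (11 mm) moment arm
```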

    Immunitary Gaming: Mapping the First-Person Shooter

    Videogames have been theorised as an action-based medium. The original contribution to knowledge this thesis makes is to reconfigure this claim by considering popular multiplayer FPS games as reaction-based, particularly in terms of immune reactions. I take up Roberto Esposito’s claim that the individual in contemporary biopolitics is defined negatively against the other, controlled and ultimately negated via their reactions to power’s capacity to incessantly generate threats. By inciting insecurity and self-protective gestures, FPS games like Activision’s Call of Duty franchise and EA’s Battlefield series vividly dramatise Esposito’s thought, producing an immunitary gaming. Immunitary Gaming locates the FPS within key moments of change and evolution in Western image systems, including the emergence of linear perspective, cartography and the early years of the cinema. The FPS appropriates these image systems, but also alters their politics. Giorgio Agamben has argued that the apparatuses of late modernity no longer subjectify like their forebears, but desubjectify the individual, producing an impotent neoliberal body politic. I trace a similar development here. My work also seeks to capture the player’s movements via autoethnographic writing that communicates the viscerality and intensity of the experience. The FPS is framed as capable of giving insight into both the present and the future of our technological and political milieu and ‘sensorium,’ in Walter Benjamin’s terms. In its valorisation of the individual and production of insecurity to incite action, this project argues that the FPS is a symbolic form of immunitary neoliberal governmentality.

    Eyes-free tongue gesture and tongue joystick control of a five DOF upper-limb exoskeleton for severely disabled individuals

    Spinal cord injury can leave the affected individual severely disabled, with a low level of independence and quality of life. Assistive upper-limb exoskeletons are one of the solutions that can enable an individual with tetraplegia (paralysis of both arms and legs) to perform simple activities of daily living by mobilizing the arm. Providing an efficient user interface that offers full, continuous control of such a device, safely and intuitively, across multiple degrees of freedom (DOFs) still remains a challenge. In this study, a control interface for an assistive upper-limb exoskeleton with five DOFs based on an intraoral tongue-computer interface (ITCI) for individuals with tetraplegia was proposed. Furthermore, we evaluated eyes-free use of the ITCI for the first time and compared two tongue-operated control methods, one based on tongue gestures and the other based on dynamic virtual buttons and joystick-like control. Ten able-bodied participants tongue-controlled the exoskeleton for a drinking task with and without visual feedback on a screen in three experimental sessions. As a baseline, the participants performed the drinking task with a standard gamepad. The results showed that it was possible to control the exoskeleton with the tongue even without visual feedback and to perform the drinking task at 65.1% of the speed of the gamepad. In a clinical case study, an individual with tetraplegia further succeeded in fully controlling the exoskeleton, performing the drinking task only 5.6% slower than the able-bodied group. This study demonstrated the first single-modal control interface that can enable individuals with complete tetraplegia to fully and continuously control a five-DOF upper-limb exoskeleton and perform a drinking task after only 2 h of training. The interface was used both with and without visual feedback.
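    A minimal sketch of how a joystick-like tongue reading might be turned into exoskeleton motion is given below, assuming a rate-control scheme in which a selected virtual button chooses a DOF and the tongue's deflection sets that DOF's signed velocity. The DOF names, scaling, and data structure are illustrative assumptions, not the actual ITCI-to-exoskeleton mapping used in the study.

```python
from dataclasses import dataclass

# Illustrative DOF ordering for a five-DOF upper-limb exoskeleton (assumed names).
DOFS = ["shoulder_flexion", "shoulder_abduction", "shoulder_rotation",
        "elbow_flexion", "wrist_rotation"]

@dataclass
class TongueSample:
    dof_index: int      # which virtual button (row) the tongue is activating
    deflection: float   # normalized -1..1 displacement along the sensor pad

def velocity_command(sample: TongueSample, max_speed_rad_s: float = 0.5) -> dict:
    """Rate control: only the selected DOF moves, at a speed set by tongue deflection."""
    deflection = max(-1.0, min(1.0, sample.deflection))   # clamp to valid range
    cmd = [0.0] * len(DOFS)
    cmd[sample.dof_index] = deflection * max_speed_rad_s  # rad/s for the chosen joint
    return dict(zip(DOFS, cmd))

# Usage example: tongue pushed 80% forward on the elbow-flexion button.
print(velocity_command(TongueSample(dof_index=3, deflection=0.8)))
```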