6,577 research outputs found

    Editorial: Teaching and learning between virtuality and reality

    Discussions about the digitisation of the professional and working world assume that virtual teaching and learning will increasingly merge with real teaching and learning, resulting in new, innovative technology-based teaching and learning approaches. Against this background, this article addresses technology-based realms of experience and considers the challenges associated with the use of virtual teaching and learning environments in vocational education and training. Keywords: Virtual reality, professionalisation, teacher training, digitalisation

    Shifting the Focus: The Role of Presence in Reconceptualising the Design Process

    In this paper the relationship between presence and imaging is examined with a view to establishing how our understanding of imaging, and subsequently the design process, may be reconceptualised to give greater focus to its experiential potential. First, the paper outlines the research project contributing to the discussion. Then, it provides brief overviews of research on both imaging and presence in the design process, highlighting the narrow conceptions of imaging (and the recognition of the need for further research) compared to the more holistic and experiential understandings of presence. The paper concludes with an argument and proposed study for exploring the role of digital technology and presence in extending the potential of imaging and its role in the design process. As indicated in the DRS Conference Theme, this paper focuses “…on what people experience and the systems and actions that create those experiences.” Interface designers, information architects and interactive media artists understand the powerful influence of experience in design. ‘Experience design’ is a community of practice driven by individuals within digitally based disciplines who believe that understanding people is essential to any successful design in any medium and that “…experience is the personal connection with the moment and… every aspect of living is an experience, whether we are the creators or simply chance participants” (Shedroff, 2001, p. 5). Keywords: Design, Design Process, Presence, Imaging, Grounded Theory

    Designing technologies for playful interspecies communication

    This one-day workshop examines how we might use technologies to support design for playful interspecies communication and considers some of the potential implications. Here we explore aspects of playful technology and reflect on what opportunities computers can provide for facilitating communication between species. The workshop's focal activity will be the co-creation of theoretical systems designed for specific multi-species scenarios. Through our activities, we aim to pave the way for designing technology that promotes interspecies communication, drawing input not only from ACI practitioners but also from members of the broader HCI and animal science communities, who may be stakeholders in facilitating, expanding, and/or redefining playful technology.

    Primate drum kit: A system for studying acoustic pattern production by non-human primates using acceleration and strain sensors

    The possibility of achieving experimentally controlled, non-vocal acoustic production in non-human primates is a key step towards testing a number of hypotheses on primate behavior and cognition. However, no device or solution is currently available, with the use of sensors in non-human animals being almost exclusively devoted to applications in the food industry and animal surveillance. Specifically, no device exists which simultaneously allows: (i) spontaneous production of sound or music by non-human animals via object manipulation, (ii) systematic recording of data sensed from these movements, and (iii) the possibility of altering the acoustic feedback properties of the object by remote control. We present two prototypes developed for use with chimpanzees (Pan troglodytes) which, while fulfilling the aforementioned requirements, allow sounds to be arbitrarily associated with physical object movements. The prototypes differ in sensing technology, cost, intended use and construction requirements. One prototype uses four piezoelectric elements embedded between layers of Plexiglas and foam; strain data is sent to a computer running Python through an Arduino board. A second prototype consists of a modified Wii Remote contained in a gum toy; acceleration data is sent via Bluetooth to a computer running Max/MSP. We successfully pilot-tested the first device with a group of chimpanzees. We foresee using these devices for a range of cognitive experiments. © 2013 by the authors; licensee MDPI, Basel, Switzerland.
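    To make the data path of the first prototype concrete, the sketch below shows one plausible way the strain readings could be consumed on the computer side: a Python loop that reads comma-separated values from the Arduino's serial link and triggers a mapped sound when any of the four piezo channels exceeds a threshold. The port name, frame format, threshold and playback stub are assumptions for illustration; the abstract does not describe the actual protocol.

    # Hedged sketch, not the authors' implementation: consume strain frames
    # from the Arduino over serial (pyserial) and fire a sound per "hit".
    import serial  # pip install pyserial

    PORT = "/dev/ttyACM0"   # assumed Arduino serial port
    BAUD = 115200           # assumed baud rate
    HIT_THRESHOLD = 300     # assumed strain value that counts as a hit

    def play_sound(sensor_index: int) -> None:
        # Placeholder for the remotely reconfigurable audio feedback;
        # here we only log which piezo channel was struck.
        print(f"hit on sensor {sensor_index} -> play mapped sound")

    def main() -> None:
        with serial.Serial(PORT, BAUD, timeout=1) as link:
            while True:
                # Assumed frame format: "s1,s2,s3,s4\n" with integer strain values.
                line = link.readline().decode("ascii", errors="ignore").strip()
                if not line:
                    continue
                try:
                    strains = [int(v) for v in line.split(",")]
                except ValueError:
                    continue  # skip malformed frames
                for i, value in enumerate(strains):
                    if value > HIT_THRESHOLD:
                        play_sound(i)

    if __name__ == "__main__":
        main()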

    Perceptual Issues Improve Haptic Systems Performance

    More playful user interfaces: interfaces that invite social and physical interaction

    3D gaze cursor: continuous calibration and end-point grasp control of robotic actuators

    © 2016 IEEE. Eye movements are closely related to motor actions and hence can be used to infer motor intentions. Additionally, eye movements are in some cases the only means of communication and interaction with the environment for paralysed and impaired patients with severe motor deficiencies. Despite this, eye-tracking technology still has very limited use as a human-robot control interface, and its applicability is largely restricted to simple 2D tasks that operate on screen-based interfaces and do not suffice for natural physical interaction with the environment. We propose that decoding the gaze position in 3D space rather than in 2D results in a much richer spatial cursor signal that allows users to perform everyday tasks such as grasping and moving objects via gaze-based robotic teleoperation. Eye-tracking calibration in 3D is usually slow; we demonstrate here that by using a full 3D trajectory for system calibration, generated by a robotic arm rather than a simple grid of discrete points, gaze calibration in three dimensions can be achieved quickly and with high accuracy. We perform the non-linear regression from eye image to 3D end point using Gaussian Process regressors, which allows us to handle uncertainty in end-point estimates gracefully. Our telerobotic system uses a multi-joint robot arm with a gripper and is integrated with our in-house GT3D binocular eye tracker. This prototype system has been evaluated in a test environment with 7 users, yielding gaze-estimation errors of less than 1 cm in the horizontal, vertical and depth dimensions, and less than 2 cm in overall 3D Euclidean distance. Users reported intuitive, low-cognitive-load control of the system from their first trial and were immediately able to look at an object and, with a wink, command the robot gripper to grasp it.
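    The calibration idea described above (fitting Gaussian Process regressors from eye data to 3D end points sampled along a robot-arm trajectory) can be illustrated with a short sketch. The feature layout, kernel choice, placeholder data and use of scikit-learn are assumptions for illustration, not the authors' implementation.

    # Hedged sketch: GP regression from binocular eye features to a 3D gaze
    # end point, with a predictive standard deviation for handling uncertainty.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    # Calibration data: assumed 4-D eye features (e.g. left/right pupil
    # coordinates) paired with known 3D gripper positions along the arm's path.
    rng = np.random.default_rng(0)
    eye_features = rng.uniform(-1.0, 1.0, size=(200, 4))   # placeholder features
    gripper_xyz = rng.uniform(0.0, 0.5, size=(200, 3))     # placeholder targets, metres

    kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=1e-3)
    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
    gp.fit(eye_features, gripper_xyz)

    # At run time: map a new eye-feature vector to a 3D gaze cursor; the
    # predictive standard deviation gives downstream grasp control a way to
    # treat uncertain end-point estimates cautiously.
    new_features = rng.uniform(-1.0, 1.0, size=(1, 4))
    xyz, std = gp.predict(new_features, return_std=True)
    print("estimated gaze end point (m):", xyz[0], "+/-", std[0])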

    Psychophysiological Assessment Of Fear Experience In Response To Sound During Computer Video Gameplay
