"Eyes free" in-car assistance: parent and child passenger collaboration during phone calls
This paper examines routine family car journeys, looking specifically at how passengers assist during a mobile telephone call while the driver addresses the competing demands of handling the vehicle, interacting with various artefacts and controls in the cabin, and engaging in co-located and remote conversations while navigating busy city roads. Based on an analysis of video fragments, we see how drivers and child passengers form their conversations and requests around the call so as to be meaningful and paced to the demands, knowledge and abilities of their co-occupants, and how the conditions of the road and emergent traffic are oriented to and negotiated in the context of the social interaction that they exist alongside. The study provides implications for the design of car-based collaborative media and considers how hands-free and eyes-free natural interfaces could be tailored to the complexity of activities in the car and on the road.
Evaluating the development of wearable devices, personal data assistants and the use of other mobile devices in further and higher education institutions
This report presents a technical evaluation and case studies of the use of wearable and mobile computing devices in further and higher education. The first section provides a technical evaluation of the current state of the art in wearable and mobile technologies and reviews several innovative wearable products that have been developed in recent years. The second section examines three scenarios for further and higher education where wearable and mobile devices are currently being used. The three scenarios include: (i) the delivery of lectures over mobile devices, (ii) the augmentation of the physical campus with a virtual and mobile component, and (iii) the use of PDAs and mobile devices in field studies. The first scenario explores the use of web lectures, including an evaluation of IBM's Web Lecture Services and 3Com's learning assistant. The second scenario explores models for a campus without walls, evaluating the Handsprings to Learning projects at East Carolina University and ActiveCampus at the University of California San Diego. The third scenario explores the use of wearable and mobile devices for field trips, examining the San Francisco Exploratorium's tool for capturing museum visits and the Cybertracker field computer. The third section of the report explores the uses and purposes for wearable and mobile devices in tertiary education, identifying key trends and issues to be considered when piloting the use of these devices in educational contexts.
Expressive haptics for enhanced usability of mobile interfaces in situations of impairments
Designing for situational awareness could lead to better solutions for disabled people; likewise, exploring the needs of disabled people could lead to innovations that address situational impairments. This in turn can create non-stigmatising assistive technology for disabled people from which eventually everyone could benefit. In this paper, we investigate the potential for advanced haptics to complement the graphical user interface of mobile devices, thereby enhancing the user experience both for all people in certain situations (e.g. sunlight interfering with interaction) and for visually impaired people. We explore technical solutions to this problem space and justify our focus on the creation of kinaesthetic force feedback. We propose initial design concepts and studies, with a view to co-creating delightful and expressive haptic interactions with potential users, motivated by scenarios of situational and permanent impairments. Comment: Presented at the CHI'19 Workshop: Addressing the Challenges of Situationally-Induced Impairments and Disabilities in Mobile Interaction, 2019 (arXiv:1904.05382).
Mobile assistive technologies for the visually impaired
There are around 285 million visually impaired people worldwide, and around 370,000 people are registered as blind or partially sighted in the UK. Ongoing advances in information technology (IT) are increasing the scope for IT-based mobile assistive technologies to facilitate the independence, safety, and improved quality of life of the visually impaired. Research is being directed at making mobile phones and other handheld devices accessible via our haptic (touch) and audio sensory channels. We review research and innovation within the field of mobile assistive technology for the visually impaired and, in so doing, highlight the need for successful collaboration between clinical expertise, computer science, and domain users to realize fully the potential benefits of such technologies. We initially reflect on research that has been conducted to make mobile phones more accessible to people with vision loss. We then discuss innovative assistive applications designed for the visually impaired that are either delivered via mainstream devices and can be used while in motion (e.g., mobile phones) or are embedded within an environment that may be in motion (e.g., public transport) or within which the user may be in motion (e.g., smart homes)
Semi-aural Interfaces: Investigating Voice-controlled Aural Flows
To support mobile, eyes-free web browsing, users can listen to "playlists" of web content, called aural flows. Interacting with aural flows, however, requires users to select interface buttons, tethering visual attention to the mobile device even when it is unsafe (e.g. while walking). This research extends the interaction with aural flows through simulated voice commands as a way to reduce visual interaction. This paper presents the findings of a study with 20 participants who browsed aural flows either through a visual interface only or by augmenting it with voice commands. Results suggest that using voice commands reduced the time spent looking at the device by half but yielded similar system usability and cognitive effort ratings as using buttons. Overall, the low cognitive effort engendered by aural flows, regardless of the interaction modality, allowed participants to engage in more non-instructed activities (e.g. looking at the surrounding environment) than instructed activities (e.g. focusing on the user interface).
Design and semantics of form and movement (DeSForM 2006)
Design and Semantics of Form and Movement (DeSForM) grew from applied research exploring emerging design methods and practices to support new-generation product and interface design. The products and interfaces are concerned with the context of ubiquitous computing and ambient technologies and the need for greater empathy in the pre-programmed behaviour of the "machines" that populate our lives. Such explorative research in the CfDR has been led by Young, supported by Kyffin, Visiting Professor from Philips Design, and sponsored by Philips Design over a period of four years (research funding £87k). DeSForM1 was the first of a series of three conferences that enable the presentation and debate of international work within this field: • 1st European conference on Design and Semantics of Form and Movement (DeSForM1), Baltic, Gateshead, 2005, Feijs L., Kyffin S. & Young R.A. eds. • 2nd European conference on Design and Semantics of Form and Movement (DeSForM2), Evoluon, Eindhoven, 2006, Feijs L., Kyffin S. & Young R.A. eds. • 3rd European conference on Design and Semantics of Form and Movement (DeSForM3), New Design School Building, Newcastle, 2007, Feijs L., Kyffin S. & Young R.A. eds. Philips' sponsorship of practice-based enquiry led to research by three teams of research students over three years and ongoing sponsorship of research through the Northumbria University Design and Innovation Laboratory (nuDIL). Young has been invited onto the steering panel of the UK Thinking Digital Conference concerning the latest developments in digital and media technologies. Informed by this research is the work of PhD student Yukie Nakano, who examines new technologies in relation to eco-design textiles.