13 research outputs found
Personalising Applications to Influence Health-Related Behaviour: An Exploration of Differences in Motivation
To support health-related behaviour changes, consumers may use technologies such as smartphones, smartbands, sensors and other devices connected to the Internet of Things. Research has shown that personalising the interaction, including the interface, data, and feedback, can result in more effective outcomes in terms of the desired changes in behaviour. This paper reports on a pilot study that tested a smartphone step challenge application that was personalised based on the user's motivational style using the Behavioural Inhibition System/Behavioural Approach System (BIS/BAS) scales of Reinforcement Sensitivity Theory. The results indicated that participation in the step challenge did change the behaviour of the participants. For half the days of the challenge, the application delivered pep talks tailored to the two motivational styles and to the participant's behaviour (taking more or fewer steps than on the previous day). While the study found that participants with different motivational styles responded differently to the motivational cues (pep talks), their responses did not appear to be influenced by the personalisation of the pep talks.
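As a concrete illustration of the kind of rule such an application might apply, the minimal Python sketch below selects a pep talk from the participant's dominant BIS/BAS style and from whether today's step count exceeds yesterday's. The function name, scores and messages are hypothetical assumptions for illustration and are not taken from the paper.

```python
# Hypothetical sketch of how a step-challenge app might choose a tailored
# pep talk; scores, thresholds and message texts are illustrative only.

def select_pep_talk(bis_score: float, bas_score: float,
                    steps_today: int, steps_yesterday: int) -> str:
    """Pick a message based on dominant motivational style (BIS vs. BAS)
    and whether the participant walked more or fewer steps than yesterday."""
    improving = steps_today > steps_yesterday
    style = "BAS" if bas_score >= bis_score else "BIS"

    messages = {
        # BAS (approach-oriented): emphasise rewards and goals.
        ("BAS", True):  "Great progress - keep chasing that next milestone!",
        ("BAS", False): "A new personal best is still within reach today.",
        # BIS (inhibition-oriented): emphasise avoiding setbacks.
        ("BIS", True):  "Nice work - you are staying on track with your plan.",
        ("BIS", False): "A short walk now helps you avoid falling behind.",
    }
    return messages[(style, improving)]


if __name__ == "__main__":
    print(select_pep_talk(bis_score=3.1, bas_score=2.4,
                          steps_today=5400, steps_yesterday=6200))
```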
Feasibility Study of Ubiquitous Interaction Concepts
There are all sorts of consumer electronics in a home environment. Using "apps" to interact with each device is neither feasible nor practical in a ubicomp future. Prototyping and evaluating interaction concepts for this future is a challenge. This paper proposes four concepts for device discovery and device interaction implemented in a virtual environment. The interaction concepts were compared in a controlled experiment for evaluation and comparison. Some statistically significant differences and subjective preferences could be observed in the quantitative and qualitative data, respectively. Overall, the results indicate that the proposed interaction concepts were found to be natural and easy to use.
D3.1 User expectations and cross-modal interaction
This document is deliverable D3.1 "User expectations and cross-modal interaction" and presents user studies to understand expectations of and reactions to content presentation methods for mobile AR applications, together with recommendations to realize an interface and interaction design in accordance with user needs or disabilities.
Personal Shopping Assistance and Navigator System for Visually Impaired People
In this paper, a personal assistant and navigator system for visually impaired people is described. The showcase presented intends to demonstrate how partially sighted people could be aided by technology in performing an ordinary activity, such as going to a mall and moving inside it to find a specific product. We propose an Android application that integrates Pedestrian Dead Reckoning and Computer Vision algorithms, using an off-the-shelf smartphone connected to a smartwatch. The detection, recognition and pose estimation of specific objects or features in the scene, combined with a hardware-sensor pedometer, yield an estimate of user location with sub-meter accuracy. The proposed prototype interfaces with the user by means of Augmented Reality, exploring sensory modalities other than just visual overlay, namely audio and haptics, to create a seamless, immersive user experience. The interface and interaction of the preliminary platform have been studied through specific evaluation methods, and the feedback gathered will be taken into consideration to further improve the proposed system.
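To illustrate the general idea of combining a pedometer-driven Pedestrian Dead Reckoning (PDR) estimate with an absolute fix from vision-based pose estimation, here is a minimal, hypothetical Python sketch. The 2D positions, fixed fusion weight and function names are assumptions for illustration and do not reproduce the authors' algorithm; a real system might use a Kalman filter rather than this crude weighted blend.

```python
# Minimal sketch of fusing a drift-prone PDR estimate with an absolute
# vision-based fix; all names and values are illustrative assumptions.

import math
from dataclasses import dataclass


@dataclass
class Position:
    x: float  # metres
    y: float  # metres


def pdr_step(prev: Position, step_length_m: float, heading_rad: float) -> Position:
    """Advance the PDR estimate by one detected step along the current heading."""
    return Position(prev.x + step_length_m * math.cos(heading_rad),
                    prev.y + step_length_m * math.sin(heading_rad))


def fuse(pdr: Position, vision: Position, vision_weight: float = 0.7) -> Position:
    """Blend the PDR estimate with an absolute vision fix (crude complementary filter)."""
    w = vision_weight
    return Position(w * vision.x + (1 - w) * pdr.x,
                    w * vision.y + (1 - w) * pdr.y)


if __name__ == "__main__":
    pos = Position(0.0, 0.0)
    pos = pdr_step(pos, step_length_m=0.7, heading_rad=math.radians(45))
    # When a known object (e.g. a product shelf) is recognised, its pose
    # yields an absolute position fix that corrects accumulated PDR drift.
    pos = fuse(pos, vision=Position(0.6, 0.4))
    print(f"fused position: ({pos.x:.2f}, {pos.y:.2f}) m")
```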
WozARd: A Wizard of Oz Tool for Mobile AR
Wizard of Oz methodology is useful when conducting user studies of a system that is in early development. It is essential to be able to simulate part of the system and to collect feedback from potential users, and using a human to act as the system is one way to do this. The Wizard of Oz tool presented here is called WozARd, and it aims to offer a set of tools that help the test leader control the visual, tactile and auditory output that is presented to the test participant. Additionally, it is suitable for use in an augmented reality environment where images are overlaid on the phone's camera view or on glasses. The main features identified as necessary include presentation of media such as images, video and sound, navigation and location-based triggering, automatically taking photos, the capability to log test results and visual feedback, and integration with the Sony SmartWatch for interaction possibilities.
WozARd: A Wizard of Oz Method for Wearable Augmented Reality Interaction - A Pilot Study
Head-mounted displays and other wearable devices open up innovative types of interaction for wearable augmented reality (AR). However, to design and evaluate these new types of AR user interfaces, it is essential to quickly simulate undeveloped components of the system and collect feedback from potential users early in the design process. One way of doing this is the Wizard of Oz (WOZ) method. The basic idea behind WOZ is to create the illusion of a working system by having a human operator perform some or all of the system's functions. WozARd is a WOZ method developed for wearable AR interaction. The presented pilot study was an initial investigation of the capability of the WozARd method to simulate an AR city tour. Qualitative and quantitative data were collected from 21 participants performing a simulated AR city tour. The data analysis focused on seven categories that can have an impact on how the WozARd method is perceived by participants: precision, relevance, responsiveness, technical stability, visual fidelity, general user experience, and human-operator performance. Overall, the results indicate that the participants perceived the simulated AR city tour as a relatively realistic experience despite a certain degree of technical instability and human-operator mistakes.
A Prototyping Method to Simulate Wearable Augmented Reality Interaction in a Virtual Environment - A Pilot Study
Recently, we have seen an intensified development of head-mounted displays (HMDs). Some observers believe that the HMD form factor facilitates Augmented Reality (AR) technology, a technology that mixes virtual content with the users' view of the world around them. One of many interesting use cases that illustrate this is a smart home in which a user can interact with consumer electronic devices through a wearable AR system. Building prototypes of such wearable AR systems can be difficult and costly, since it involves a number of different devices and systems with varying technological readiness levels. The ideal prototyping method for this should offer high fidelity at a relatively low cost and the ability to simulate a wide range of wearable AR use cases. This paper presents a proposed method, called IVAR (Immersive Virtual AR), for prototyping wearable AR interaction in a virtual environment (VE). IVAR was developed in an iterative design process that resulted in a testable setup in terms of hardware and software. Additionally, a basic pilot experiment was conducted to explore what it means to collect quantitative and qualitative data with the proposed prototyping method. The main contribution is that IVAR shows potential to become a useful wearable AR prototyping method, but several challenges remain before meaningful data can be produced in controlled experiments. In particular, tracking technology needs to improve, with regard to both intrusiveness and precision.