
    Exploiting fashion x-commerce through the empowerment of voice in the fashion virtual reality arena. Integrating voice assistant and virtual reality technologies for fashion communication

    The ongoing development of eXtended Reality (XR) technologies is driving a rapid increase in their performance along with a progressive decrease in their cost, making them more and more attractive to a large class of consumers. As a result, their widespread use is expected within the next few years. This may foster new opportunities for e-commerce strategies, giving birth to an XR-based commerce (x-commerce) ecosystem. Compared with web- and mobile-based shopping experiences, x-commerce could more easily support brick-and-mortar store-like experiences. One well-established example is the interaction between customers and shop assistants inside fashion stores. In this work, we concentrate on such aspects through the design and implementation of an XR-based shopping experience that supports vocal dialogues with an Amazon Alexa virtual assistant, to experiment with a more natural and familiar contact with the store environment. To verify the validity of this approach, we asked a group of fashion experts to try two different XR store experiences: with and without the voice assistant integration. The users were then asked to answer a questionnaire to rate their experiences. The results support the hypothesis that vocal interactions may contribute to increasing the acceptance and comfortable perception of XR-based fashion shopping.

    Understanding Hand Interactions and Mid-Air Haptic Responses within Virtual Reality and Beyond.

    Hand tracking has long been seen as a futuristic interaction, firmly situated in the realms of sci-fi. Recent developments and technological advancements have brought that dream into reality, allowing for real-time interactions by naturally moving and positioning the hand. While these developments have enabled numerous research projects, it is only recently that businesses and devices have truly started to implement and integrate the technology across their different sectors. Numerous devices are shifting towards a fully self-contained ecosystem, where the removal of controllers could significantly help in reducing barriers to entry. Prior studies have focused on the effects of hand tracking or possible areas for its implementation, but have rarely focused on direct comparisons of technologies, nor have they attempted to reproduce lost capabilities. Against this background, the work presented in this thesis aims to understand the benefits and drawbacks of hand tracking when treated as the primary interaction method within virtual reality (VR) environments. Coupled with this, the implementation and usage of novel mid-air ultrasound-based haptics attempt to reintroduce feedback that would otherwise have been achieved through conventional controller interactions. Two user studies were undertaken, testing core underlying interactions within VR that represent common instances found throughout simulations. The first study focuses on the interactions presented within 3D VR user interfaces, with a core topic of buttons, while the second study directly compares input and haptic modalities within two different fine motor skill tasks. These studies are coupled with the development and implementation of a real-time user study recording toolkit, allowing for significantly heightened user analysis and visual evaluation of interactions.
Results from these studies and developments make valuable contributions to the research and business knowledge of hand-tracking interactions, and provide a uniquely valuable open-source toolkit for other researchers to use. This thesis covers work undertaken at Ultraleap over varying projects between 2018 and 2021.