4,941 research outputs found

    Smartphone Augmented Reality Applications for Tourism

    Invisible, attentive and adaptive technologies that provide tourists with relevant services and information anytime and anywhere may no longer be a vision of the future. The new display paradigm, stemming from the synergy of new mobile devices, context-awareness and AR, has the potential to enhance tourists' experiences and make them exceptional. However, effective and usable design is still in its infancy. In this publication we present an overview of current smartphone AR applications, outlining tourism-related, domain-specific design challenges. This study is part of an ongoing research project aiming at developing a better understanding of the design space for smartphone context-aware AR applications for tourists.

    Synergy: An Energy Monitoring and Visualization System

    The key to becoming a more sustainable society is first learning to take responsibility for the role we play in energy consumption. Real-time energy usage gives energy consumers a sense of responsibility over what they can do to accomplish a much larger goal for the planet and, practically speaking, what they can do to lower the cost to their wallets. Synergy is an energy monitoring and visualization system that enables users to gather information about the energy consumption in a building, small or large, and display that data for the user in real time. The gathered energy usage data is processed on the edge before being stored in the cloud. The two main benefits of edge processing are issuing electricity hazard warnings immediately and preserving user privacy. In addition to being a scalable solution intended for use in individual households, commercial offices and city power grids, Synergy is open source so that it can be implemented more widely. This paper contains a system overview as well as initial findings based on the data collected by Synergy, before assessing the impact the system can have on society.
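The edge-processing step the abstract describes can be sketched roughly as follows. This is an illustrative reading, not the authors' implementation; the threshold value and all names (`HAZARD_WATTS`, `process_reading`) are assumptions.

```python
# Hypothetical sketch: check each reading locally so hazard warnings are
# immediate, and strip identifying detail before anything goes to the cloud.

HAZARD_WATTS = 3600.0  # assumed threshold for an electricity hazard warning

def process_reading(watts, device_id):
    """Return (warning, cloud_record).

    The warning is raised on the edge, with no round-trip to the cloud;
    the cloud record omits the device identity to preserve user privacy.
    """
    warning = None
    if watts > HAZARD_WATTS:
        warning = f"hazard: {device_id} drawing {watts:.0f} W"
    cloud_record = {"watts": round(watts, 1)}  # anonymised payload
    return warning, cloud_record

warn, record = process_reading(4200.0, "kitchen-oven")
```

Keeping the hazard check ahead of the upload path is what makes the warning latency independent of network conditions.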

    Advancements in combining electronic animal identification and augmented reality technologies in digital livestock farming

    Modern livestock farm technologies allow operators to access a multitude of data thanks to the high number of mobile and fixed sensors available on both the livestock farming machinery and the animals. These data can be consulted via PC, tablet, and smartphone, which must be handheld by the operators, increasing the time needed for on-field activities. In this scenario, the use of augmented reality smart glasses could allow the visualization of data directly in the field, providing a hands-free environment for the operator to work in. Nevertheless, to visualize specific animal information, a connection between the augmented reality smart glasses and electronic animal identification is needed. Therefore, the main objective of this study was to develop and test a wearable framework, called SmartGlove, that is able to link RFID animal tags and augmented reality smart glasses via a Bluetooth connection, allowing the visualization of specific animal data directly in the field. A further objective was to compare different levels of augmented reality technology (assisted reality vs. mixed reality) to assess the most suitable solution for livestock management scenarios. For this reason, the developed framework and the related augmented reality smart glasses applications were tested in the laboratory and in the field. Furthermore, the stakeholders' point of view was analyzed using two standard questionnaires, the NASA Task Load Index and the IBM Post-Study System Usability Questionnaire. The outcomes of the laboratory tests showed promising operating performance of the developed framework, with no significant differences compared to a commercial RFID reader. During the on-field trial, all the tested systems were capable of performing the task in a short time frame. Furthermore, the operators underlined the advantages of using the SmartGlove system coupled with the augmented reality smart glasses for the direct on-field visualization of animal data.
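The tag-to-display link at the core of such a framework can be sketched as below. This is our illustration, not the SmartGlove code; the herd records, EPC format, and the `send` callback (standing in for the Bluetooth channel to the glasses) are all hypothetical.

```python
# Illustrative sketch: an RFID tag read is resolved to animal data and
# forwarded to the AR glasses through a send() callback (here standing in
# for the Bluetooth link the abstract describes).

ANIMAL_DB = {  # hypothetical herd records keyed by RFID EPC
    "982 000123456789": {"name": "Cow 42", "last_milking_kg": 14.2},
}

def on_tag_read(epc, send):
    """Look up the scanned tag and push a display payload to the glasses."""
    record = ANIMAL_DB.get(epc)
    if record is None:
        send({"error": f"unknown tag {epc}"})
        return False
    send({"tag": epc, **record})  # glasses render this hands-free
    return True

messages = []  # stands in for the glasses' inbound queue
on_tag_read("982 000123456789", messages.append)
```

Decoupling the reader from the display through a callback is what lets the same framework drive either assisted-reality or mixed-reality glasses.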

    Collision warning design in automotive head-up displays

    In the last few years, the automotive industry has experienced large growth in hardware and the underlying electronics. The industry benefits from both Human Machine Interface (HMI) research and modern technology. Advanced Driver Assistance Systems (ADAS) have many applications, and their positive impact on drivers is even greater. Forward Collision Warning (FCW) is one such application. Over the last decades, different approaches and tools have been used to implement FCW systems, and current Augmented Reality (AR) applications are feasible to integrate into modern cars. In this thesis work, we introduce three different FCW designs: static, animated and 3D animated warnings. We test the proposed designs in three different environments: day, night and rain. The static and animated designs achieve a minimum response time of 0.486 s, whereas the 3D animated warning achieves 1.153 s.
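A minimal sketch of the decision such FCW designs build on: warn when the time-to-collision falls below the driver's response time plus a safety margin. The formula is the standard constant-speed TTC; the margin value and function names are our assumptions, not the thesis's.

```python
# Sketch of a forward-collision-warning trigger based on time-to-collision
# (TTC). The 1.0 s margin is illustrative, not from the thesis.

def time_to_collision(gap_m, closing_speed_mps):
    """Constant-speed TTC: gap / closing speed (inf if not closing)."""
    if closing_speed_mps <= 0:
        return float("inf")
    return gap_m / closing_speed_mps

def should_warn(gap_m, closing_speed_mps, response_time_s, margin_s=1.0):
    """Warn while the driver still has time to react and brake."""
    return time_to_collision(gap_m, closing_speed_mps) <= response_time_s + margin_s
```

With the measured 0.486 s response time, a 20 m gap closing at 15 m/s (TTC ≈ 1.33 s) would trigger the warning; a slower 1.153 s response simply requires the warning to fire at a longer TTC.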

    Augmented Reality Based Assisted Healthcare for Enhancing Medical Rescue and Doctor-Patient Consultations with AR-Headset

    Technological progress in the healthcare sector has created new opportunities to enhance surgical precision and patient care. This study explores how augmented reality (AR) technology can help the medical industry close the gap between the digital and physical realms. AR has the potential to improve communication between doctors and patients, make complex medical information easier to understand, and help medical professionals diagnose and treat patients more successfully. It can reduce the need for printed pictures and diagrams by superimposing digital information onto a patient's body, giving surgeons a clearer understanding of the anatomy they are working on. The purpose of this research paper is to develop and deploy an assistive system that uses AR technology to give medical professionals and rescue teams vital information. The solution smoothly combines digital data with the real-world perspective by integrating semi-transparent glasses into AR headsets. The main goal is to enhance patient-physician consultations and improve the understanding of complicated medical information. The creation, application, and possible effects of this AR-based system in the healthcare sector are covered in this study.

    The sweet smell of success: Enhancing multimedia applications with olfaction

    Copyright © 2012 ACM. Olfaction, or smell, is one of the last challenges which multimedia applications have to conquer. As far as computerized smell is concerned, there are several difficulties to overcome, particularly those associated with the ambient nature of smell. In this article, we present results from an empirical study exploring users' perception of olfaction-enhanced multimedia displays. Findings show that olfaction significantly adds to the user's multimedia experience. Moreover, use of olfaction leads to an increased sense of reality and relevance. Our results also show that users are tolerant of the interference and distortion effects caused by olfactory effects in multimedia.

    Augmented Reality and Mobile Systems for Heavy Equipment Operators in Surface Mining

    U.S. federal laws mandate that mining companies ensure a safe workplace, implement approved training programs, and promptly report work-related injuries. The mining industry's commitment to innovation reflects a history of adopting advancements to enhance environmental sustainability, workplace safety, and overall productivity, while simultaneously reducing operational costs. This thesis proposes the integration of Augmented Reality (AR) technology and digital applications to enhance the surface mining industry, presenting two innovative solutions: an AR Training System and an Operational Digital System. These business solutions have been developed and applied at a surface mine in the southwest of the US and have the potential to improve the mining industry by enhancing safety, training, operational efficiency, and data-driven decision-making, which represents a significant step toward a more sustainable, effective, and technologically driven mining sector, contributing to the industry's evolution and growth. The AR Training System leverages Microsoft's Power Platform and HoloLens 2 capabilities to provide operators with immersive, step-by-step training guides in real working conditions for Dozers, Motor Graders, and End Dump trucks. These AR guides combine 3D models, videos, photos, and interactive elements overlaid on mining equipment to enhance learning and safety. The system also offers an efficient approach to data collection during operator training, which makes it possible to adapt the training guides based on user performance. The Operational Digital System, in turn, addresses the industry's operational challenges. It streamlines the pre-operation inspection process, tracks equipment status, and speeds up the recording of defects, shift timing, delays, and loaded tonnage. The system offers a holistic approach to mining operation optimization, facilitating data sharing and management among different departments, enhancing collaboration, and expediting maintenance processes.

    Beyond multimedia adaptation: Quality of experience-aware multi-sensorial media delivery

    Multiple sensorial media (mulsemedia) combines multiple media elements which engage three or more human senses and, like most other media content, requires support for delivery over existing networks. This paper proposes an adaptive mulsemedia framework (ADAMS) for delivering scalable video and sensorial data to users. Unlike existing two-dimensional joint source-channel adaptation solutions for video streaming, the ADAMS framework includes three joint adaptation dimensions: video source, sensorial source, and network optimization. Using an MPEG-7 description scheme, ADAMS recommends the integration of multiple sensorial effects (e.g., haptic, olfaction, air motion) as metadata into multimedia streams. The ADAMS design includes both coarse- and fine-grained adaptation modules on the server side: mulsemedia flow adaptation and packet priority scheduling. Feedback from subjective quality evaluation and network conditions is used to drive the two modules. The subjective evaluation investigated users' enjoyment levels when exposed to mulsemedia and multimedia sequences, respectively, and studied users' preference levels for certain sensorial effects in the context of mulsemedia sequences with video components at different quality levels. Results of the subjective study inform guidelines for an adaptive strategy that selects the optimal combination of video segments and sensorial data for a given bandwidth constraint and user requirement. User perceptual tests show how ADAMS outperforms existing multimedia delivery solutions in terms of both user-perceived quality and user enjoyment during adaptive streaming of various mulsemedia content. In doing so, it highlights the case for tailored, adaptive mulsemedia delivery over traditional multimedia adaptive transport mechanisms.
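The kind of selection such an adaptive strategy performs, i.e. choosing a video quality and a set of sensorial effects under a bandwidth budget, can be sketched greedily as below. The bitrates, utility scores, and function name are invented for illustration; ADAMS itself uses subjective-study guidelines rather than these numbers.

```python
# Hedged sketch of bandwidth-constrained mulsemedia adaptation: pick the
# best video layer that fits, then add sensorial effects by utility.
# All rates (kbps) and utility scores below are made up.

VIDEO_LAYERS = [("base", 1000, 3.0), ("medium", 2500, 4.0), ("high", 5000, 4.6)]
EFFECTS = [("haptic", 64, 0.5), ("olfaction", 16, 0.4), ("air-motion", 32, 0.3)]

def select(bandwidth_kbps):
    """Return the chosen layer and effects, or None if nothing fits."""
    best = None
    for name, rate, utility in VIDEO_LAYERS:  # layers sorted by quality
        if rate <= bandwidth_kbps:
            best = (name, rate)
    if best is None:
        return None
    name, rate = best
    chosen, budget = [name], bandwidth_kbps - rate
    for ename, erate, eutil in sorted(EFFECTS, key=lambda e: -e[2]):
        if erate <= budget:       # effect fits the remaining budget
            chosen.append(ename)
            budget -= erate
    return chosen
```

Because sensorial metadata is tiny next to video, the interesting trade-off the paper studies is when dropping a video layer to keep effects actually raises user enjoyment.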

    Promoting reality awareness in virtual reality through proxemics

    Head-mounted virtual reality (VR) systems provide fully immersive experiences and completely isolate users from the outside world, potentially placing them in unsafe situations. Existing research has proposed different alert-based solutions to address this. Our work builds on these studies of notification systems for VR environments from a different perspective. We focus on: (i) exploring alert systems to notify VR users about non-immersed bystanders in socially related, non-critical interaction contexts; (ii) understanding how best to provide awareness of non-immersed bystanders while maintaining presence and immersion within the virtual environment (VE). To this end, we developed single and combined alert cues, leveraging proxemics, perception channels, and push/pull approaches, and evaluated them via two user studies. Our findings indicate a strong preference towards maintaining immersion and towards combining audio and visual cues with push and pull notification techniques that evolve dynamically based on proximity.
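A proxemics-driven cue that escalates with proximity, as the findings suggest, can be sketched as below. The zone boundaries follow Hall's classic proxemic distances; the cue names and mapping are our illustration, not the paper's implementation.

```python
# Illustrative sketch: map a bystander's proxemic zone (Hall's distances)
# to an escalating alert cue, from no interruption to full passthrough.

def alert_for_distance(meters):
    """Escalate cues as a non-immersed bystander approaches the VR user."""
    if meters > 3.6:    # public zone: preserve immersion, no cue
        return None
    if meters > 1.2:    # social zone: subtle audio cue (pull)
        return {"audio": "soft-chime", "visual": None}
    if meters > 0.45:   # personal zone: combined audio + visual overlay
        return {"audio": "chime", "visual": "silhouette"}
    # intimate zone: forceful push cue that breaks immersion for safety
    return {"audio": "alert", "visual": "passthrough"}
```

Tying cue intensity to zone crossings, rather than raw distance, keeps the alerts stable and matches the preference for staying immersed until awareness is actually needed.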