44 research outputs found

    XRSpotlight: Example-based Programming of XR Interactions using a Rule-based Approach

    Research on enabling novice AR/VR developers has emphasized the need to lower the technical barriers to entry. This is often achieved by providing new authoring tools that offer simpler means of implementing XR interactions through abstraction. However, novices are then bound by the ceiling of each tool and may not form a correct mental model of how interactions are implemented. We present XRSpotlight, a system that supports novices by curating a list of the XR interactions defined in a Unity scene and presenting them as rules in natural language. Our approach is based on a model abstraction that unifies existing XR toolkit implementations. Using our model, XRSpotlight can find incomplete specifications of interactions, suggest similar interactions, and copy-paste interactions from examples that use different toolkits. We assess the validity of our model with professional VR developers and demonstrate that XRSpotlight helps novices understand how XR interactions are implemented in examples and apply this knowledge in their own projects.
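The abstract describes unifying toolkit-specific XR interactions as natural-language rules and flagging incomplete specifications. A minimal sketch of that idea, with illustrative names (`Rule`, `to_sentence`) that are assumptions rather than XRSpotlight's actual API:

```python
from dataclasses import dataclass

@dataclass
class Rule:
    event: str    # e.g. "is grabbed" -- the triggering event on the object
    target: str   # the scene object the event refers to
    actions: list # natural-language descriptions of the resulting actions

    def to_sentence(self) -> str:
        """Render the rule as a natural-language sentence."""
        acts = " and ".join(self.actions)
        return f"When the {self.target} {self.event}, {acts}."

    def is_complete(self) -> bool:
        """An incomplete specification lacks an event or any action."""
        return bool(self.event and self.actions)

rule = Rule(event="is grabbed", target="lever",
            actions=["the door opens", "a sound plays"])
print(rule.to_sentence())
```

The same representation supports the other features the abstract mentions: rules with matching events can be suggested as similar, and a rule can be copied between scenes regardless of which toolkit produced it.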

    Defining Configurable Virtual Reality Templates for End Users

    This paper proposes a solution for supporting end users in configuring Virtual Reality environments by exploiting reusable templates created by experts. We identify the roles participating in the environment development and the means for delegating part of the behaviour definition to the end users, focusing in particular on enabling end users to define the environment behaviour. The solution exploits a taxonomy of common virtual objects offering high-level actions for specifying event-condition-action rules readable as natural-language sentences; end users exploit such actions to define the environment behaviour. We report on a proof-of-concept implementation of the proposed approach, on its validation through two different case studies (a virtual shop and a museum), and on an evaluation of the approach with expert users.
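An event-condition-action rule fires an action when its event occurs and its condition holds. A minimal sketch of that evaluation loop, assuming a simple dict-based environment state (the data shapes and names here are illustrative, not the paper's implementation):

```python
def make_rule(event, condition, action):
    """Bundle an ECA rule as an event name, a state predicate, and a state update."""
    return {"event": event, "condition": condition, "action": action}

def fire(rules, event, state):
    """On an event, run the actions of matching rules whose condition holds."""
    for rule in rules:
        if rule["event"] == event and rule["condition"](state):
            rule["action"](state)
    return state

# "When the visitor enters the shop, if the shop is open, show a greeting."
rules = [make_rule(
    "visitor_enters",
    condition=lambda s: s["shop_open"],
    action=lambda s: s.update(greeting_shown=True),
)]

state = {"shop_open": True, "greeting_shown": False}
fire(rules, "visitor_enters", state)
print(state["greeting_shown"])  # True
```

The natural-language readability the abstract mentions comes from restricting events, conditions, and actions to the high-level vocabulary of the object taxonomy, so each rule maps directly onto a sentence.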

    Advancements in combining electronic animal identification and augmented reality technologies in digital livestock farming

    Modern livestock farm technologies give operators access to a multitude of data thanks to the high number of mobile and fixed sensors available on both the livestock farming machinery and the animals. These data can be consulted via PC, tablet, and smartphone, which must be handheld by the operators, increasing the time needed for on-field activities. In this scenario, augmented reality smart glasses could allow the visualization of data directly in the field, providing a hands-free working environment for the operator. Nevertheless, to visualize specific animal information, a connection between the augmented reality smart glasses and electronic animal identification is needed. Therefore, the main objective of this study was to develop and test a wearable framework, called SmartGlove, able to link RFID animal tags and augmented reality smart glasses via a Bluetooth connection, allowing the visualization of specific animal data directly in the field. A further objective of the study was to compare different levels of augmented reality technology (assisted reality vs. mixed reality) to assess the most suitable solution for livestock management scenarios. For this reason, the developed framework and the related augmented reality smart glasses applications were tested in the laboratory and in the field. Furthermore, the stakeholders' point of view was analyzed using two standard questionnaires, the NASA Task Load Index and the IBM Post-Study System Usability Questionnaire. The outcomes of the laboratory tests showed promising operating performance for the developed framework, with no significant differences compared to a commercial RFID reader. During the on-field trial, all the tested systems were capable of performing the task in a short time frame. Furthermore, the operators underlined the advantages of using the SmartGlove system coupled with the augmented reality smart glasses for direct on-field visualization of animal data.

    Poster: Programming Rules by Demonstration in Virtual Reality

    This paper proposes a solution based on programming by demonstration that supports end users in defining rules for configuring the behaviour of Virtual Reality environments. The interface allows end users to record the interaction and demonstrate the updates in an immersive mode. After the recording phase, it segments the updates and asks the user to identify which ones are useful for the rule definition and which are not. Finally, the user graphically separates the selected updates into triggers and actions, obtaining the desired configuration rules.
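The record-then-segment workflow above can be sketched as a filter over the recorded timeline followed by the user's trigger/action split. The data shapes here are assumptions for illustration, not the paper's actual representation:

```python
def segment(recording, keep, triggers):
    """Filter a recorded update list, then split it into triggers and actions.

    recording: list of update names in recording order
    keep:      set of updates the user marked as relevant to the rule
    triggers:  subset of kept updates the user classified as triggers
    """
    relevant = [u for u in recording if u in keep]
    return ([u for u in relevant if u in triggers],       # triggers
            [u for u in relevant if u not in triggers])   # actions

# Demonstration: touch a button (trigger), incidental hand movement
# (discarded), light turns on (action).
recording = ["user_touches_button", "hand_idle", "light_turns_on"]
trig, act = segment(recording,
                    keep={"user_touches_button", "light_turns_on"},
                    triggers={"user_touches_button"})
print(trig, act)
```

The resulting pair corresponds to one configuration rule: when the trigger updates occur, replay the action updates.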

    Weft and Warp: the Identity Path (Flussio, Planargia, Sardinia, Italy)

    The town of Flussio is located in the heart of Planargia, a Sardinian sub-region in the north-west of the island. The geomorphological variety has defined not only the character and history of its people but has also shaped the appearance of the communities and of the landscape. The micro-region of the Rio Mannu valley seemed an ideal context for interpreting the genesis factors that the landscape hides in its characteristic patterns and signs. Analyzing these dynamics does not mean casting a nostalgic look at the past, but rather affirming the identity of the communities who live there now: if they are able to know the strength of their roots, they will take an active and conscious part in the development of their territory.

    FeedBucket: Simplified Haptic Feedback for VR and MR

    Standard development libraries for Virtual and Mixed Reality support haptic feedback through low-level parameters, which do not guide developers in creating effective interactions. In this paper, we report preliminary results on a simplified structure for the creation, assignment, and execution of haptic feedback for standard controllers, with the optional feature of synchronizing a haptic pattern to auditory feedback. In addition, we present the results of a preliminary test investigating users' ability to recognize variations in the intensity and/or duration of the stimulus, especially when the two dimensions are combined to encode information.
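One plausible reading of the simplified structure is a haptic pattern as a sequence of (intensity, duration) pulses, optionally derived from an audio amplitude envelope so the vibration follows the sound. All names below are illustrative assumptions, not the FeedBucket API:

```python
from dataclasses import dataclass

@dataclass
class Pulse:
    intensity: float   # 0.0-1.0 motor amplitude
    duration_ms: int   # length of this pulse in milliseconds

def pattern_from_envelope(envelope, step_ms=50):
    """Turn a sampled audio amplitude envelope into a haptic pattern,
    clamping each sample into the valid intensity range."""
    return [Pulse(min(max(a, 0.0), 1.0), step_ms) for a in envelope]

def total_duration(pattern):
    """Total playback time of a pattern in milliseconds."""
    return sum(p.duration_ms for p in pattern)

pattern = pattern_from_envelope([0.2, 0.9, 0.5])
print(total_duration(pattern))  # 150
```

Encoding information in both intensity and duration at once, as tested in the paper, amounts to choosing distinct (intensity, duration) pairs per message and asking whether users can tell the pairs apart.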

    Smart Glove: Development and Testing of a Wearable RFID Reader Connected to Mixed Reality Smart Glasses

    The evolution of precision livestock farming has given farmers access to a large amount of information. However, to access the database, the operator is often forced to use a PC or mobile device, losing time during on-field activities. In this context, mixed reality smart glasses (MRSG), such as HoloLens (Microsoft, USA), represent an interesting tool for consulting information in real time while leaving the operator hands-free. The aim of the study was to close the technological gap between an animal's electronic identification and the information linked to that specific animal. Specifically, the research focused on developing a smart wearable system composed of a prototype, called SmartGlove (SG), capable of linking RFID tags and MRSG via a Bluetooth connection. Thanks to dedicated software, the MRSG display all the information related to the animal stored in an online or offline database. The SG system allows the operator to visualize, monitor, and modify information related to the animal during on-field activities. In preliminary experiments, the system showed promise in reducing the required workforce and improving productivity in farm management: it allowed farmers to activate both ear tags and rumen boluses at a distance of up to 5 cm, and to visualize animal data within 4.3 s for ear tags and 3.8 s for rumen boluses. Future work will aim to improve the reading capability of the SmartGlove by improving the antenna performance, to extend the battery life, and to upgrade the MRSG software.
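The data flow described above reduces to: an RFID scan yields a tag ID, the ID is resolved against the animal database, and the resulting record is displayed on the glasses. A minimal sketch under that assumption (the tag ID, record fields, and function names are illustrative, not the SmartGlove software's actual schema):

```python
# Hypothetical animal database keyed by RFID tag ID (ISO-style numeric string).
ANIMAL_DB = {
    "982000123456789": {"name": "Bella", "breed": "Sarda", "weight_kg": 48},
}

def on_tag_read(tag_id, db=ANIMAL_DB):
    """Resolve a scanned tag to a display string for the smart glasses,
    or report an unknown tag."""
    record = db.get(tag_id)
    if record is None:
        return f"Unknown tag {tag_id}"
    return f"{record['name']} ({record['breed']}), {record['weight_kg']} kg"

print(on_tag_read("982000123456789"))
```

In the real system this lookup sits behind the Bluetooth link: the glove reader pushes the tag ID to the glasses, which query the online or offline database and render the record hands-free.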