
    Using Virtual Reality to increase technical performance during rowing workouts

    Technology is advancing rapidly in virtual reality (VR) and in sensors that gather feedback from our bodies and the environments we interact in. Combining the two technologies makes it possible to create personalized, reactive immersive environments. Such environments can be used, for example, for training in dangerous situations (fires, crashes, etc.) or for improving skills with less distraction than a regular natural environment would allow. The pilot study described in this thesis places an athlete rowing on a stationary rowing machine into a virtual environment. The system takes movement data from several sensors on the ergometer and displays it in VR. In addition, technique metrics are derived from the sensor data and physiological data. All of this is used to investigate whether, and to what extent, VR can improve the technical skills of the athlete in the complex sport of rowing. Furthermore, athletes give subjective feedback on their experience, comparing a standard rowing workout with the workout using VR. First results indicate better performance and an enhanced experience for the athlete.

    Multimodal augmented reality tangible gaming

    This paper presents a tangible augmented reality gaming environment that can be used to enhance entertainment through a multimodal tracking interface. Players can interact using different combinations of a pinch glove, a Wiimote, and a six-degrees-of-freedom tracker, through tangible means as well as through I/O controls. Two tabletop augmented reality games have been designed and implemented: a racing game and a pile game. The goal of the augmented reality racing game is to start the car and move around the track without colliding with either the wall or the objects in the gaming arena. Initial evaluation results showed that multimodal interaction can be beneficial in gaming. Based on these results, an augmented reality pile game was implemented with the goal of completing a circuit of pipes (from a starting point to an end point on a grid). Initial evaluation showed that tangible interaction is preferred to keyboard interaction and that tangible games are much more enjoyable.

    Games against health: a player-centered design philosophy

    This paper announces the “Games Against Health” (GAH) research agenda, a criticism of, and response to, the cultural imperialism of the “Games for Health” paradigm. Committed to player-centered design ethics, GAH seeks to dismantle the “games for health” myth as neo-liberal elitist diktat. We acknowledge the values, tastes and pleasures of billions of game players worldwide. We argue that game designers should engage more efficiently in the disimprovement of player health and wellbeing in order to cater to those players’ existing preferences. We hope the paper can serve as a convenient reference for those designing psychotic, sociopathic or antisocial games.

    FlightGoggles: A Modular Framework for Photorealistic Camera, Exteroceptive Sensor, and Dynamics Simulation

    FlightGoggles is a photorealistic sensor simulator for perception-driven robotic vehicles. The key contributions of FlightGoggles are twofold. First, FlightGoggles provides photorealistic exteroceptive sensor simulation using graphics assets generated with photogrammetry. Second, it provides the ability to combine (i) synthetic exteroceptive measurements generated in silico in real time and (ii) vehicle dynamics and proprioceptive measurements generated in motio by vehicle(s) in a motion-capture facility. FlightGoggles is capable of simulating a virtual-reality environment around autonomous vehicle(s). While a vehicle is in flight in the FlightGoggles virtual reality environment, exteroceptive sensors are rendered synthetically in real time, while all complex extrinsic dynamics are generated organically through the natural interactions of the vehicle. The FlightGoggles framework allows researchers to accelerate development by circumventing the need to estimate complex and hard-to-model interactions such as aerodynamics, motor mechanics, battery electrochemistry, and the behavior of other agents. The ability to perform vehicle-in-the-loop experiments with photorealistic exteroceptive sensor simulation facilitates novel research directions involving, e.g., fast and agile autonomous flight in obstacle-rich environments, safe human interaction, and flexible sensor selection. FlightGoggles has been utilized as the main test for selecting the nine teams that will advance in the AlphaPilot autonomous drone racing challenge. We survey approaches and results from the top AlphaPilot teams, which may be of independent interest. Comment: Initial version appeared at IROS 2019. Supplementary material can be found at https://flightgoggles.mit.edu. Revision includes description of new FlightGoggles features, such as a photogrammetric model of the MIT Stata Center, new rendering settings, and a Python API.

    WILL NFTS BE THE BEST DIGITAL ASSET FOR THE METAVERSE?

    The purpose of the current research is to explore the concept of non-fungible tokens, created with blockchain technology, and their applicability in the Metaverse. The Metaverse is a hypothesized iteration of the real world, and it is not a new concept: the idea appeared in the 1990s, and some online games tried to realize a Metaverse world (e.g., Second Life and Cyworld). Nowadays, the Metaverse has become feasible, supported by technologies including virtual reality, augmented reality, cloud computing, the Internet of Things, 5G, and Big Data. In the Metaverse, there is no doubt that residents will trade or transact assets or goods with one another. Our research ties the concept of non-fungible tokens to the Metaverse as the best digital asset for securing the originality of goods there. Detailed discussions are presented in the following sections.

    Analyzing the Impact of Spatio-Temporal Sensor Resolution on Player Experience in Augmented Reality Games

    Along with automating everyday tasks of human life, smartphones have become one of the most popular devices for playing video games because of their interactivity. Smartphones are embedded with various sensors, such as motion sensors or location sensors, that enable the device to adopt new interaction techniques and enhance usability. However, despite their mobility and embedded sensor capacity, smartphones are limited in processing power and display area compared to desktop computers and consoles. When it comes to evaluating Player Experience (PX), players might not have as compelling an experience because the rich graphics environments that a desktop computer can provide are absent on a smartphone. A plausible alternative in this regard is substituting the virtual game world with a real-world game board, perceived through the device camera, by rendering digital artifacts over the camera view. This technology is widely known as Augmented Reality (AR). Smartphone sensors (e.g., GPS, accelerometer, gyroscope, compass) have enhanced the capability for deploying AR technology, and AR has been applied to a large number of smartphone games, including shooters, casual games, and puzzles. Because AR play environments are viewed through the camera, rendering the digital artifacts consistently and accurately is crucial: the digital characters need to move with respect to the sensed orientation, so the accelerometer and gyroscope need to provide sufficiently accurate and precise readings to make the game playable. In particular, determining the pose of the camera in space is vital, as the appropriate angle for viewing the rendered digital characters is determined by the camera pose. This defines how well players will be able to interact with the digital game characters.
    Depending on the Quality of Service (QoS) of these sensors, the Player Experience (PX) may vary, as the rendering of digital characters is affected by noisy sensors causing a loss of registration. Confronting such a problem while developing AR games is difficult in general, as it requires creating a wide variety of game types, narratives, and input modalities, as well as user testing. Moreover, current AR game developers do not have any specific guidelines for developing AR games, and concrete guidelines outlining the tradeoffs between QoS and PX for different genres and interaction techniques are required. My dissertation provides a complete view (a taxonomy) of the spatio-temporal sensor resolution dependency of existing AR games. Four user experiments have been conducted, and one experiment is proposed, to validate the taxonomy and demonstrate the differential impact of sensor noise on gameplay across different genres of AR games and different aspects of PX. This analysis is performed in the context of a novel instrumentation technology that allows the controlled manipulation of QoS on position and orientation sensors. The experimental outcomes demonstrate how the QoS of input sensor noise impacts PX differently across AR game genres, and that the key elements creating this differential impact are the input modality, the narrative, and the game mechanics. Finally, concrete guidelines for regulating sensor QoS are derived as a complete set of instructions for developing different genres of AR games.
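    The controlled degradation of sensor QoS and its effect on registration, as described in this abstract, can be illustrated with a minimal sketch (all names are hypothetical and not taken from the dissertation): an orientation sensor is degraded by Gaussian noise (spatial resolution) and a reduced refresh rate (temporal resolution), and the resulting misplacement of a rendered artifact is measured.

    ```python
    import math
    import random

    class DegradedOrientationSensor:
        """Wraps a ground-truth heading (radians) and degrades its QoS:
        Gaussian noise models jitter, while `period` models reduced temporal
        resolution (the reading only refreshes every `period` queries)."""

        def __init__(self, noise_std=0.0, period=1, seed=0):
            self.noise_std = noise_std
            self.period = period
            self._rng = random.Random(seed)
            self._tick = 0
            self._held = None

        def read(self, true_heading):
            # Refresh the held reading only on every `period`-th query,
            # simulating a low-rate sensor that serves stale values between updates.
            if self._held is None or self._tick % self.period == 0:
                self._held = true_heading + self._rng.gauss(0.0, self.noise_std)
            self._tick += 1
            return self._held

    def registration_error(true_heading, sensed_heading, distance=1.0):
        """Lateral displacement of a virtual artifact rendered `distance`
        units in front of the camera, caused by the heading error."""
        return distance * abs(math.sin(sensed_heading - true_heading))

    # Sweep the camera through a slow pan and compare a clean sensor
    # against a noisy, low-rate one.
    headings = [i * 0.01 for i in range(100)]
    clean = DegradedOrientationSensor(noise_std=0.0, period=1)
    noisy = DegradedOrientationSensor(noise_std=0.05, period=5, seed=42)

    err_clean = sum(registration_error(h, clean.read(h)) for h in headings) / len(headings)
    err_noisy = sum(registration_error(h, noisy.read(h)) for h in headings) / len(headings)
    ```

    Under this toy model the mean registration error grows with both the noise level and the refresh interval, which is the kind of differential impact on rendering accuracy that the user experiments quantify for PX.
    
    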