7,296 research outputs found

    Animating the evolution of software

    The use and development of open source software has increased significantly in the last decade. The high frequency of changes and releases across a distributed environment requires good project management tools in order to control the process adequately. However, even with these tools in place, the nature of the development and the fact that developers will often work on many other projects simultaneously, means that the developers are unlikely to have a clear picture of the current state of the project at any time. Furthermore, the poor documentation associated with many projects has a detrimental effect when encouraging new developers to contribute to the software. A typical version control repository contains a mine of information that is not always obvious and not easy to comprehend in its raw form. However, presenting this historical data in a suitable format by using software visualisation techniques allows the evolution of the software over a number of releases to be shown. This allows the changes that have been made to the software to be identified clearly, thus ensuring that the effect of those changes will also be emphasised. This then enables both managers and developers to gain a more detailed view of the current state of the project. The visualisation of evolving software introduces a number of new issues. This thesis investigates some of these issues in detail, and recommends a number of solutions in order to alleviate the problems that may otherwise arise. The solutions are then demonstrated in the definition of two new visualisations. These use historical data contained within version control repositories to show the evolution of the software at a number of levels of granularity. Additionally, animation is used as an integral part of both visualisations - not only to show the evolution by representing the progression of time, but also to highlight the changes that have occurred. 
Previously, the use of animation within software visualisation has been primarily restricted to small-scale, hand-generated visualisations. However, this thesis shows the viability of using animation within software visualisation with automated visualisations on a large scale. In addition, evaluation of the visualisations has shown that they are suitable for showing the changes that have occurred in the software over a period of time, and subsequently how the software has evolved. These visualisations are therefore suitable for use by developers and managers involved with open source software. They also provide a basis for future research in evolutionary visualisations, software evolution and open source development.
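The repository-mining step such visualisations depend on can be sketched briefly: turn version-control log output into per-file change counts that an animation can then replay release by release. This is a minimal, hypothetical illustration (the sample log and function names are invented), not the thesis's actual tooling.

```python
from collections import Counter

# Hypothetical sample in the shape of `git log --name-only --pretty=format:`
# output: blank-line-separated commits, each listing the files it touched.
SAMPLE_LOG = """\
src/render.c
src/render.h

src/render.c
docs/manual.txt

src/main.c
src/render.c
"""

def change_counts(log_text):
    """Count how often each file changed across all commits."""
    counts = Counter()
    for commit in log_text.strip().split("\n\n"):
        for path in commit.splitlines():
            if path:
                counts[path] += 1
    return counts

counts = change_counts(SAMPLE_LOG)
# src/render.c changed in all three commits, so it would be the
# "hottest" element in an evolution animation of this history.
```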

    Animating Virtual Human for Virtual Batik Modeling

    This research paper describes the development of an animated virtual human for a virtual batik modeling project. The objectives of the project are to animate the virtual human, to map the cloth onto the virtual human body, to present the batik cloth, and to evaluate the application in terms of realism of the virtual human's look, realism of the virtual human's movement, realism of the 3D scene, application suitability, application usability, fashion suitability and user acceptance. The final goal is to accomplish an animated virtual human for virtual batik modeling. There are three essential phases: research and analysis (data collection on modeling and animating techniques), development (modeling and animating the virtual human, mapping cloth to the body and adding music) and evaluation (evaluating realism of the virtual human's look, realism of the virtual human's movement, realism of props, application suitability, application usability, fashion suitability and user acceptance). Application usability received the highest score, at 90%, showing that the application is useful to people. In conclusion, this project has met its objectives, with realism achieved by using suitable techniques for modeling and animating.

    The Oceans Above Us: An Augmented Reality Experience

    Augmented reality holds the potential to be the new fabric of our everyday lives. Also known as AR, augmented reality is any technology that superimposes graphical information over a real-world environment, whether it be through a smartphone screen or visually projected onto the environment. Though it has existed in various forms for decades, augmented reality development is still widely considered the work of experts in technology-related fields. In November 2019, however, Adobe unveiled a new augmented reality development platform, Project Aero, along with boasts that the app’s intuitive design and integration with other Adobe programs would place AR creation into the hands of casual users with no background in augmented reality or 3D design. Almost simultaneously, the rapid spread of the COVID-19 virus resulted in travel restrictions, school and business closings, event cancellations, social distancing policies, and even mass quarantining by spring of 2020. Amidst the transition to working, studying, and socializing in physical isolation, augmented reality technology became a valuable platform for museum tours, art projects, product showcases, and much more. In The Oceans Above Us, I examine the potential of Project Aero to empower casual users to create dynamic, custom augmented reality scenes from scratch, as well as the obstacles that are currently limiting it from fully doing so. I take on the perspective of a teacher, business owner, medical professional, etc. with no augmented reality experience and attempt to create my own “virtual aquarium” animated AR experience from vision to final product, in order to test whether Project Aero is ready to become ubiquitous in the hands of professionals from across every industry.

    Procedural Modeling and Physically Based Rendering for Synthetic Data Generation in Automotive Applications

    We present an overview and evaluation of a new, systematic approach for the generation of highly realistic, annotated synthetic data for training deep neural networks in computer vision tasks. The main contribution is a procedural world modeling approach that enables high variability coupled with physically accurate image synthesis, a departure from the hand-modeled virtual worlds and approximate image synthesis methods used in real-time applications. The benefits of our approach include flexible, physically accurate and scalable image synthesis; implicit wide coverage of classes and features; and complete data introspection for annotations, all of which contribute to quality and cost efficiency. To evaluate our approach and the efficacy of the resulting data, we use semantic segmentation for autonomous vehicles and robotic navigation as the main application, and we train multiple deep learning architectures using synthetic data with and without fine-tuning on organic (i.e. real-world) data. The evaluation shows that our approach improves the neural networks' performance and that even modest implementation efforts produce state-of-the-art results.
    Comment: The project web page at http://vcl.itn.liu.se/publications/2017/TKWU17/ contains a version of the paper with high-resolution images as well as additional material.

    Visualization and Animation of a Missile/Target Encounter

    Existing missile/target encounter modeling and simulation systems focus on improving probability-of-kill models. Little research has been done to visualize these encounters. These systems can be made more useful to engineers by incorporating current computer graphics technology for visualizing and animating the encounter. Our research has been to develop a graphical simulation package for visualizing both endgame and full fly-out encounters. Endgame visualization includes showing the interaction of a missile, its fuze cone proximity sensors, and its target during the final fraction of a second of the missile/target encounter. Additionally, this system displays dynamic effects such as the warhead fragmentation pattern and the specific skewing of the fragment scattering due to missile yaw at the point of detonation. Fly-out visualization, on the other hand, involves full animation of a missile from launch to target. Animating the results of VisSim fly-out simulations provides the engineer a more efficient means of analyzing the data. This research also involves investigating fly-out animation via the World Wide Web.
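The fragment-skewing effect mentioned above comes down to vector addition: each fragment's world-frame velocity is its ejection velocity in the missile body frame plus the missile's own velocity at detonation, so a yawed missile shifts the whole fragmentation pattern sideways. The sketch below is a hypothetical 2D illustration (all names and numbers are invented, not taken from the system described).

```python
import math

def fragment_velocity(eject_speed, eject_angle_deg, missile_speed, yaw_deg):
    """Combine a fragment's ejection velocity (a 2D slice of the
    fragmentation ring) with the missile's velocity, rotated by yaw."""
    a = math.radians(eject_angle_deg)
    fx, fy = eject_speed * math.cos(a), eject_speed * math.sin(a)
    y = math.radians(yaw_deg)
    mx, my = missile_speed * math.cos(y), missile_speed * math.sin(y)
    return fx + mx, fy + my

# With zero yaw the pattern is symmetric about the flight axis;
# with 10 degrees of yaw the whole pattern is skewed off-axis.
straight = fragment_velocity(1000.0, 90.0, 600.0, 0.0)
yawed = fragment_velocity(1000.0, 90.0, 600.0, 10.0)
```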

    Experimental archeology and serious games: challenges of inhabiting virtual heritage

    Experimental archaeology has long yielded valuable insights into the tools and techniques that featured in past peoples’ relationship with the material world around them. However, experimental archaeology has, hitherto, confined itself to rigid, empirical and quantitative questions. This paper applies principles of experimental archaeology and serious gaming tools in the reconstruction of a British Iron Age roundhouse. The paper explains a number of experiments conducted to look for quantitative differences in movement in virtual vs material environments, using both a “virtual” studio reconstruction and a material reconstruction. The data from these experiments were then analysed to look for differences in movement which could be attributed to artefacts and/or environments. The paper explains the structure of the experiments, how the data were generated, what theories may make sense of the data, what conclusions have been drawn and how serious gaming tools can support the creation of new experimental heritage environments.

    Development of an online multiplayer game with augmented reality capabilities for android devices

    Final Degree Project in Video Game Design and Development (Treball final de Grau en Disseny i Desenvolupament de Videojocs). Code: VJ1241. Academic year: 2017/2018.
    This End-of-Degree project consists of the development of an online multiplayer game using Augmented Reality technology. The objective of the project was to develop a multiplayer online game that uses the device camera to place a digital environment on a flat surface, via Augmented Reality technology, in which the players play. The players do not need to be together in the same room, as the scenario is shared across all of them. The game is a “Bomberman”-like game with full network capabilities, developed using Unity3D and played on Android devices. The users play in a predefined arena in which they can move and drop bombs using a virtual joystick and a virtual button. The bombs explode after three seconds and have a certain blast radius. The objective is to eliminate the other players using the bombs. There are crates around the arena that can be destroyed by the explosions and have a chance to drop different power-ups.
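The core blast rule described (a fixed fuse, a blast radius in each cardinal direction, crates that block and are destroyed by the blast) is engine-agnostic grid arithmetic; a Unity implementation would wrap it in C# coroutines, but the logic can be sketched on its own. This is a hypothetical sketch, not the project's actual code, and the constants are illustrative.

```python
FUSE_SECONDS = 3.0   # bombs explode three seconds after being dropped
BLAST_RADIUS = 2     # tiles reached in each of the four cardinal directions

def blast_tiles(bomb_pos, crates, width, height):
    """Return the tiles covered by a blast and the crates it destroys.

    The blast spreads in the four cardinal directions and stops at the
    first crate it hits in each direction (that crate is destroyed)."""
    bx, by = bomb_pos
    hit, destroyed = {bomb_pos}, set()
    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        for step in range(1, BLAST_RADIUS + 1):
            x, y = bx + dx * step, by + dy * step
            if not (0 <= x < width and 0 <= y < height):
                break  # blast stops at the arena edge
            hit.add((x, y))
            if (x, y) in crates:
                destroyed.add((x, y))
                break  # crates absorb the rest of the blast in this direction
    return hit, destroyed

# A bomb at (2, 2) in a 5x5 arena with crates at (3, 2) and (2, 0):
hit, destroyed = blast_tiles((2, 2), crates={(3, 2), (2, 0)}, width=5, height=5)
```

A destroyed crate would then roll a chance to spawn a power-up on its tile.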

    A study on virtual reality and developing the experience in a gaming simulation

    A thesis submitted to the University of Bedfordshire in partial fulfilment of the requirements for the degree of Masters by Research.
    Virtual Reality (VR) is an experience in which a person is given the freedom to view and move through a virtual world [1]. The experience is not constrained to limited controls; instead, it is triggered interactively by the user’s physical movement [1] [2], so the user feels as if they are seeing the real world. 3D technologies also allow the viewer to experience the volume of an object and its perspective in the virtual world [1]: the human brain generates depth when each eye receives an image from its own point of view. While learning and developing the project using the university’s facilities, some of the core parts of the research were accomplished, such as designing the VR motion controller and the VR HMD (Head-Mounted Display) using an open-source microcontroller. The VR HMD together with the VR controller gives an immersive feel and a complete VR system [2]. The motive was to demonstrate a working model that creates a VR experience on a mobile platform. In particular, the VR system uses a micro-electro-mechanical system (MEMS) to track motion without a tracking camera. A VR experience has also been developed in a gaming simulation. To produce this, Maya, Unity, the Motion Analysis System, MotionBuilder, Arduino and programming have been used. Lessons and code taken or improvised from [33] [44] [25] and [45] have been studied and implemented.
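Camera-free MEMS motion tracking of the kind described is commonly done with a complementary filter: integrate the gyroscope for fast response, then blend in the accelerometer's gravity-derived angle to cancel the gyro's drift. The sketch below is a generic one-axis illustration of that technique (the function, constants, and simulated bias are assumptions, not the thesis's implementation).

```python
def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """One update step: trust the integrated gyro rate short-term,
    the accelerometer's gravity-referenced angle long-term."""
    return alpha * (angle + gyro_rate * dt) + (1.0 - alpha) * accel_angle

# Simulate a stationary headset whose gyro reads a constant bias of
# 1 deg/s: pure integration drifts without bound, while the filter
# stays bounded because the accelerometer keeps reporting 0 degrees.
dt, bias = 0.01, 1.0
drift_only, filtered = 0.0, 0.0
for _ in range(1000):  # 10 seconds of 100 Hz samples
    drift_only += bias * dt
    filtered = complementary_filter(filtered, bias, 0.0, dt)
```

After 10 simulated seconds the naive integration has drifted to 10 degrees, while the filtered angle settles near its small steady-state offset instead of growing.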