7 research outputs found

    CoVR: A Large-Scale Force-Feedback Robotic Interface for Non-Deterministic Scenarios in VR

    We present CoVR, a novel robotic interface providing strong kinesthetic feedback (100 N) in a room-scale VR arena. It consists of a physical column mounted on a 2D Cartesian ceiling robot (XY displacements) capable of (1) resisting body-scaled user actions such as pushing or leaning, (2) acting on users by pulling or transporting them, and (3) carrying multiple, potentially heavy objects (up to 80 kg) that users can freely manipulate or cause to interact with one another. We describe its implementation and define a trajectory generation algorithm based on a novel user intention model to support non-deterministic scenarios, in which users are free to interact with any virtual object of interest regardless of the scenario's progress. A technical evaluation and a user study demonstrate the feasibility and usability of CoVR, as well as the relevance of whole-body interactions involving strong forces, such as being pulled or transported.
    Comment: 10 pages (without references), 14 pages total
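
    The paper's trajectory generation algorithm is not reproduced here, but the idea of driving the column from a user intention model can be sketched as follows: score each candidate virtual object by how well the user's current heading points at it and how near it is, then move the ceiling robot one bounded step toward the most likely target. The scoring weights, function names, and 2D geometry below are hypothetical illustrations, not taken from the paper.

        import numpy as np

        def most_likely_target(user_pos, user_dir, objects):
            """Rank candidate objects with a simple, hypothetical intention score:
            prefer objects the user is heading toward and that are nearby."""
            best, best_score = None, -np.inf
            for name, obj_pos in objects.items():
                to_obj = np.asarray(obj_pos, dtype=float) - user_pos
                dist = np.linalg.norm(to_obj)
                if dist < 1e-6:
                    continue
                alignment = float(np.dot(user_dir, to_obj / dist))  # cosine of heading error
                score = alignment - 0.1 * dist                      # hypothetical weighting
                if score > best_score:
                    best, best_score = name, score
            return best

        def next_column_waypoint(column_xy, target_xy, max_step=0.05):
            """Advance the ceiling robot's XY position one bounded step toward the target."""
            delta = np.asarray(target_xy, dtype=float) - column_xy
            dist = np.linalg.norm(delta)
            if dist <= max_step:
                return np.asarray(target_xy, dtype=float)
            return column_xy + delta / dist * max_step

        # Example: a user at the origin walking along +x is most likely heading for the table.
        target = most_likely_target(np.zeros(2), np.array([1.0, 0.0]),
                                    {"table": (2.0, 0.2), "door": (0.0, 3.0)})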

    A Visual Approach to Construction Cost Estimating

    Construction cost estimating is considered one of the most important and critical phases of a construction project. Preparing reliable and accurate estimates to help decision makers is the most challenging assignment that estimators face. An estimate is not only necessary for proposal preparation but also for several project management functions. Despite its importance, estimating has remained a very time-consuming process. The most inefficient part of construction cost estimating is determining the amount of resources needed for the construction of a project, also known as quantity takeoff. Quantity takeoff is a long and error-prone process performed manually by estimators; missing or duplicated work items are among the errors that can occur. Parametric CAD software has recently attained widespread attention in the Architectural, Engineering, and Construction (AEC) industry. It supports the development and use of computer-generated models to simulate the planning, design, construction, and operation of a facility, helping architects, engineers, and contractors visualize what is to be built in a simulated environment and identify potential design, construction, or operational problems. A model created with parametric CAD software can significantly increase estimator productivity by substantially reducing the manual work required for quantity takeoffs. This study presents a methodology that uses parametric CAD software and visualization technologies to streamline the estimating process. Although this methodology won't fully automate estimating, it helps in the following areas: (1) providing a navigable 3D model of the project, (2) simplifying the quantity takeoff process, and (3) eliminating manual calculations and searches for data. The study uses visualization technologies to navigate through a 3D CAD model, giving the estimator a tool to better understand the location of and relationships between elements in a model. The quantity takeoff process is simplified by using property and geometry information extracted from the 3D CAD model. The study also uses database technology to store labor, equipment, and material cost data, which eliminates manual calculations and enables an estimator to search for stored data. A case study is presented to illustrate the process and capabilities of the developed system.
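
    As an illustration of the kind of workflow the abstract describes, the sketch below aggregates element quantities as they might be exported from a parametric CAD model and prices them against a unit-cost database. The element list, assembly codes, and rates are hypothetical placeholders; the study's actual data model and software are not reproduced here.

        import sqlite3

        # Hypothetical element list as it might be exported from a parametric CAD model:
        # each element carries an assembly code and the quantity taken off the model geometry.
        elements = [
            {"assembly": "A1010", "desc": "Strip footing", "quantity": 42.0, "unit": "m3"},
            {"assembly": "B2010", "desc": "Exterior wall", "quantity": 310.5, "unit": "m2"},
            {"assembly": "B2010", "desc": "Exterior wall", "quantity": 128.0, "unit": "m2"},
        ]

        # Hypothetical cost database with material, labor, and equipment unit rates.
        db = sqlite3.connect(":memory:")
        db.execute("CREATE TABLE unit_cost (assembly TEXT PRIMARY KEY,"
                   " material REAL, labor REAL, equipment REAL)")
        db.executemany("INSERT INTO unit_cost VALUES (?, ?, ?, ?)",
                       [("A1010", 95.0, 40.0, 12.0), ("B2010", 55.0, 30.0, 5.0)])

        # Quantity takeoff: aggregate quantities per assembly, then price them.
        takeoff = {}
        for e in elements:
            takeoff[e["assembly"]] = takeoff.get(e["assembly"], 0.0) + e["quantity"]

        total = 0.0
        for assembly, qty in takeoff.items():
            material, labor, equipment = db.execute(
                "SELECT material, labor, equipment FROM unit_cost WHERE assembly = ?",
                (assembly,)).fetchone()
            line_total = qty * (material + labor + equipment)
            total += line_total
            print(f"{assembly}: {qty:.1f} x {material + labor + equipment:.2f} = {line_total:.2f}")
        print(f"Estimate total: {total:.2f}")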

    Three-dimensional interactive maps: theory and practice


    An integrated software approach to interactive exploration and steering of fluid flow simulations on many-core architectures

    Traditionally, computational fluid dynamics is carried out as a cyclic sequence of independent steps, yet scientists and engineers have long wished to interact closely with their running simulations. Since the influential 1987 report of the US National Science Foundation, new forms of scientific visualization have evolved that differ substantially from traditional post-processing; in particular, the approach commonly referred to as computational steering has attracted widespread interest. Although it is a very powerful paradigm, computational steering remains the exception rather than the rule, largely because of the complexity and restrictions of traditional HPC systems. As an alternative to the traditional massively parallel approach, this thesis uses the parallel computational power of GPUs for general-purpose computation. So-called GPGPU computing has gained great popularity in the CFD community, especially for its application to the lattice Boltzmann method. Using this technology, the work demonstrates a single desktop application that integrates a complete interactive CFD simulation environment at reasonable hardware cost. It shows that converging massively parallel computational power and a steering environment into a single system significantly improves usability, application quality, and user-friendliness. Using multiple GPUs, the efficiency of this approach allows three-dimensional CFD simulations of practically relevant size to evolve close to real time, while the simulation can be explored and adjusted at runtime. The thesis also shows that responsiveness benefits significantly from avoiding the bandwidth and latency bottlenecks inherent in traditional HPC approaches; these can be avoided because GPGPU computing does not generally require network communication, which also reduces the complexity of the application.
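
    The thesis's GPU implementation is not reproduced here, but the computational-steering idea of keeping simulation, interaction, and visualization in one application loop can be sketched on the CPU with a minimal D2Q9 lattice Boltzmann step. The grid size, the steering queue, and the use of NumPy instead of CUDA are simplifications for illustration only.

        import queue
        import numpy as np

        # D2Q9 lattice velocities and weights
        c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
                      [1, 1], [-1, 1], [-1, -1], [1, -1]])
        w = np.array([4/9] + [1/9] * 4 + [1/36] * 4)

        def equilibrium(rho, ux, uy):
            cu = 3 * (c[:, 0, None, None] * ux + c[:, 1, None, None] * uy)
            usq = 1.5 * (ux**2 + uy**2)
            return rho * w[:, None, None] * (1 + cu + 0.5 * cu**2 - usq)

        def lbm_step(f, omega):
            """One BGK collision-and-streaming step on a periodic grid."""
            rho = f.sum(axis=0)
            ux = (f * c[:, 0, None, None]).sum(axis=0) / rho
            uy = (f * c[:, 1, None, None]).sum(axis=0) / rho
            f = f + omega * (equilibrium(rho, ux, uy) - f)      # collision
            for i in range(9):                                  # streaming
                f[i] = np.roll(np.roll(f[i], c[i, 0], axis=1), c[i, 1], axis=0)
            return f, ux, uy

        # Steering channel: a GUI or VR front end would push parameter changes here.
        steer = queue.Queue()
        nx, ny, omega = 128, 64, 1.0
        f = equilibrium(np.ones((ny, nx)), 0.05 * np.ones((ny, nx)), np.zeros((ny, nx)))

        for t in range(1000):
            try:
                name, value = steer.get_nowait()                # apply pending steering commands
                if name == "omega":
                    omega = value
            except queue.Empty:
                pass
            f, ux, uy = lbm_step(f, omega)
            # In an interactive application, (ux, uy) would be rendered each iteration here.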

    Direct Manipulation in Virtual Reality

    Virtual reality interfaces offer several advantages for scientific visualization, such as the ability to perceive three-dimensional data structures in a natural way. The focus of this chapter is direct manipulation: the ability of a user in virtual reality to control objects in the virtual environment in a direct and natural way, much as objects are manipulated in the real world. Direct manipulation provides many advantages for the exploration of complex, multi-dimensional data sets by allowing the investigator to explore the data environment intuitively. Because direct manipulation is essentially a control interface, it is better suited to the exploration and analysis of a data set than to the publishing or communication of features found in that data set. Direct manipulation is therefore most relevant to the analysis of complex data that fills a volume of three-dimensional space, such as a fluid flow data set. It allows the intuitive exploration of that data, which facilitates the discovery of features that would be difficult to find using more conventional visualization methods. Using a direct manipulation interface in virtual reality, an investigator can, for example, move a data probe about in space, watching the results and getting a sense of how the data varies within its spatial volume.
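
    A concrete reading of the data-probe example: each frame, the probe follows the tracked controller, and the flow field is sampled at the probe's position, for instance by trilinear interpolation over the gridded data. The function below is a hypothetical sketch of that sampling step; the controller and rendering calls in the closing comments stand in for whatever VR toolkit is actually used.

        import numpy as np

        def sample_vector_field(field, pos, origin, spacing):
            """Trilinearly sample a gridded vector field (nx, ny, nz, 3) at a world-space point."""
            g = (np.asarray(pos, dtype=float) - origin) / spacing      # continuous grid coords
            g = np.clip(g, 0, np.array(field.shape[:3]) - 1.001)       # stay inside the grid
            i0 = np.floor(g).astype(int)
            t = g - i0
            out = np.zeros(3)
            for dx in (0, 1):
                for dy in (0, 1):
                    for dz in (0, 1):
                        wgt = ((1 - t[0]) if dx == 0 else t[0]) * \
                              ((1 - t[1]) if dy == 0 else t[1]) * \
                              ((1 - t[2]) if dz == 0 else t[2])
                        out += wgt * field[i0[0] + dx, i0[1] + dy, i0[2] + dz]
            return out

        # Per-frame probe update (the VR calls below are hypothetical placeholders):
        #   probe_pos = controller.pose.position
        #   velocity = sample_vector_field(flow, probe_pos, origin=(0, 0, 0), spacing=0.1)
        #   draw_arrow(probe_pos, velocity)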