A virtual environment for the design and simulated construction of prefabricated buildings
The construction industry has acknowledged that its current working practices need substantial improvements in quality and efficiency, and has identified that computer modelling techniques and the use of prefabricated components can help reduce times and costs and minimise the defects and problems of on-site construction. This paper describes a virtual environment to support the design and construction of buildings from prefabricated components and the simulation of their construction sequence according to a project schedule. The design environment can import a library of 3-D models of prefabricated modules that can be used to interactively design a building. Using Microsoft Project, the construction schedule of the designed building can be altered, with this information feeding back to the construction simulation environment, within which the order of construction can be visualised using virtual machines. Novel aspects of the system are that it provides a single 3-D environment where users can construct their design with minimal interaction, through automatic constraint recognition, and view a real-time simulation of the construction process within the same environment. This takes the area a step forward from other systems, which only allow the planner to view the construction at certain stages and do not provide an animated view of the construction process.
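The "automatic constraint recognition" mentioned above can be illustrated with a minimal sketch (the connector model, names, and snap tolerance below are assumptions for illustration, not the paper's implementation): each module exposes connector points, and when a dragged module's connector comes close enough to a placed module's connector, the system infers the constraint and snaps the module into place.

```python
from dataclasses import dataclass

SNAP_TOLERANCE = 0.25  # metres; an assumed tolerance for illustration

@dataclass
class Module:
    name: str
    position: tuple   # (x, y, z) of the module origin
    connectors: list  # connector offsets relative to the origin

def world_connectors(m):
    px, py, pz = m.position
    return [(px + cx, py + cy, pz + cz) for cx, cy, cz in m.connectors]

def try_snap(dragged, placed):
    """Return the dragged module's snapped position, or None if no
    connector pair is within tolerance (i.e. no constraint recognised)."""
    for dc in world_connectors(dragged):
        for pc in world_connectors(placed):
            dist = sum((a - b) ** 2 for a, b in zip(dc, pc)) ** 0.5
            if dist <= SNAP_TOLERANCE:
                # Translate dragged so its connector coincides exactly
                # with the placed module's connector
                dx, dy, dz = (p - d for p, d in zip(pc, dc))
                x, y, z = dragged.position
                return (x + dx, y + dy, z + dz)
    return None

# A placed wall with a connector at its far end, and a new wall dropped
# slightly off that connector: the new wall snaps onto the end point.
wall = Module("wall_A", (0.0, 0.0, 0.0), [(3.0, 0.0, 0.0)])
new_wall = Module("wall_B", (3.1, 0.1, 0.0), [(0.0, 0.0, 0.0)])
print(try_snap(new_wall, wall))
```

The key design point is that the user never names the constraint: proximity alone triggers recognition, which is what keeps the interaction minimal.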
MoSculp: Interactive Visualization of Shape and Time
We present a system that allows users to visualize complex human motion via
3D motion sculptures---a representation that conveys the 3D structure swept by
a human body as it moves through space. Given an input video, our system
computes the motion sculpture and provides a user interface for rendering it
in different styles, including the options to insert the sculpture back into
the original video, render it in a synthetic scene or physically print it.
To provide this end-to-end workflow, we introduce an algorithm that estimates
the human's 3D geometry over time from a set of 2D images and develop a
3D-aware image-based rendering approach that embeds the sculpture back into the
scene. By automating the process, our system takes motion sculpture creation
out of the realm of professional artists, and makes it applicable to a wide
range of existing video material.
By providing viewers with 3D information, motion sculptures reveal space-time
motion information that is difficult to perceive with the naked eye, and allow
viewers to interpret how different parts of the object interact over time. We
validate the effectiveness of this approach with user studies, finding that our
motion sculpture visualizations are significantly more informative about motion
than existing stroboscopic and space-time visualization methods.

Comment: UIST 2018. Project page: http://mosculp.csail.mit.edu
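The swept-shape idea behind motion sculptures can be shown with a toy 2D analogue (a sketch under assumptions, not the paper's actual pipeline, which works on estimated 3D body geometry): given a binary person mask per video frame, the sculpture footprint is the union of the shapes swept over time, with each pixel labelled by the first frame that covered it, so time can be encoded as colour.

```python
import numpy as np

def sweep_masks(masks):
    """masks: (T, H, W) boolean array of per-frame silhouettes.
    Returns (swept, first_hit): the union silhouette and, per pixel,
    the index of the first frame covering it (-1 if never covered)."""
    masks = np.asarray(masks, dtype=bool)
    swept = masks.any(axis=0)  # union of all silhouettes over time
    # argmax over a boolean axis gives the first True index
    first_hit = np.where(swept, masks.argmax(axis=0), -1)
    return swept, first_hit

# Toy example: a 1-pixel "body" moving left to right across 3 frames
T, H, W = 3, 1, 4
masks = np.zeros((T, H, W), dtype=bool)
for t in range(T):
    masks[t, 0, t] = True

swept, first_hit = sweep_masks(masks)
print(swept.astype(int))  # [[1 1 1 0]]
print(first_hit)          # [[ 0  1  2 -1]]
```

This is the 2D analogue of why the sculpture reveals interactions over time: the `first_hit` labels make it visible which part of the shape was occupied when.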
Towards Rapid Generation and Visualisation of Large 3D Urban Landscapes for Mobile Device Navigation
In this paper a procedural 3D modelling solution for mobile devices is presented, based on scripting algorithms that allow both automatic and semi-automatic creation of photorealistic-quality virtual urban content. The combination of aerial images, GIS data, 2D ground maps and terrestrial photographs as input data, coupled with a user-friendly customised interface, permits the automatic and interactive generation of large-scale, accurate, georeferenced and fully textured 3D virtual city content that can be optimised for use on mobile devices and with navigational tasks in mind. Furthermore, a user-centred mobile virtual reality (VR) visualisation and interaction tool operating on PDAs (Personal Digital Assistants) for pedestrian navigation is also discussed. This engine supports the import and display of various navigational file formats (2D and 3D) and includes a comprehensive, user-friendly graphical front end providing immersive virtual 3D navigation.
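A core step in generating city geometry from 2D ground maps is footprint extrusion, sketched below under assumptions (function name and mesh layout are illustrative, not the paper's code): a building footprint polygon plus a height attribute yields a prism whose wall quads can then be textured from terrestrial photographs.

```python
def extrude_footprint(footprint, height):
    """footprint: list of (x, y) vertices, counter-clockwise.
    Returns (vertices, walls): 3D vertices (base ring then top ring)
    and one quad of vertex indices per footprint edge."""
    n = len(footprint)
    base = [(x, y, 0.0) for x, y in footprint]
    top = [(x, y, height) for x, y in footprint]
    vertices = base + top
    # Each wall quad joins a base edge to the corresponding top edge:
    # base_i -> base_j -> top_j -> top_i
    walls = [(i, (i + 1) % n, (i + 1) % n + n, i + n) for i in range(n)]
    return vertices, walls

# A 10 m x 6 m rectangular footprint extruded to 12 m
verts, walls = extrude_footprint([(0, 0), (10, 0), (10, 6), (0, 6)], 12.0)
print(len(verts), len(walls))  # 8 vertices, 4 wall quads
```

Scripting this per-parcel over a whole ground map is what makes generation of large urban areas "automatic", with semi-automatic refinement reserved for landmark buildings.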
Immersive and non-immersive 3D virtual city: decision support tool for urban sustainability
Sustainable urban planning decisions must consider not only the physical structure of the urban development but also economic, social and environmental factors. Due to the prolonged timescales of major urban development projects, the current and future impacts of any decision must be fully understood. Many key project decisions are made early in the decision-making process, with decision makers later seeking agreement for proposals once the key decisions have already been made, leaving many stakeholders, especially the general public, feeling marginalised. Many decision support tools have been developed to aid the decision-making process; however, many of these are expert-orientated, fail to fully address spatial and temporal issues, and do not reflect the interconnectivity of the separate domains and their indicators. This paper outlines a platform that combines computer game techniques with modelling of economic, social and environmental indicators to provide an interface presenting a 3D interactive virtual city with sustainability information overlain. Creating a virtual 3D urban area using the latest video game techniques ensures: real-time rendering of the 3D graphics; exploitation of novel techniques for presenting complex multivariate data to the user; and immersion in the 3D urban development via first-person navigation, exploration and manipulation of the environment, with consequences updated in real time. The use of visualisation techniques begins to remove sustainability assessment's reliance on existing expert systems, which are largely inaccessible to many stakeholder groups, especially the general public.
Using high resolution displays for high resolution cardiac data
The ability to perform fast, accurate, high resolution visualization is fundamental
to improving our understanding of anatomical data. As the volumes of data
increase from improvements in scanning technology, the methods applied to rendering
and visualization must evolve. In this paper we address the interactive display of
data from high resolution MRI scanning of a rabbit heart and subsequent histological
imaging. We describe a visualization environment involving a tiled LCD panel
display wall and associated software which provide an interactive and intuitive user
interface.
The oView software is an OpenGL application which is written for the VRJuggler
environment. This environment abstracts displays and devices away from the
application itself, aiding portability between different systems, from desktop PCs to
multi-tiled display walls. Portability between display walls has been demonstrated
through its use on walls at both Leeds and Oxford Universities. We discuss important
factors to be considered for interactive 2D display of large 3D datasets,
including the use of intuitive input devices and level-of-detail aspects.
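The level-of-detail aspect can be sketched with a common selection heuristic (an illustrative assumption, not oView's actual scheme): pick a mesh resolution from the object's projected size on screen, so distant portions of a large dataset render cheaply while nearby ones keep full detail.

```python
import math

def pick_lod(object_radius, distance, fov_y_deg, viewport_height_px,
             thresholds=(400, 100, 25)):
    """Return an LOD index: 0 (finest) .. len(thresholds) (coarsest).
    thresholds are projected sizes in pixels (illustrative values)."""
    # Angular size of the object, then its projected height in pixels
    angular = 2.0 * math.atan2(object_radius, distance)
    pixels = angular / math.radians(fov_y_deg) * viewport_height_px
    for lod, limit in enumerate(thresholds):
        if pixels >= limit:
            return lod
    return len(thresholds)

print(pick_lod(1.0, 2.0, 60.0, 1080))    # nearby object: finest LOD
print(pick_lod(1.0, 200.0, 60.0, 1080))  # distant object: coarsest LOD
```

On a tiled display wall the viewport height is very large, which pushes the projected sizes up and keeps more of the dataset at full resolution than on a desktop PC, hence the paper's pairing of high-resolution data with high-resolution displays.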