1,670 research outputs found
A novel visualisation paradigm for three-dimensional map-based mobile services
Internship carried out at NDrive Navigation Systems, S. A. Integrated master's thesis. Informatics and Computing Engineering. Faculty of Engineering, University of Porto. 200
Investigating rendering speed and download rate of three-dimensional (3D) mobile maps intended for navigation aid using a genetic algorithm
Prior studies have shown that rendering a 3D map dataset on a mobile device over a wireless network depends on the download speed, and crucially on the device's computing resources. It has now become possible, over a wireless network, to render large and detailed 3D city maps on mobile devices at interactive rates of over 30 frames per second (fps). The information in a 3D map is generally limited and lacks interaction when it is not rendered at an interactive rate; on the other hand, with a high download rate a 3D map can produce a realistic scene for navigation aid. Unfortunately, most mobile navigation aids that use a 3D map over a wireless network cannot serve the needs of interaction because they suffer from low rendering speed. This paper investigates the trade-off between rendering speed and download rate of 3D mobile maps using a genetic algorithm (GA). A GA was chosen because it can search a larger problem space than other optimisation algorithms, which suits establishing fast on-the-fly 3D map rendering on mobile devices. Regardless of the mobile device's computing resources, our findings from the GA suggest that download rate and rendering speed are mutually exclusive. Thus, manipulated static aerial photo-realistic images, rather than a 3D map, are well suited for navigation aid
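The rendering-speed/download-rate trade-off the abstract describes can be illustrated with a minimal GA sketch. Everything here is hypothetical, not taken from the paper: the fitness function is an invented model in which higher scene detail improves realism but lowers frame rate and raises download time, and the GA searches for the detail level that balances the two.

```python
import random

random.seed(42)  # deterministic run for the sketch

def fitness(detail):
    """detail in [0, 1]: hypothetical model, higher detail means more
    realism but a lower frame rate and a longer download."""
    realism = detail
    fps = 60 * (1 - detail)          # rendering speed falls with detail
    download_s = 1 + 9 * detail      # seconds to fetch the map tile
    # Reward realism and interactive rates (>= 30 fps); penalize slow downloads.
    return realism + min(fps / 30, 1.0) - download_s / 10

def genetic_search(pop_size=20, generations=60):
    pop = [random.random() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]               # selection: keep top half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = (a + b) / 2                      # crossover: averaging
            child += random.gauss(0, 0.05)           # mutation: small jitter
            children.append(min(max(child, 0.0), 1.0))
        pop = parents + children
    return max(pop, key=fitness)

best = genetic_search()
print(f"best detail level: {best:.2f}")
```

Under this toy model the optimum sits at detail 0.5, the highest detail that still sustains 30 fps, which mirrors the paper's point that rendering speed and download rate pull in opposite directions.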
Visualisation of a three-dimensional (3D) object’s optimal reality in a 3D map on a mobile device
Prior research on the subject of visualisation of three-dimensional (3D) objects by coordinate systems has established that all objects are translated so that the eye is at the origin (eye space). Multiplying a point in eye space leads to perspective space, and the perspective divide leads to screen space. This paper utilised these findings and investigated the key factor(s) in the visualisation of 3D objects within 3D maps on mobile devices. The motivation of the study comes from the fact that there is a disparity between 3D objects within a 3D map on a mobile device and those on other devices; this difference might undermine the capabilities of a 3D map view on a mobile device. This concern arises while interacting with a 3D map view on a mobile device. It is unclear whether an increasing number of users will be able to identify the real world as the 3D map view on a mobile device becomes more realistic. We used regression analysis to rigorously explain the participants' responses, and the Decision Making Trial and Evaluation Laboratory (DEMATEL) method to select the key factor(s) that caused or were affected by 3D object views. The regression analyses revealed that eye space, perspective space and screen space were associated with 3D viewing of 3D objects in 3D maps on mobile devices, and that eye space had the strongest impact. The results of DEMATEL, using both its original and revised steps, showed that prolonged viewing of 3D objects in a 3D map on mobile devices was the most important factor for eye space, a long viewing distance was the most significant factor for perspective space, and a large screen size was the most important factor for screen space. In conclusion, a 3D map view on a mobile device allows for the visualisation of a more realistic environment
An interactive remote visualization system for mobile application access
This paper introduces a remote visualization approach that enables the visualization of data sets on mobile devices or in web environments. With this approach, the necessary computing power can be outsourced to a server environment. The developed system allows the rendering of 2D and 3D graphics on mobile phones or in web browsers at high quality, independent of the size of the original data set. Compared to known terminal-server or other proprietary remote systems, our approach offers a very simple way to integrate with a large variety of applications, which makes it useful for real-life application scenarios in business processes
Spartan Daily, May 2, 2019
Volume 152, Issue 40
https://scholarworks.sjsu.edu/spartan_daily_2019/1039/thumbnail.jp
Designing for Mixed Reality Urban Exploration
This paper introduces a design framework for mixed reality urban exploration (MRUE), based on a concrete implementation in a historical city. The framework integrates different modalities, such as virtual reality (VR), augmented reality (AR), and haptics-audio interfaces, as well as advanced features such as personalized recommendations, social exploration, and itinerary management. It addresses a number of concerns regarding information overload, safety, and quality of the experience, which are not sufficiently tackled in traditional non-integrated approaches. This study presents an integrated mobile platform built on top of this framework and reflects on the lessons learned. Peer reviewed
Multi-user navigation: a 3D mobile device interactive support
Multi-user navigation within an environment, with the aid of 3D mobile support, gives end users additional mobility and improves the efficiency of mobility services. A common approach to using a mobile device for navigation aid is to display only a section of the view and to let users control the portion shown by conceptually moving along the orientation. Multiple users navigating an environment with the aid of 3D mobile device support need to be able to interact with one another, in order to meet up for an appointment or to be aware of each other's locations. Unfortunately, predominant 3D mobile navigation systems do not provide multi-user interactive services: users cannot be aware of other users navigating the same environment with the same system on their mobile devices at the same time. This paper presents a multi-user 3D mobile navigation system that provides multi-user awareness. The analysis of the results provides a unique visualisation of multiple users using mobile devices to navigate to a target location while being aware of each other's whereabouts
Mobile three-dimensional city maps
Maps are visual representations of environments and the objects within, depicting their spatial relations. They are mainly used in navigation, where they act as external information sources, supporting observation and decision making processes. Map design, or the art-science of cartography, has led to simplification of the environment, where the naturally three-dimensional environment has been abstracted to a two-dimensional representation, populated with simple geometrical shapes and symbols. However, abstract representation requires a map reading ability.
Modern technology has reached the level where maps can be expressed in digital form, having selectable, scalable, browsable and updatable content. Maps may no longer even be limited to two dimensions, nor to an abstract form. When a real-world-based virtual environment is created, a 3D map is born. Given a realistic representation, would the user no longer need to interpret the map, and be able to navigate in an inherently intuitive manner? To answer this question, one needs a mobile test platform. But can a 3D map, a resource-hungry real virtual environment, exist on such resource-limited devices?
This dissertation approaches the technical challenges posed by mobile 3D maps in a constructive manner, identifying the problems, developing solutions and providing answers by creating a functional system. The case focuses on urban environments. First, optimization methods for rendering large, static 3D city models are researched and a solution provided by combining visibility culling, level-of-detail management and out-of-core rendering, suited for mobile 3D maps. Then, the potential of mobile networking is addressed, developing efficient and scalable methods for progressive content downloading and dynamic entity management. Finally, a 3D navigation interface is developed for mobile devices, and the research validated with measurements and field experiments.
It is found that near realistic mobile 3D city maps can exist in current mobile phones, and the rendering rates are excellent in 3D hardware enabled devices. Such 3D maps can also be transferred and rendered on-the-fly sufficiently fast for navigation use over cellular networks. Real world entities such as pedestrians or public transportation can be tracked and presented in a scalable manner. Mobile 3D maps are useful for navigation, but their usability depends highly on interaction methods - the potentially intuitive representation does not imply, for example, faster navigation than with a professional 2D street map. In addition, the physical interface limits the usability
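The optimization combination the dissertation names, visibility culling plus level-of-detail (LOD) management, can be sketched per frame as below. The thresholds, field of view, and building data are hypothetical illustrations, not values from the thesis (which also adds out-of-core rendering, omitted here).

```python
import math

def select_lod(distance, thresholds=(50.0, 200.0)):
    """LOD management: 0 = full mesh, 1 = simplified mesh, 2 = impostor."""
    for lod, limit in enumerate(thresholds):
        if distance < limit:
            return lod
    return len(thresholds)

def visible(camera, forward, pos, fov_deg=90.0):
    """Coarse visibility culling: is pos within the horizontal field of view?"""
    dx, dy = pos[0] - camera[0], pos[1] - camera[1]
    angle = math.degrees(abs(math.atan2(dy, dx) - math.atan2(forward[1], forward[0])))
    return min(angle, 360.0 - angle) <= fov_deg / 2

# Hypothetical city content: cull what the camera cannot see, then pick a
# detail level for what remains, so the mobile GPU only renders what matters.
buildings = {"tower": (10.0, 0.0), "museum": (120.0, 30.0), "stadium": (-10.0, 0.0)}
cam, fwd = (0.0, 0.0), (1.0, 0.0)
for name, pos in buildings.items():
    if visible(cam, fwd, pos):
        d = math.hypot(pos[0] - cam[0], pos[1] - cam[1])
        print(name, "LOD", select_lod(d))   # tower LOD 0, museum LOD 1
    # stadium is behind the camera and is culled entirely
```

The point of the combination is multiplicative savings: culling removes whole objects from the frame, and LOD shrinks the triangle count of those that survive, which is what makes interactive rates reachable on phone-class hardware.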
A wearable multimodal interface for exploring urban points of interest
Locating points of interest (POIs) in cities is typically facilitated by visual aids such as paper maps, brochures, and mobile applications. However, these techniques require visual attention, which ideally should be on the surroundings. Non-visual techniques for navigating towards specific POIs typically lack support for free exploration of the city or more detailed guidance. To overcome these issues, we propose a multimodal, wearable system for alerting the user to nearby recommended POIs. The system, built around a tactile glove, provides audio-tactile cues when a new POI is in the vicinity, and more detailed information and guidance if the user expresses interest in this POI. We evaluated the system in a field study, comparing it to a visual baseline application. The encouraging results show that the glove-based system helps keep the attention on the surroundings and that its performance is on the same level as that of the baseline