Mobile, collaborative augmented reality using cloudlets
The evolution of mobile applications towards advanced interactivity and demanding multimedia features is still ongoing. Novel application concepts such as mobile Augmented Reality (AR) are, however, hindered by the inherently limited resources available on mobile platforms, notwithstanding the dramatic performance increases of mobile hardware. Offloading resource-intensive application components to the cloud, also known as "cyber foraging", has proven to be a valuable solution in a variety of scenarios. The concept is also highly promising for collaborative scenarios, in which data and its processing are shared between multiple users. In this paper, we investigate the challenges posed by offloading collaborative mobile applications. We present a middleware platform capable of autonomously deploying software components to minimize average CPU load while guaranteeing smooth collaboration. As a use case, we present and evaluate a collaborative AR application offering interaction between users, the physical environment, and the virtual objects superimposed on this physical environment.
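The placement decision described in the abstract can be illustrated with a small greedy heuristic: offload the most CPU-intensive components to the cloudlet as long as the accumulated network latency stays within an interactivity budget. This is a hypothetical sketch; the component names, loads, and latency figures are assumptions, not the paper's actual middleware.

```python
# Hypothetical sketch of a cyber-foraging placement heuristic: offload
# heavy components first, subject to a latency budget that stands in for
# the "smooth collaboration" constraint. All values are illustrative.

def place_components(components, latency_budget_ms):
    """components: list of (name, cpu_load, offload_latency_ms) tuples."""
    placement = {}
    used_latency = 0.0
    # Consider the most CPU-intensive components first.
    for name, cpu_load, latency in sorted(
            components, key=lambda c: c[1], reverse=True):
        if used_latency + latency <= latency_budget_ms:
            placement[name] = "cloudlet"   # move this component off-device
            used_latency += latency
        else:
            placement[name] = "local"      # keep it on the mobile device
    return placement

demo = [("tracker", 0.6, 30.0), ("renderer", 0.3, 25.0), ("ui", 0.1, 5.0)]
print(place_components(demo, latency_budget_ms=50.0))
```

A real middleware would of course re-evaluate placement continuously as load and network conditions change; the greedy pass above only shows the shape of the trade-off.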
Promising Beginning? Evaluating Museum Mobile Phone Apps
Since 2009 museums have started introducing mobile apps in their range of interpretative media and visitor services. As mobile technology continues to develop and permeate all aspects of our lives, and the capabilities of smartphones increase while they become more accessible and popular, new possibilities arise for cultural institutions to exploit these tools to communicate in new ways and to promote their exhibitions and programmes. Mobile apps open up new channels of communication between the cultural institution and the user, which extend into his or her personal space and go beyond the boundaries of the museum's walls. The paper presents a survey of mobile apps designed by art or cultural-historical museums and analyses the wider issues raised by the findings. It discusses, among other topics, the kind of use these apps were designed to fulfil (the majority are guided tours of permanent collections or temporary exhibitions), the layering of content, and the type of user interaction and involvement they support.
DualStream: Spatially Sharing Selves and Surroundings using Mobile Devices and Augmented Reality
In-person human interaction relies on our spatial perception of each other and our surroundings. Current remote communication tools partially address each of these aspects. Video calls convey real user representations but without spatial interactions. Augmented and Virtual Reality (AR/VR) experiences are immersive and spatial but often use virtual environments and characters instead of real-life representations. Bridging these gaps, we introduce DualStream, a system for synchronous mobile AR remote communication that captures, streams, and displays spatial representations of users and their surroundings. DualStream supports transitions between user and environment representations with different levels of visuospatial fidelity, as well as the creation of persistent shared spaces using environment snapshots. We demonstrate how DualStream can enable spatial communication in real-world contexts and support the creation of blended spaces for collaboration. A formative evaluation of DualStream revealed that users valued the ability to interact spatially and move between representations, and could see DualStream fitting into their own remote communication practices in the near future. Drawing from these findings, we discuss new opportunities for designing more widely accessible spatial communication tools, centered around the mobile phone.
Comment: 10 pages, 4 figures, 1 table; to appear in the proceedings of the IEEE International Symposium on Mixed and Augmented Reality (ISMAR) 202
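The idea of transitioning between representations with different levels of visuospatial fidelity can be sketched as a simple bandwidth-driven selection. The level names and thresholds below are assumptions for illustration, not DualStream's actual design.

```python
# Hypothetical illustration: pick a representation fidelity level for a
# remote peer from the available uplink bandwidth. Level names and
# thresholds are assumed, not taken from the DualStream paper.
LEVELS = ["avatar", "2d_video", "3d_point_cloud"]  # low -> high fidelity

def pick_representation(bandwidth_mbps):
    if bandwidth_mbps >= 20:
        return "3d_point_cloud"   # full spatial capture of the user
    if bandwidth_mbps >= 5:
        return "2d_video"         # real imagery, no spatial geometry
    return "avatar"               # abstract stand-in at minimal cost

print(pick_representation(2))
print(pick_representation(50))
```

In practice such a system would also let users switch levels deliberately (e.g. for privacy), not only in response to network conditions.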
Remote Real-Time Collaboration Platform enabled by the Capture, Digitisation and Transfer of Human-Workpiece Interactions
In today's highly globalised manufacturing ecosystem, product design and verification activities, production and inspection processes, and technical support services are spread across global supply chains and customer networks. A platform that lets global teams collaborate with each other in real time on complex tasks is therefore highly desirable. This work investigates the design and development of a remote real-time collaboration platform using human motion capture technology powered by infrared-light-based depth-imaging sensors borrowed from the gaming industry. The unique functionality of the proposed platform is the sharing of physical context during a collaboration session: it exchanges not only human actions but also the effects of those actions on the task environment. This enables teams to work remotely on a common task at the same time and to get immediate feedback from each other, which is vital for collaborative design, inspection, and verification tasks in the factories of the future.
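Exchanging both a human action and its effect on the task environment implies messages that pair captured body data with a workpiece-state update. The sketch below shows one plausible message format; the field names and JSON encoding are assumptions, not the platform's actual protocol.

```python
# Illustrative sketch (assumed message format): package a captured
# skeleton frame together with the action's effect on the workpiece,
# so a remote peer can replay both sides of the interaction.
import json

def encode_interaction(frame_id, joints, workpiece_delta):
    """joints: {joint_name: (x, y, z)}; workpiece_delta: free-form dict."""
    message = {
        "frame": frame_id,
        "joints": {name: list(pos) for name, pos in joints.items()},
        "workpiece": workpiece_delta,   # e.g. a moved part's new pose
    }
    return json.dumps(message)

# Example: the right hand has just placed a (hypothetical) bracket.
msg = encode_interaction(
    42,
    {"right_hand": (0.31, 1.05, 0.80)},
    {"part": "bracket", "pose": [0.30, 1.00, 0.75]},
)
decoded = json.loads(msg)
print(decoded["frame"], decoded["joints"]["right_hand"])
```

Streaming compact joint-plus-effect messages rather than raw depth frames keeps the bandwidth cost low enough for real-time exchange across sites.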