Slovenian Virtual Gallery on the Internet
The Slovenian Virtual Gallery (SVG) is a World Wide Web based multimedia collection of pictures, text, clickable maps and video clips presenting Slovenian fine art from the Gothic period up to the present day. Part of SVG is a virtual gallery space where pictures hang on the walls, while another part is devoted to current exhibitions of selected Slovenian art galleries. The first version of this application was developed in the first half of 1995. It was based on a file system for storing all the data and custom-developed software for search, automatic generation of HTML documents, scaling of pictures and remote management of the system. Due to the fast development of Web-related tools, a new version of SVG was developed in 1997 based on object-oriented relational database server technology. Both implementations are presented and compared in this article, along with issues related to the transition between the two versions. At the end, we also discuss some extensions to SVG. We present the GUI (Graphical User Interface) developed specially for the presentation of current exhibitions over the Web, which is based on the GlobalView panoramic navigation extension to the Internet Video Server (IVS). And since SVG operates with a lot of image data, we also address the problem of image content retrieval.
Navigating Immersive and Interactive VR Environments With Connected 360° Panoramas
Emerging research is expanding the idea of using 360-degree spherical panoramas of real-world environments in 360 VR experiences beyond video and image viewing. However, most of these experiences are strictly guided, with few opportunities for interaction or exploration. There is a desire to develop experiences with cohesive virtual environments created with 360 VR that allow for choice in navigation, versus scripted experiences with limited interaction. Unlike standard VR, with its freedom of synthetic graphics, there are challenges in designing appropriate user interfaces (UIs) for 360 VR navigation within the limitations of fixed assets. To tackle this gap, we designed RealNodes, a software system that presents an interactive and explorable 360 VR environment. We also developed four visual guidance UIs for 360 VR navigation. The results of a pilot study showed that the choice of UI had a significant effect on task completion times, with one of our methods, Arrow, performing best. Arrow also exhibited positive but non-significant trends in average measures of preference, user engagement, and simulator sickness. RealNodes, the UI designs, and the pilot study results contribute preliminary information that can inspire future investigation of how to design effective explorable scenarios in 360 VR and visual guidance metaphors for navigation in applications using 360 VR environments.
A 360 VR and Wi-Fi Tracking Based Autonomous Telepresence Robot for Virtual Tour
This study proposes a novel mobile robot teleoperation interface that demonstrates the applicability of a robot-aided remote telepresence system with a virtual reality (VR) device to a virtual tour scenario. To improve realism and provide an intuitive replica of the remote environment through the user interface, the implemented system automatically moves a mobile robot (viewpoint) while displaying a 360-degree live video streamed from the robot to a VR device (Oculus Rift). Upon the user choosing a destination location from a given set of options, the robot generates a route based on a shortest path graph and travels along that route using a wireless signal tracking method based on measuring the direction of arrival (DOA) of radio signals. This paper presents an overview of the system and its architecture, and discusses its implementation aspects. Experimental results show that the proposed system is able to move to the destination stably using the signal tracking method, and that at the same time the user can remotely control the robot through the VR interface.
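The abstract does not detail the planner, but route generation over a "shortest path graph" is commonly done with Dijkstra's algorithm. A minimal sketch, assuming a weighted waypoint graph (the node names and edge costs below are hypothetical, not from the paper):

```python
import heapq

def shortest_path(graph, start, goal):
    """Dijkstra's algorithm over a weighted adjacency dict.

    graph: {node: [(neighbor, cost), ...]}
    Returns (total_cost, [start, ..., goal]), or (inf, []) if unreachable.
    """
    dist = {start: 0.0}
    prev = {}
    heap = [(0.0, start)]
    visited = set()
    while heap:
        d, node = heapq.heappop(heap)
        if node in visited:
            continue
        visited.add(node)
        if node == goal:
            # Reconstruct the route by walking predecessors backwards.
            path = [goal]
            while path[-1] != start:
                path.append(prev[path[-1]])
            return d, path[::-1]
        for nbr, cost in graph.get(node, []):
            nd = d + cost
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                prev[nbr] = node
                heapq.heappush(heap, (nd, nbr))
    return float("inf"), []

# Hypothetical waypoint graph of a small tour floor (distances in meters).
floor_graph = {
    "lobby":   [("hall_a", 5.0), ("hall_b", 9.0)],
    "hall_a":  [("hall_b", 3.0), ("exhibit", 8.0)],
    "hall_b":  [("exhibit", 2.0)],
    "exhibit": [],
}
cost, route = shortest_path(floor_graph, "lobby", "exhibit")
# route: lobby -> hall_a -> hall_b -> exhibit, total cost 10.0
```

In the paper's setting the robot would then follow each edge of the computed route using its DOA-based signal tracking rather than metric localization.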
A magnetic internal mechanism for precise orientation of the camera in wireless endoluminal applications
Background and study aims: The use of magnetic fields to control operative devices has recently been described in endoluminal and transluminal surgical applications. The rapid decrease of magnetic field strength with distance has major implications for the precision of remote control. We aimed to assess the feasibility and functionality of a novel wireless miniaturized mechanism, based on magnetic forces, for precise orientation of the camera.
Materials and methods: A remotely controllable endoscopic capsule was developed as proof of concept. Two intracapsular moveable permanent magnets allow fine positioning, and an externally applied magnetic field permits gross movement and stabilization. Performance was assessed in ex vivo and in vivo bench tests using porcine upper and lower gastrointestinal tracts.
Results: Fine control of capsule navigation and rotation was achieved in all tests with an external magnet held steadily about 15 cm from the capsule. The camera could be rotated in steps of 1.8°. This was confirmed by ex vivo tests; the mechanism could adjust the capsule view at 40 different locations in a gastrointestinal tract phantom model. Full 360° viewing was possible in the gastric cavity, while the maximal steering in the colon was 45° in total. In vivo, a similar performance was verified: the mechanism was successfully operated every 5 cm for 40 cm in the colon, visually sweeping from side to side of the lumen; 360° views were obtained in the gastric fundus and body, while antrally the luminal walls prevented full rotation.
Conclusions: We report the feasibility and effectiveness of the combined use of external static magnetic fields and internal actuation to move small permanent intracapsular magnets to achieve wirelessly controllable and precise camera steering. The concept is applicable to capsule endoscopy as well as to other instrumentation for laparoscopic, endoluminal, or transluminal procedures.
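The abstract's point about field strength dropping steeply with distance can be illustrated with the standard magnetic dipole model, in which the on-axis far-field magnitude falls off with the cube of distance. A minimal sketch (the magnetic moment value is illustrative, not taken from the paper):

```python
import math

# On-axis far field of a magnetic dipole: B = (mu0 / (4*pi)) * 2*m / r**3
MU0 = 4e-7 * math.pi  # vacuum permeability, T*m/A

def dipole_field_on_axis(m, r):
    """Field magnitude (tesla) at distance r (m) from a dipole of moment m (A*m^2)."""
    return MU0 / (4 * math.pi) * 2 * m / r**3

# Illustrative moment of 50 A*m^2; doubling the distance cuts the field 8-fold,
# which is why a small change in magnet-to-capsule distance matters so much.
b15 = dipole_field_on_axis(50.0, 0.15)  # ~15 cm, as in the reported tests
b30 = dipole_field_on_axis(50.0, 0.30)
ratio = b15 / b30  # expected: 8.0
```

This cubic falloff motivates the paper's hybrid design: the external magnet only provides coarse positioning and stabilization, while the fine 1.8° stepping is done by the intracapsular mechanism, insensitive to the exact external distance.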
Supervised Autonomous Locomotion and Manipulation for Disaster Response with a Centaur-like Robot
Mobile manipulation tasks are one of the key challenges in the field of search and rescue (SAR) robotics, requiring robots with flexible locomotion and manipulation abilities. Since the tasks are mostly unknown in advance, the robot has to adapt to a wide variety of terrains and workspaces during a mission. The centaur-like robot Centauro has a hybrid legged-wheeled base and an anthropomorphic upper body to carry out complex tasks in environments too dangerous for humans. Due to its high number of degrees of freedom, controlling the robot with direct teleoperation approaches is challenging and exhausting. Supervised autonomy approaches are promising to increase the quality and speed of control while keeping the flexibility to solve unknown tasks. We developed a set of operator assistance functionalities with different levels of autonomy to control the robot for challenging locomotion and manipulation tasks. The integrated system was evaluated in disaster response scenarios and showed promising performance.
Comment: In Proceedings of IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, October 201