Touch-Based Ontology Browsing on Tablets and Surfaces

Abstract

Semantic technologies and Linked Data are increasingly adopted as core application modules across many knowledge domains, involving a variety of stakeholders: ontology engineers, software architects, doctors, employees, etc. Such diffusion calls for better access to models and data, which should be direct, mobile, visual, and time-effective. While a substantial body of research has investigated the problem of ontology visualization, exploring different paradigms, layouts, and interaction modalities, few approaches target mobile devices such as tablets and smartphones. Touch interaction, indeed, has the potential to dramatically improve the usability of Linked Data and semantic-based solutions in real-world applications and mash-ups, by enabling direct, tactile interaction with the knowledge objects involved. In this paper, we take a step towards touch-based, mobile interfaces for semantic models by presenting an ontology browsing platform for Android devices. We exploit state-of-the-art touch-based interaction paradigms, e.g., pie menus and pinch-to-zoom, to enable effective ontology browsing. Our research focuses mainly on interactions, while also supporting different visualization approaches thanks to a clear decoupling between model-level operations and visual representations. Presented results include the design and implementation of a working prototype application, as well as a first validation involving habitual users of semantic technologies. Results show a low learning curve and positive reactions to the proposed paradigms, which are perceived as both innovative and useful.