4,663 research outputs found

    Collaborative Process Modeling with Tablets and Touch Tables — A Controlled Experiment

    Collaborative process modeling involves business analysts and subject matter experts in order to properly capture and document process knowledge. In this context, appropriate tool support is required to motivate these user groups to actively participate in collaborative process modeling. This paper presents a collaborative process modeling tool that enables experts to create, visualize, and evolve process models on multi-touch devices (e.g., tablets and touch tables). In particular, users may edit process models on their tablets and share the created or changed models with other team members on a common touch table. For this purpose, a sophisticated yet intuitive interaction concept is provided. Furthermore, results of a controlled experiment are presented, evaluating how the use of tablets influences collaborative process modeling on touch tables. Altogether, the experimental results emphasize the high potential of multi-touch tools for collaborative process modeling.

    Gesture-based Process Modeling Using Multi-Touch Devices

    Contemporary business process modeling tools provide menu-based user interfaces for defining and visualizing process models. Such menu-based interactions have been optimized for applications running on desktop computers, but are limited regarding their use on multi-touch devices. At the same time, the widespread use of mobile devices in daily business life as well as their multi-touch capabilities offer promising perspectives for intuitively defining and changing business process models. Additionally, multi-touch tables will foster collaborative business process modeling based on natural as well as intuitive gestures and interactions. This paper presents the results of an experiment that investigated the way users define and change business process models using multi-touch devices. Based on the experiment results, a core gesture set is designed enabling the easy definition and change of business process models with multi-touch devices. Finally, a proof-of-concept implementation of this core gesture set is presented. Overall, gesture-based process modeling and multi-touch devices will foster new ways of (collaborative) business process modeling.

    Towards Gesture-based Process Modeling on Multi-Touch Devices

    Contemporary tools for business process modeling use menu-based interfaces for visualizing process models and interacting with them. However, pure menu-based interactions have been optimized for applications running on desktop computers and are limited regarding their use on multi-touch devices. At the same time, the increasing distribution of mobile devices in business life as well as their multi-touch capabilities offer promising perspectives for intuitively defining and adapting business process models. Additionally, multi-touch tables could improve collaborative business process modeling based on natural gestures and interactions. In this paper we present the results of an experiment in which we investigate the way users model business processes with multi-touch devices. Furthermore, a core gesture set is suggested enabling the easy definition and adaptation of business process models on these devices. Overall, gesture-based process modeling and multi-touch devices allow for new ways of (collaborative) business process modeling.
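A core gesture set of this kind could, purely for illustration, be wired up as a dispatch from recognized gestures to model-edit operations. All gesture names and operations below are hypothetical assumptions, not taken from the paper:

```python
# Minimal sketch of gesture-based model editing: recognized multi-touch
# gestures are dispatched to edit operations on a simple process model.
# Gesture names ("double_tap", "drag_between", "scratch_out") and the
# model representation are illustrative assumptions only.

def make_editor():
    model = {"nodes": [], "edges": []}

    def add_node(name):
        # e.g. a double-tap on the canvas creates a new activity
        model["nodes"].append(name)

    def connect(src, dst):
        # e.g. dragging from one node to another draws a sequence flow
        model["edges"].append((src, dst))

    def remove_node(name):
        # e.g. a scratch-out gesture deletes an element and its edges
        model["nodes"].remove(name)
        model["edges"][:] = [e for e in model["edges"] if name not in e]

    gestures = {
        "double_tap": add_node,
        "drag_between": connect,
        "scratch_out": remove_node,
    }

    def on_gesture(kind, *args):
        gestures[kind](*args)

    return model, on_gesture
```

The point of the dispatch table is that the recognizer and the model-edit layer stay decoupled, so the same core gesture set can drive different modeling notations.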

    Supporting collaborative work using interactive tabletop

    PhD Thesis. Collaborative working is a key to success for organisations. People work together around tables at work, home, school, and coffee shops. With the explosion of the internet and computer systems, there is a variety of tools to support collaboration in groups, such as groupware and tools that support online meetings. However, in co-located, face-to-face meetings, facial expressions, body language, and verbal communication have a significant influence on the group decision-making process. Often people have a natural preference for traditional pen-and-paper decision support solutions in such situations. Thus, it is a challenge to implement tools that rely on advanced technological interfaces, such as interactive multi-touch tabletops, to support collaborative work. This thesis proposes a novel tabletop application to support group work and investigates the effectiveness and usability of the proposed system. The requirements for the developed system are based on a review of previous literature and on requirements elicited from potential users. The innovative aspect of our system is that it allows the use of personal devices that afford some level of privacy to participants in the group work. We expect that the personal devices may contribute to the effectiveness of using tabletops to support collaborative work. For the evaluation experiment, we chose the collaborative development of mind maps by groups, which has been investigated earlier as a representative form of collaborative work. Two controlled laboratory experiments were designed to examine the usability features and associated emotional attitudes for the tabletop mind map application in comparison with the conventional pen-and-paper approach in the context of collaborative work. The evaluation clearly indicates that the combination of the tabletop and personal devices supports and encourages multiple people working collaboratively. The comparison of the associated emotional attitudes indicates that the interactive tabletop facilitates the active involvement of participants in group decision making significantly more than the pen-and-paper condition. The work reported here contributes significantly to our understanding of the usability and effectiveness of interactive tabletop applications in the context of supporting collaborative work. The Royal Thai government.

    Developing Digital Media Platforms for Early Design

    In recent years, mobile devices have become an integral part of our daily life. Software applications on these handheld devices are successfully migrating traditional paper-based activities, such as reading news and books and even navigating maps, onto the digital medium. While these applications allow information access anywhere and anytime, there is still a need to repurpose these digital media to support content and information creation, especially in domains such as industrial design where paper-based activities are common. To utilize direct-touch tablets for collaborative conceptual design, we studied their affordances and iteratively developed a web-based wiki system named skWiki. In this thesis, we first report an evaluation of the impact of using a capacitive stylus for tracing and sketching on direct-touch tablets. This study uncovers the differences in quantitative and qualitative performance between the tablet and paper media when using stylus (pen) or finger input for both tracing and sketching. While paper performed better overall, we found that the tablet medium, when used with a capacitive stylus, performed comparably to paper for sketching tasks. These findings can guide sketch application designers in developing an appropriate interaction design for various input methods. To explore the advantages of the ubiquity of information generated on digital media, we developed Sketchbox, an Android application for sketching and sharing ideas using Dropbox as the storage cloud. An evaluation of the usage patterns of this application in a collaborative toy design scenario provided the necessary guidelines for developing the skWiki system. skWiki overcomes the drawbacks of traditional wiki software used as design repositories by providing a rich editor infrastructure for sketching, text editing, and image editing. Apart from these features, skWiki provides a higher degree of freedom in sharing (cloning, branching, and merging) different versions of a sketch at various data granularities by introducing the concept of paths for maintaining revisions in a collaborative design process. We evaluated the utility of skWiki through a user study comparing constrained and unconstrained sharing models. Furthermore, skWiki was used by students of toy design and product design courses for both collaborative ideation and design activities. We discuss the findings and qualitative feedback from the evaluation of skWiki, and potential features for the next version of this tool.
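The path concept for revision sharing could be sketched roughly as follows. This is a minimal illustration of cloning, branching, and merging revision histories, not skWiki's actual data model or API:

```python
# Illustrative sketch of path-based revision tracking: each save appends a
# revision to a path; cloning a path branches the history so collaborators
# can diverge and later merge. Class and method names are assumptions.

class Path:
    def __init__(self, revisions=None):
        self.revisions = list(revisions or [])

    def commit(self, sketch):
        # Record a new revision of the shared artifact on this path.
        self.revisions.append(sketch)

    def clone(self):
        # Branch: a new path sharing all history up to this point.
        return Path(self.revisions)

    def merge(self, other):
        # Naive merge: adopt any revisions this path has not yet seen.
        for rev in other.revisions:
            if rev not in self.revisions:
                self.revisions.append(rev)
```

Because a clone copies the revision list rather than sharing it, two collaborators can commit independently and reconcile later, which is the essence of branching and merging at the revision-history level.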

    Spatial Interaction for Immersive Mixed-Reality Visualizations

    Growing amounts of data, both in personal and professional settings, have caused an increased interest in data visualization and visual analytics. Especially for inherently three-dimensional data, immersive technologies such as virtual and augmented reality and advanced, natural interaction techniques have been shown to facilitate data analysis. Furthermore, in such use cases, the physical environment often plays an important role, both by directly influencing the data and by serving as context for the analysis. Therefore, there has been a trend to bring data visualization into new, immersive environments and to make use of the physical surroundings, leading to a surge in mixed-reality visualization research. One of the resulting challenges, however, is the design of user interaction for these often complex systems. In my thesis, I address this challenge by investigating interaction for immersive mixed-reality visualizations regarding three core research questions: 1) What are promising types of immersive mixed-reality visualizations, and how can advanced interaction concepts be applied to them? 2) How does spatial interaction benefit these visualizations and how should such interactions be designed? 3) How can spatial interaction in these immersive environments be analyzed and evaluated? To address the first question, I examine how various visualizations such as 3D node-link diagrams and volume visualizations can be adapted for immersive mixed-reality settings and how they stand to benefit from advanced interaction concepts. For the second question, I study how spatial interaction in particular can help to explore data in mixed reality. There, I look into spatial device interaction in comparison to touch input, the use of additional mobile devices as input controllers, and the potential of transparent interaction panels. 
Finally, to address the third question, I present my research on how user interaction in immersive mixed-reality environments can be analyzed directly in the original, real-world locations, and how this can provide new insights. Overall, with my research, I contribute interaction and visualization concepts, software prototypes, and findings from several user studies on how spatial interaction techniques can support the exploration of immersive mixed-reality visualizations.

    Abstraction, Visualization, and Evolution of Process Models

    The increasing adoption of process orientation in companies and organizations has resulted in large process model collections. Each process model of such a collection may comprise dozens or hundreds of elements and captures various perspectives of a business process, i.e., the organizational, functional, control, resource, or data perspective. Domain experts with only limited process modeling knowledge, however, hardly comprehend such large and complex process models. Therefore, they demand a customized (i.e., personalized) view on business processes enabling them to optimize and evolve process models effectively. This thesis contributes the proView framework to systematically create and update process views (i.e., abstractions) on process models and business processes respectively. More precisely, process views abstract large process models by hiding or combining process information. As a result, they provide an abstracted, but personalized representation of process information to domain experts. In particular, updates of a process view are supported, which are then propagated to the related process model as well as associated process views. Thereby, up-to-dateness and consistency of all process views defined on a process model can always be ensured. Finally, proView preserves the behaviour and correctness of a process model. Process abstractions realized by views alone are still not sufficient to assist domain experts in comprehending and evolving process models. Thus, additional process visualizations are introduced that provide text-based, form-based, and hierarchical representations of process models. In particular, these process visualizations allow for view-based process abstractions and updates as well. Finally, process interaction concepts are introduced enabling domain experts to create and evolve process models on touch-enabled devices. This facilitates the documentation of process models in workshops or while interviewing process participants at their workplace. Altogether, proView enables domain experts to interact with large and complex process models as well as to evolve them over time, based on process model abstractions, additional process visualizations, and process interaction concepts. The framework is implemented in a proof-of-concept prototype and validated through experiments and case studies.
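The core idea of abstracting a process model by hiding or combining elements can be illustrated in a few lines. This is a simplified sketch under assumed names, not the proView implementation, and it ignores view updates and behaviour preservation:

```python
# Minimal sketch of a process view: given an ordered list of activities,
# hide some of them and collapse groups of activities into a single
# aggregate node. Function and parameter names are illustrative only.

def build_view(process, hidden=(), combined=None):
    """Return an abstracted view of `process`: activities in `hidden` are
    removed, and each group in `combined` (name -> set of members) is
    replaced by one aggregate node in place of its first member."""
    combined = combined or {}
    view = []
    for act in process:
        if act in hidden:
            continue
        group = next((g for g, members in combined.items() if act in members), None)
        if group is not None:
            if group not in view:  # emit the aggregate node only once
                view.append(group)
        else:
            view.append(act)
    return view
```

For example, a four-activity process with one hidden step and two steps combined into an aggregate yields a two-node view, which is the kind of personalized abstraction a domain expert would see.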

    A Utility Framework for Selecting Immersive Interactive Capability and Technology for Virtual Laboratories

    There has been an increase in the use of virtual reality (VR) technology in the education community, since VR is emerging as a potent educational tool that offers students a rich source of educational material and makes learning exciting and interactive. With the rise in popularity and market expansion of VR technology in the past few years, a variety of consumer VR electronics have boosted educators’ and researchers’ interest in using these devices for practicing engineering and science laboratory experiments. However, little is known about how well such devices are suited for active learning in a laboratory environment. This research aims to address this gap by formulating a utility framework to help educators and decision-makers efficiently select the type of VR device that matches the design and capability requirements of their virtual laboratory blueprint. Furthermore, a framework use case is demonstrated by not only surveying five types of VR devices, ranging from low-immersive to full-immersive, along with their capabilities (i.e., hardware specifications, cost, and availability), but also considering the interaction techniques each VR device supports for the desired laboratory task. To validate the framework, a research study is carried out to compare these five VR devices and investigate which one provides the overall best fit for the 3D virtual laboratory content that we implemented, based on interaction level, usability, and performance effectiveness.
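A utility framework of this kind often reduces to a weighted-sum score over device capabilities. The sketch below assumes hypothetical criteria, ratings, and weights; it illustrates the selection mechanism only, not the paper's actual rubric or data:

```python
# Illustrative weighted-sum utility for VR device selection: each device is
# rated on a common scale per criterion, and the decision-maker supplies
# weights reflecting the virtual laboratory's requirements. All numbers
# and criterion names here are made-up examples.

def utility_score(device, weights):
    # Sum of weight * rating over the criteria the decision-maker cares about.
    return sum(weights[c] * device.get(c, 0) for c in weights)

devices = {
    "low_immersive_hmd":  {"immersion": 2, "interaction": 2, "cost": 5},
    "full_immersive_hmd": {"immersion": 5, "interaction": 5, "cost": 2},
}
weights = {"immersion": 0.5, "interaction": 0.3, "cost": 0.2}

best = max(devices, key=lambda d: utility_score(devices[d], weights))
```

Changing the weights (e.g., prioritizing cost for a large classroom deployment) can flip which device the framework recommends, which is exactly the trade-off such a framework is meant to make explicit.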