11 research outputs found
Polyphony: Programming Interfaces and Interactions with the Entity-Component-System Model
This paper introduces a new Graphical User Interface (GUI) and interaction framework based on the Entity-Component-System (ECS) model. In this model, interactive elements (Entities) are characterized only by their data (Components). Behaviors are managed by continuously running processes (Systems), which select entities by the Components they possess. This model facilitates the handling of behaviors and promotes their reuse. It provides developers with a simple yet powerful composition pattern for building new interactive elements with Components. It materializes interaction devices as Entities and interaction techniques as sequences of Systems operating on them. We present Polyphony, an experimental toolkit implementing this approach, and discuss our interpretation of the ECS model in the context of GUI programming.
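The core of the pattern can be illustrated with a minimal sketch. The names below (`World`, `create_entity`, `query`, `drag_system`) are invented for illustration and are not Polyphony's actual API; the sketch only shows the idea of entities as bare IDs carrying component data, with a system selecting entities by the components they possess.

```python
# A minimal ECS sketch (hypothetical names, not Polyphony's API):
# entities are bare IDs, components are plain data attached to them,
# and systems are functions that run over entities selected by the
# components they possess.

class World:
    def __init__(self):
        self.next_id = 0
        self.components = {}  # component name -> {entity id -> data}

    def create_entity(self, **components):
        eid = self.next_id
        self.next_id += 1
        for name, data in components.items():
            self.components.setdefault(name, {})[eid] = data
        return eid

    def query(self, *names):
        # Select entities that possess ALL the named components.
        ids = set.intersection(*(set(self.components.get(n, {})) for n in names))
        for eid in sorted(ids):
            yield eid, [self.components[n][eid] for n in names]

# A "drag" system: moves any entity that has both a position and a drag delta.
def drag_system(world):
    for eid, (pos, delta) in world.query("position", "drag"):
        pos["x"] += delta["dx"]
        pos["y"] += delta["dy"]

world = World()
button = world.create_entity(position={"x": 0, "y": 0}, drag={"dx": 5, "dy": 3})
label = world.create_entity(position={"x": 10, "y": 10})  # no "drag": untouched
drag_system(world)
```

Making the drag behavior a free-standing system, rather than a method on a widget class, is what lets any entity become draggable simply by acquiring the relevant component.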
Application du modèle Entité-Composant-Système à la programmation d'interactions
This paper introduces a new GUI framework based on the Entity-Component-System (ECS) model, where interactive elements (Entities) can acquire any data (Components). Behaviors are managed by continuously running processes (Systems), which select entities by the components they possess. This model facilitates the handling and reuse of behaviors. It allows the interaction modalities of an application to be defined globally, by formulating them as a set of Systems. We present Polyphony, an experimental toolkit implementing this approach, detail our interpretation of the ECS model in the context of GUIs, and demonstrate its use with a sample application.
Phrasing Bimanual Interaction for Visual Design
Architects and other visual thinkers create external representations of their ideas to support early-stage design. They compose visual imagery with sketching to form abstract diagrams as representations. When working with digital media, they apply various visual operations to transform representations, often engaging in complex sequences. This research investigates how to build interactive capabilities to support designers in putting together, that is phrasing, sequences of operations using both hands. In particular, we examine how phrasing interactions with pen and multi-touch input can support modal switching among different visual operations that in many commercial design tools require using menus and tool palettes, techniques originally designed for the mouse, not pen and touch.
We develop an interactive bimanual pen+touch diagramming environment and study its use in landscape architecture design studio education. We observe the interesting forms of interaction that emerge, and how our bimanual interaction techniques support visual design processes. Based on the needs of architects, we develop LayerFish, a new bimanual technique for layering overlapping content, and conduct a controlled experiment to evaluate its efficacy. We explore the use of wearables to identify which user, and which hand, is touching, to support phrasing together direct-touch interactions on large displays. From the design and development of the environment and both field and controlled studies, we derive a set of methods, based upon human bimanual specialization theory, for phrasing modal operations through bimanual interactions without menus or tool palettes.
Beyond Snapping: Persistent, Tweakable Alignment and Distribution with StickyLines
Aligning and distributing graphical objects is a common but cumbersome task. In a preliminary study (six graphic designers, six non-designers), we identified three key problems with current tools: lack of persistence, unpredictability of results, and inability to 'tweak' the layout. We created StickyLines, a tool that treats guidelines as first-class objects: users can create precise, predictable and persistent interactive alignment and distribution relationships, and 'tweaked' positions are maintained across subsequent interactions. We ran a 2x2 within-participants experiment comparing StickyLines with standard commands at two levels of layout difficulty: StickyLines was 40% faster and required 49% fewer actions than traditional alignment and distribution commands for complex layouts. In a third study, six professional designers quickly adopted StickyLines and identified novel uses, including creating complex compound guidelines and using them for both spatial and semantic grouping.
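The contrast with one-shot alignment commands can be sketched in a few lines. This is a hedged illustration of the guideline-as-first-class-object idea, not StickyLines' real data model; `Guideline`, `attach`, and the per-object `tweak` offset are names invented here. A guideline remembers which objects are attached to it, and each object's tweak, so the relationship persists when the guideline later moves.

```python
# Hypothetical sketch of a persistent guideline (invented names, not
# StickyLines' actual API): the guideline stores its attached objects
# plus a per-object "tweak" offset, so alignment survives later edits
# instead of being a one-shot command.

class Guideline:
    def __init__(self, x):
        self.x = x
        self.attached = {}  # object id -> tweak offset

    def attach(self, obj, tweak=0.0):
        self.attached[id(obj)] = (obj, tweak)
        self.apply()

    def move(self, new_x):
        self.x = new_x
        self.apply()  # attached objects follow the guideline

    def apply(self):
        for obj, tweak in self.attached.values():
            obj["x"] = self.x + tweak

r1 = {"x": 0.0}
r2 = {"x": 100.0}
g = Guideline(x=50.0)
g.attach(r1)
g.attach(r2, tweak=8.0)  # the tweaked offset is remembered
g.move(70.0)             # both objects follow, keeping their tweaks
```

A one-shot "align left" command would have set both objects to the same x once and forgotten the relationship; here moving the guideline keeps the tweaked layout intact.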
Contributions to the science of controlled transformation
writing completed in April 2013. My research activities pertain to "Informatics", and in particular "Interactive Graphics", i.e. dynamic graphics on a 2D screen that a user can interact with by means of input devices such as a mouse or a multitouch surface. I have conducted research on Interactive Graphics along three themes: interactive graphics development (how should developers design the architecture of the code corresponding to graphical interactions?), interactive graphics design (which graphical interactions should User Experience (UX) specialists use in their systems?) and the interactive graphics design process (how should UX specialists design, and which methods should they apply?). I invented the MDPC architecture, which relies on Picking views and Inverse transforms. It improves the modularity of programs and the usability of both the specification and the implementation of interactive graphics by simplifying their description. In order to improve the performance of rich-graphic software using this architecture, I explored the concept of graphical compilers and led a PhD thesis on the topic. The thesis explored the approach and contributed both to the simplification of description and to the facilitation of software engineering. Finally, I applied these description-simplification principles to the problem of shape covering avoidance by relying on new, efficient hardware support for parallelized and memory-based algorithms. Together with my colleagues, I have explored the design and assessment of expanding targets, animation and sound, interaction with numerous tangled trajectories, multi-user interaction and tangible interaction. I have identified and defined Structural Interaction, a new interaction paradigm that follows in the steps of the direct and instrumental interaction paradigms. I directed a PhD thesis on this topic, and together with my student we designed and assessed interaction techniques for structural interaction.
I was involved in the design of the "Technology Probes" concept, i.e. runnable prototypes that feed the design process. Together with colleagues, I designed VideoProbe, one such Technology Probe. I then became interested in more conceptual tools targeted at graphical representation. I led two PhD theses on the topic and explored the characterization of visualization, how to design representations with visual variables or ecological perception, and how to design visual interfaces that improve visual scanning. I discovered that those conceptual tools could be applied to programming languages, and showed how the representation of code, be it textual or "visual", undergoes visual perception phenomena. This has led me to consider our discipline as the "Science of Controlled Transformations". The fifth chapter is an attempt at providing this new account of "Informatics", based on what users, programmers and researchers actually do with interactive systems. I also describe how my work can be considered as contributing to the science of controlled transformations.
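The Picking-view idea mentioned above can be illustrated with a toy sketch. This is my own hedged rendering of the general technique, with invented names, and not the MDPC architecture's actual code: each object is also rendered into a hidden buffer under a unique identifier, so hit-testing becomes a buffer lookup rather than a per-object geometric test.

```python
# A toy illustration of a "picking view" (invented names, not MDPC code):
# every object is rendered a second time into a hidden buffer under its
# own ID, so picking at (x, y) is a constant-time buffer lookup.

WIDTH, HEIGHT = 8, 8

def render_picking_view(objects):
    # buffer[y][x] holds the ID of the topmost object covering that pixel.
    buffer = [[None] * WIDTH for _ in range(HEIGHT)]
    for oid, (x0, y0, x1, y1) in objects.items():
        for y in range(y0, y1):
            for x in range(x0, x1):
                buffer[y][x] = oid  # later objects paint over earlier ones
    return buffer

def pick(buffer, x, y):
    return buffer[y][x]

# Two axis-aligned boxes; "square" is drawn after "circle", so it wins
# wherever they overlap.
objects = {"circle": (0, 0, 4, 4), "square": (2, 2, 6, 6)}
view = render_picking_view(objects)
```

In practice the hidden rendering would share the display view's transforms, which is what ties picking to the Inverse transforms mentioned in the abstract.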
Experimental Object-Oriented Modelling
This thesis examines object-oriented modelling in experimental system development. Object-oriented modelling aims at representing concepts and phenomena of a problem domain in terms of classes and objects. Experimental system development seeks active experimentation in a system development project through, e.g., technical prototyping and active user involvement. We introduce and examine "experimental object-oriented modelling" as the intersection of these practices.
The svgl toolkit: enabling fast rendering of rich 2D graphics
As more and more powerful graphical processors become available on mainstream computers, it becomes possible to investigate the design of visually rich and fast interactive applications. In this article, we present SVGL, a graphical toolkit that enables programmers and designers of interactive applications to benefit from this power. The toolkit is based on a scene graph which is translated into an optimized display graph. After describing the algorithms used to display the scene, we show that the toolkit is two to fifty times faster than similar toolkits.
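The scene-graph-to-display-graph translation can be sketched as a single traversal that flattens the nested structure into a list of draw commands with transforms pre-applied. The names and dictionary-based scene encoding below are invented for illustration and are not svgl's actual data structures; the point is only that the per-frame redraw then walks a flat list instead of re-traversing the tree.

```python
# A rough sketch of compiling a scene graph into a flat display list
# (invented names, not svgl's actual structures): nested translations
# are accumulated once, so redrawing never re-walks the tree.

def compile_scene(node, tx=0.0, ty=0.0, display_list=None):
    if display_list is None:
        display_list = []
    # Accumulate this node's local translation into the absolute transform.
    tx += node.get("dx", 0.0)
    ty += node.get("dy", 0.0)
    if "shape" in node:
        # Store the shape with its absolute position already computed.
        display_list.append((node["shape"], tx, ty))
    for child in node.get("children", []):
        compile_scene(child, tx, ty, display_list)
    return display_list

scene = {
    "dx": 10.0, "dy": 0.0,
    "children": [
        {"shape": "rect", "dx": 5.0},
        {"dx": 0.0, "dy": 20.0, "children": [{"shape": "circle"}]},
    ],
}
display = compile_scene(scene)
```

A real implementation would also cache the compiled list and invalidate only the parts of it affected by a scene change, which is where the "optimized" display graph earns its speedup.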