
    A model-driven method for gesture-based interface requirements specification

    Currently there are several software development suites that include tools for user interface design and implementation (mainly by programming source code). Some of these tools are multi-platform and multi-style; that is, they allow the specification of devices (e.g. computer, notebook, smartphone) and of user interaction styles (e.g. based on gestures, voice, mouse and keyboard). Among these styles, gesture-based interaction is neglected, despite the proliferation of gesture-recognizing devices. Given the variety of styles of human-computer interaction currently available, it is necessary to include information on these styles in software requirements specifications in order to obtain a complete specification prior to code generation. In this paper, we propose the design of a model-driven method and tool that allows specifying gesture-based interactions and then generates a gesture-based interface requirements specification. We intend our proposal to be interoperable with existing methods and tools. The research method follows design science, and we plan to validate our proposals by means of technical action-research.