41 research outputs found

    Translating Italian to LIS in the Rail Stations

    Get PDF

    Guest Editorial. The Role of Gesture in Designing

    Get PDF
    This paper introduces the special issue of AIEDAM on the role of gesture in designing. It starts with the context of the papers submitted and a summary of the papers accepted. We then introduce gesture studies, one of the two main domains with which this special issue is concerned. We do not introduce design research: we assume the readers of AIEDAM are familiar with this domain. After this general introduction to the domain of gesture studies, we provide an overview of gestures in design, that is, the research environment of the papers in this special issue. We then discuss some dimensions on which these papers differ, and on which they are related.

    Designing natural gesture interaction for archaeological data in immersive environments

    Get PDF
    Archaeological data are heterogeneous, making it difficult to correlate and combine different types. Datasheets and pictures, stratigraphic data and 3D models, time and space mixed together: these are only a few of the categories a researcher has to deal with. New technologies may help in this process, and solving research-related problems calls for innovative solutions. In this paper, we describe the whole process of designing and developing a prototype application that uses an Immersive Virtual Reality system to access archaeological excavation 3D data through the Gesture Variation Follower (GVF) algorithm, which makes it possible to recognise which gesture is being performed and how it is performed. Archaeologists have participated actively in the design of the interface and of the set of gestures used for triggering the different tasks. Interactive machine learning techniques have been used for the real-time detection of the gestures. As a case study, the agora of Segesta (Sicily, Italy) has been selected. Indeed, due to its complex architectural features and the still ongoing fieldwork activities, Segesta represents an ideal context in which to test and develop a research approach integrating both traditional and more innovative tools and methods.
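    The Gesture Variation Follower mentioned above tracks, in real time, which template gesture an incoming movement most resembles and how far along (and how fast) it is being performed. The snippet below is a deliberately simplified, particle-filter-style sketch of that idea, not the published GVF implementation; the 2D point format, the noise parameters, and names such as `SimpleGestureFollower` and `obs_sigma` are assumptions made for illustration.

```python
# Simplified, illustrative particle-filter sketch of template-based gesture
# following (the idea behind algorithms such as the Gesture Variation
# Follower). NOT the published GVF implementation; templates are assumed to
# be lists of 2D points, and all parameter values are invented.
import numpy as np

class SimpleGestureFollower:
    def __init__(self, templates, n_particles=200, rng=None):
        # templates: list of (T_i, 2) arrays of 2D points, one per known gesture
        self.templates = [np.asarray(t, dtype=float) for t in templates]
        self.n = n_particles
        self.rng = rng or np.random.default_rng(0)
        self.state = {
            "g": self.rng.integers(len(templates), size=self.n),  # gesture hypothesis
            "phase": np.zeros(self.n),                            # progress in [0, 1]
            "speed": self.rng.normal(0.02, 0.005, self.n),        # phase step per frame
        }
        self.weights = np.full(self.n, 1.0 / self.n)

    def _template_point(self, g, phase):
        t = self.templates[g]
        return t[min(int(phase * (len(t) - 1)), len(t) - 1)]

    def step(self, observed_point, obs_sigma=0.05):
        s = self.state
        # 1) propagate: advance each particle's phase with a little noise
        s["phase"] = np.clip(s["phase"] + s["speed"]
                             + self.rng.normal(0, 0.005, self.n), 0.0, 1.0)
        # 2) weight: how well does each hypothesis explain the new point?
        pred = np.array([self._template_point(g, p)
                         for g, p in zip(s["g"], s["phase"])])
        err = np.linalg.norm(pred - np.asarray(observed_point, dtype=float), axis=1)
        self.weights *= np.exp(-0.5 * (err / obs_sigma) ** 2) + 1e-12
        self.weights /= self.weights.sum()
        # 3) resample when the effective sample size collapses
        if 1.0 / np.sum(self.weights ** 2) < self.n / 2:
            idx = self.rng.choice(self.n, self.n, p=self.weights)
            for key in s:
                s[key] = s[key][idx]
            self.weights[:] = 1.0 / self.n
        # report the most likely gesture, its probability, and estimated progress
        probs = np.bincount(s["g"], weights=self.weights,
                            minlength=len(self.templates))
        best = int(np.argmax(probs))
        mask = s["g"] == best
        mean_phase = float(np.average(s["phase"][mask], weights=self.weights[mask]))
        return best, float(probs[best]), mean_phase
```

    Feeding such a follower one tracked hand position per frame yields, at every step, the most likely template, its probability, and an estimated phase, which is what lets an interface react while a gesture is still being performed rather than only after it ends.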

    Les gestes dans des réunions de conception architecturale

    Get PDF
    The importance in interaction of modes of expression and representation other than the "verbal" has been recognised in the field of cognitive design research. With few exceptions, however, the only "nonverbal" modality studied has been the graphic; gesture has received little attention. Our long-term goal is to determine the contribution of each of the different semiotic systems that people working on a task adopt in their interaction (especially the verbal, graphic, and gestural modalities). Here, we focus on gesture in collaborative design, analysing an architectural design meeting. Our research led us to consider design as the construction of representations. In our previous studies of collaborative design, we distinguished representational and organisational activities; these studies were, however, based primarily on the verbal. In the analysis presented here, we observed that gesture also has these two uses: it contributes to the construction of representations of the artefact and to the organisation of design activities and of interaction among participants. For a number of authors in the field of gesture research, speech and gesture form an integrated system. Our analysis confirms that gestures are mainly co-verbal. In conclusion, we discuss the results with respect to possible implications for different collaborative work situations and to their contribution to gesture studies and to cognitive design research.

    Towards the creation of an annotation system and a digital archive platform for contemporary dance

    Get PDF
    Master's thesis in Multimedia. Universidade do Porto, Faculdade de Engenharia, 201

    Behaviour-aware mobile touch interfaces

    Get PDF
    Mobile touch devices have become ubiquitous everyday tools for communication and information, as well as for capturing, storing and accessing personal data. They are often seen as personal devices, linked to individual users, who access the digital part of their daily lives via hand-held touchscreens. This personal use and the importance of the touch interface motivate the main assertion of this thesis: mobile touch interaction can be improved by enabling user interfaces to assess and take into account how the user performs these interactions. This thesis introduces the new term "behaviour-aware" to characterise such interfaces. Behaviour-aware interfaces aim to improve interaction by utilising behaviour data: since users perform touch interactions for their main tasks anyway, inferring extra information from those touches may, for example, save users' time and reduce distraction, compared to explicitly asking them for this information (e.g. user identity, hand posture, further context). Behaviour-aware user interfaces may utilise this information in different ways, in particular to adapt to users and contexts. Important questions for this research thus concern understanding behaviour details and influences, modelling that behaviour, and integrating inference and (re)action into the user interface.

    In several studies covering both analyses of basic touch behaviour and a set of specific prototype applications, this thesis addresses these questions and explores three application areas and goals:

    1) Enhancing input capabilities, by modelling users' individual touch targeting behaviour to correct future touches and increase touch accuracy. The research reveals challenges and opportunities of behaviour variability arising from factors including target location, size and shape, hand and finger, stylus use, mobility, and device size. The work further informs modelling and inference based on targeting data, and presents approaches for simulating touch targeting behaviour and detecting behaviour changes.

    2) Facilitating privacy and security, by observing touch targeting and typing behaviour patterns to implicitly verify user identity or distinguish multiple users during use. The research shows and addresses mobile-specific challenges, in particular changing hand postures. It also reveals that touch targeting characteristics provide useful biometric value both in the lab and in everyday typing. Influences of common evaluation assumptions are assessed and discussed as well.

    3) Increasing expressiveness, by enabling interfaces to pass on behaviour variability from input to output space, studied with a keyboard that dynamically alters the font based on current typing behaviour. Results show that with these fonts users can distinguish basic contexts as well as individuals, and that they explicitly control font influences for personal communication with creative effects.

    This thesis further contributes concepts and implemented tools for collecting touch behaviour data, analysing and modelling touch behaviour, and creating behaviour-aware and adaptive mobile touch interfaces. Together, these contributions support researchers and developers in investigating and building such user interfaces. Overall, this research shows how variability in mobile touch behaviour can be addressed and exploited for the benefit of users. The thesis also discusses opportunities for transferring and reusing touch behaviour models and information across applications and devices, for example to address trade-offs between privacy/security and usability. Finally, the work concludes by reflecting on the general role of behaviour-aware user interfaces, proposing to view them as a way of embedding expectations about user input into interactive artefacts.
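    As a concrete illustration of the first application area, behaviour-aware touch correction can be as simple as learning a user's systematic targeting offsets from a handful of calibration taps and subtracting the predicted offset from future touches. The sketch below is a minimal linear-offset model assumed for illustration only; it is not the thesis' actual model, and the coordinates and function names are invented.

```python
# Minimal sketch: learn a user's systematic touch-targeting offsets and use
# them to correct future touches. The linear model, screen coordinates and
# function names are assumptions for illustration, not the thesis' models.
import numpy as np

def fit_offset_model(targets, touches):
    """Fit offset(x, y) = [x, y, 1] @ A from calibration pairs of intended
    target centres and the touch points that actually landed."""
    targets = np.asarray(targets, dtype=float)   # (n, 2) intended centres
    touches = np.asarray(touches, dtype=float)   # (n, 2) observed touches
    offsets = touches - targets                  # systematic targeting error
    X = np.hstack([touches, np.ones((len(touches), 1))])  # features: x, y, bias
    coeffs, *_ = np.linalg.lstsq(X, offsets, rcond=None)  # least-squares fit
    return coeffs                                # (3, 2) coefficient matrix

def correct_touch(touch, coeffs):
    """Shift a new raw touch by the offset the model predicts at that position."""
    x = np.append(np.asarray(touch, dtype=float), 1.0)
    return np.asarray(touch, dtype=float) - x @ coeffs

# Example: a user who tends to land slightly below and right of small targets.
targets = [(100, 200), (300, 200), (200, 400), (120, 380)]
touches = [(106, 208), (307, 209), (205, 411), (126, 390)]
coeffs = fit_offset_model(targets, touches)
print(correct_touch((150, 300), coeffs))  # raw touch pulled back toward intent
```

    As the abstract notes, such offsets vary with target location, size and shape, hand and finger, stylus use, mobility, and device size, so a deployed model would need to condition on, or adapt to, those factors rather than assume a single static mapping.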

    Head-Driven Phrase Structure Grammar

    Get PDF
    Head-Driven Phrase Structure Grammar (HPSG) is a constraint-based or declarative approach to linguistic knowledge, which analyses all descriptive levels (phonology, morphology, syntax, semantics, pragmatics) with feature-value pairs, structure sharing, and relational constraints. In syntax it assumes that expressions have a single, relatively simple constituent structure. This volume provides a state-of-the-art introduction to the framework. Various chapters discuss basic assumptions and formal foundations, describe the evolution of the framework, and go into the details of the main syntactic phenomena. Further chapters are devoted to non-syntactic levels of description. The book also considers related fields and research areas (gesture, sign languages, computational linguistics) and includes chapters comparing HPSG with other frameworks (Lexical Functional Grammar, Categorial Grammar, Construction Grammar, Dependency Grammar, and Minimalism).
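    To make the "feature-value pairs" and "relational constraints" mentioned above concrete: HPSG describes linguistic objects as typed feature structures that are combined by unification. The toy sketch below represents plain, untyped feature structures as nested dictionaries with a simple unification routine; it omits types and structure sharing, and the attribute names (HEAD, AGR, PER, NUM) are chosen only for the example, not taken from this volume.

```python
# Toy illustration of feature-structure unification, the machinery HPSG
# builds on. Didactic sketch only: untyped, no structure sharing, and the
# attribute names are invented for the example.

class UnificationFailure(Exception):
    pass

def unify(fs1, fs2):
    """Recursively unify two feature structures (nested dicts / atomic values)."""
    if isinstance(fs1, dict) and isinstance(fs2, dict):
        result = dict(fs1)
        for attr, val in fs2.items():
            result[attr] = unify(result[attr], val) if attr in result else val
        return result
    if fs1 == fs2:                      # identical atomic values unify
        return fs1
    raise UnificationFailure(f"{fs1!r} and {fs2!r} clash")

# A verb constraint requiring a 3rd-person subject, and a singular noun phrase:
verb_subj = {"HEAD": {"AGR": {"PER": "3rd"}}}
noun      = {"HEAD": {"AGR": {"NUM": "sg"}}}
print(unify(verb_subj, noun))
# -> {'HEAD': {'AGR': {'PER': '3rd', 'NUM': 'sg'}}}

# Conflicting values fail to unify, ruling the combination out:
# unify({"HEAD": {"AGR": {"NUM": "sg"}}}, {"HEAD": {"AGR": {"NUM": "pl"}}})
```

    Structure sharing, the other ingredient named in the abstract, would additionally allow two paths to point at literally the same sub-structure, so that constraining one constrains the other.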
