23 research outputs found

    User Interfaces based on Touchless Hand Gestures for the Classroom: A survey

    Get PDF
    (Received: 2014/10/31 - Accepted: 2014/12/15) The proliferation of new devices that detect human movement has increased the use of interfaces based on touchless hand gestures. These kinds of applications may also be used in classrooms. Although many studies have been carried out, most do not focus on classrooms. This paper therefore presents a bibliographic review of related studies, with the aim of organizing them and relating them to interface design for this type of scenario. The review discusses related applications, how the gestures performed by users are recognized, design aspects to consider, and evaluation methods for this interaction style. This work may thus serve as a reference guide for both researchers and software designers who develop and use such applications in classrooms.

    Exploring Uni-manual Around Ear Off-Device Gestures for Earables

    Full text link
    Small form factor limits physical input space in earable (i.e., ear-mounted wearable) devices. Off-device earable inputs in alternate mid-air and on-skin around-ear interaction spaces using uni-manual gestures can address this input space limitation. Segmenting these alternate interaction spaces to create multiple gesture regions for reusing off-device gestures can expand earable input vocabulary by a large margin. Although prior earable interaction research has explored off-device gesture preferences and recognition techniques in such interaction spaces, supporting gesture reuse over multiple gesture regions needs further exploration. We collected and analyzed 7,560 uni-manual gesture motion samples from 18 participants to explore earable gesture reuse by segmentation of on-skin and mid-air spaces around the ear. Our results show that gesture performance degrades significantly beyond 3 mid-air and 5 on-skin around-ear gesture regions for different uni-manual gesture classes (e.g., swipe, pinch, tap). We also present qualitative findings on the most and least preferred regions (and associated boundaries) by end-users for different uni-manual gesture shapes across both interaction spaces for earable devices. Our results complement earlier elicitation studies and interaction technologies for earables to help expand the gestural input vocabulary and potentially drive future commercialization of such devices. Comment: 30 pages, 15 figures; to be published in Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, Volume 8, Issue 1 (March 2024).

    AUGMENTED TOUCH INTERACTIONS WITH FINGER CONTACT SHAPE AND ORIENTATION

    Get PDF
    Touchscreen interactions are far less expressive than the range of touch that human hands are capable of - even considering technologies such as multi-touch and force-sensitive surfaces. Recently, some touchscreens have added the capability to sense the actual contact area of a finger on the touch surface, which provides additional degrees of freedom - the size and shape of the touch, and the finger's orientation. These additional sensory capabilities hold promise for increasing the expressiveness of touch interactions - but little is known about whether users can successfully use the new degrees of freedom. To provide this baseline information, we carried out a study with a finger-contact-sensing touchscreen, and asked participants to produce a range of touches and gestures with different shapes and orientations, with both one and two fingers. We found that people are able to reliably produce two touch shapes and three orientations across a wide range of touches and gestures - a result that was confirmed in another study that used the augmented touches for a screen lock application.
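The abstract does not say how finger orientation is derived from the contact area; a common approach with contact-area data (an assumption here, not the paper's stated method) is to take the principal axis of the touched pixels via second-order image moments. The helper name `touch_orientation` is hypothetical:

```python
import math

def touch_orientation(pixels):
    """Estimate the principal-axis angle (radians) of a touch contact
    area, given the (x, y) coordinates of the pixels the sensor reports
    as touched. Uses second-order central image moments."""
    n = len(pixels)
    cx = sum(x for x, _ in pixels) / n
    cy = sum(y for _, y in pixels) / n
    mxx = sum((x - cx) ** 2 for x, _ in pixels) / n
    myy = sum((y - cy) ** 2 for _, y in pixels) / n
    mxy = sum((x - cx) * (y - cy) for x, y in pixels) / n
    # Angle of the covariance ellipse's major axis.
    return 0.5 * math.atan2(2 * mxy, mxx - myy)
```

For an elongated contact blob, the returned angle tracks the finger's long axis, which is what gives the extra degree of freedom described above.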

    Memorability of pre-designed and user-defined gesture sets

    No full text
    We studied the memorability of free-form gesture sets for invoking actions. We compared three types of gesture sets: user-defined gesture sets, gesture sets designed by the authors, and random gesture sets in three studies with 33 participants in total. We found that user-defined gestures are easier to remember, both immediately after creation and on the next day (up to a 24% difference in recall rate compared to pre-designed gestures). We also discovered that the differences between gesture sets are mostly due to association errors (rather than gesture form errors), that participants prefer user-defined sets, and that they think user-defined gestures take less time to learn. Finally, we contribute a qualitative analysis of the tradeoffs involved in gesture type selection and share our data and a video corpus of 66 gestures for replicability and further analysis.

    A novel user-based gesture vocabulary for conceptual design

    Get PDF
    Research into hand gestures for human-computer interaction has been prolific recently, but within it, research on hand gestures for conceptual design has either focused on gestures defined by the researchers rather than the users, or on gestures heavily influenced by what can be achieved with currently available technology. This paper reports on a study performed to identify a user-elicited vocabulary of gestures for conceptual design, disassociated from currently available technology, and on its subsequent evaluation. The study included 44 product design engineering students (3rd year, 4th year, and recent graduates) and identified 1,772 gestures, which were analysed to build a novel consensus vocabulary of hand gestures for conceptual design. This set was then evaluated by 10 other professionals, in order to generalise it to a wider range of users and possibly reduce the need for training. The evaluation showed that the majority of gestures added to the vocabulary were easy to perform and appropriate for the activities, but that at the implementation stage the vocabulary will require another round of evaluation to account for technology capabilities. The aim of this work is to create a starting point for a potential future system that could adapt to individual designers and allow them to use non-prescribed gestures that support rather than inhibit their conceptual design thinking processes, akin to the developments that happened in handwriting recognition or predictive texting.
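The abstract does not say how the 1,772 elicited gestures were condensed into a consensus set; a metric widely used for this in gesture-elicitation work (named here as an assumption, not as this paper's method) is Wobbrock et al.'s agreement score, sketched below:

```python
from collections import Counter

def agreement_score(proposals):
    """Agreement score for one referent (command). `proposals` is a
    list of gesture labels, one per participant; identical labels form
    a group. Score = sum over groups of (|group| / |proposals|)^2,
    ranging from 1/n (no agreement) to 1.0 (full agreement)."""
    n = len(proposals)
    return sum((count / n) ** 2 for count in Counter(proposals).values())
```

The consensus set then typically assigns each command the gesture proposed by the largest group, prioritising commands with high agreement.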

    Development of a mobile application with a gestural interface for sending alarms in risk situations

    Get PDF
    Unfortunately, street crime has reached high levels in our day, with robbery among the most common incidents; the most frequently stolen items are wallets, watches, laptop computers, and cell phones. An assailant is typically fast and intimidates the victim into handing over their belongings, avoiding the attention of passers-by and preventing the victim from calling for help. The cell phone is a portable communication device that remains in operation even when its screen is locked, so it is usually the first object stolen. On the other hand, recent generations of cell phones are designed with a series of sensors that monitor the environment and make it possible to trigger events based on the captured information. These sensors can therefore be used to send notifications through a non-conventional interface, which could be useful in a robbery situation. This thesis proposes an application that, through a gestural interface, sends a notification together with the location recorded by the cell phone, informing a contact that the phone's owner is in a risk situation. The work presents a mobile application for the Android platform, versions 5.0 through 9.0, which, unlike other platforms, does not restrict the use of the phone's sensors. The application is written in the Java 1.8 and Extensible Markup Language (XML) programming languages, with SQL Server 2012 as its database manager. Notably, the application is designed for smartphones, which generally include an accelerometer as a motion sensor. The mobile application is designed around three main functions: installation, configuration, and notification. 
Installation uses Google Drive as the download server, since it is publicly accessible and lets each user download and install the application from their own phone. Configuration consists of registering the user's personal data, such as name and email address; in addition, the user defines the data of their emergency contact, including name and cell phone number, as well as the notification message to be sent. Finally, notification consists of sending an alarm message to the registered emergency contact. The most important technical feature of the proposed mobile application is that the alarm message is sent through a gestural interface activated by a natural hand movement. This movement consists of a 180° rotation, keeping the phone's screen facing the ground at the moment the phone is handed over to the assailant. The movement is read and encoded by the device's accelerometer, and an SMS message with the configured text is sent to the emergency contact, along with a Google Maps hyperlink set to the user's geographic coordinates. These coordinates are the latitude and longitude detected automatically by the phone's GPS. The proposed mobile application was evaluated on public streets in three neighborhoods of the State of Mexico: Santa Cruz Atzcapotzaltongo and Nueva Oxtotitlán in the municipality of Toluca, and El Calvario in the municipality of Zinacantepec, all of which have medium-high crime rates. The defined tasks were evaluated with two performance usability metrics and one preference metric: total operations performed correctly, time taken to complete the assigned task, and ease of use of the application. 
Among the evaluation results, the following stand out. The application installation task took a mean time of 37.2 seconds. The application configuration task took a mean time of 48.43 seconds. For the notification-sending task, 100% of operations were performed correctly, with a mean time of 12 seconds. Finally, the alarm registration task took a mean time of 12 seconds.
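The thesis reads the 180° hand-over gesture from the Android accelerometer (via the platform's sensor APIs). The core trigger logic can be sketched platform-neutrally: with the screen up, the z-axis reads roughly +g; face-down, roughly -g. This minimal sketch (the function name and threshold are assumptions, not taken from the thesis) fires when the phone goes from clearly face-up to clearly face-down:

```python
G = 9.81  # standard gravity, m/s^2

def detect_flip(z_samples, threshold=0.8 * G):
    """Detect the 180-degree hand-over gesture from a stream of
    accelerometer z-axis readings: screen-up gives z near +g,
    screen-down gives z near -g. Returns True once the phone has
    gone from clearly face-up to clearly face-down."""
    was_face_up = False
    for z in z_samples:
        if z > threshold:
            was_face_up = True          # phone seen clearly face-up
        elif was_face_up and z < -threshold:
            return True                 # now clearly face-down: flip
    return False
```

In the real application this trigger would then hand off to the SMS and GPS components; requiring a prior face-up reading keeps a phone that is simply lying face-down from firing the alarm.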