37 research outputs found

    Comparison of E-Textile Techniques and Materials for 3D Gesture Sensor with Boosted Electrode Design

    Full text link
    [EN] There is growing interest in new wearable solutions that can be worn directly on the curved human body or integrated into everyday objects. Textiles offer properties that make them suitable holders for electronic or sensor components, and in recent years many sensing technologies combining textile substrates with conductive materials have been explored. In this work, a novel touchless gesture-recognition sensor is implemented with satisfactory results. Moreover, three manufacturing techniques have been considered as alternatives: screen printing with conductive ink, embroidery with conductive thread, and thermosealing with conductive fabric. The main critical parameters have been analysed for each prototype, including the sensitivity of the sensor, an important parameter specific to this type of sensor. In addition, user validation has been performed, testing several gestures with different subjects. During the tests carried out, flick gestures obtained detection rates from 79% to 89% on average. Finally, in order to evaluate the stability and robustness of the solutions, tests have been performed to assess environmental variations and washability deterioration. The obtained results are satisfactory regarding temperature and humidity variations. The washability tests revealed that, except for the screen-printing prototype, the sensors can be washed with minimal degradation.
    This work was supported by the Spanish Government/FEDER funds (RTI2018-100910-B-C43) (MINECO/FEDER). The work presented is also funded by the Conselleria d'Economia Sostenible, Sectors Productius i Treball, through IVACE (Instituto Valenciano de Competitividad Empresarial) and co-funded by ERDF funding from the EU. Application No.: IMAMCI/2020/1.
    Ferri Pascual, J.; Llinares Llopis, R.; Martinez, G.; Lidon-Roger, JV.; Garcia-Breijo, E. (2020). Comparison of E-Textile Techniques and Materials for 3D Gesture Sensor with Boosted Electrode Design. Sensors. 20(8):1-19. https://doi.org/10.3390/s20082369
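As an illustration of how touchless flick gestures like those evaluated above might be classified, here is a minimal sketch (not the authors' algorithm) that infers the flick direction from the times at which four hypothetical edge electrodes see their strongest field disturbance:

```python
def detect_flick(peak_times):
    """Classify a flick gesture from the times (in seconds) at which
    each of four edge electrodes (left, right, top, bottom) saw its
    strongest disturbance. A hand sweeping left-to-right disturbs the
    left electrode before the right one, and analogously for the
    vertical axis; the larger time difference decides the axis."""
    left, right, top, bottom = peak_times
    dx = right - left   # positive: left electrode peaked first
    dy = bottom - top   # positive: top electrode peaked first
    if abs(dx) >= abs(dy):
        return "flick right" if dx > 0 else "flick left"
    return "flick down" if dy > 0 else "flick up"
```

In practice the peak times would come from the electrode signal processing; here they are supplied directly for illustration.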

    Interactive gesture controller for a motorised wheelchair

    Get PDF
    This paper explores in detail the design and testing of a gesture controller for a motorised wheelchair. For some people, a motorised wheelchair is part of everyday life, and those who depend on one do so for a vast range of reasons; it is therefore reasonable to assume that modifying and improving upon the standard joystick controller can benefit a person's way of life significantly. The design of the gesture controller is based heavily around the user's needs, so as to benefit them and complement their strengths to give them more control. For individuals with limited movement and dexterity, the user interface, system responsiveness, ergonomics and safety were all considered when engineering a system intended for people to use. A device capable of recognising a hand gesture was carefully chosen. The technology readily available for this application is relatively new and not extensively documented. The Leap Motion sensor was chosen as the hand-gesture recognition device to act as the controller for a wheelchair. The device ships with hand-recognition software, but that software lacks the predictability and accuracy required for a motorised wheelchair controller. Through testing, the controller accuracy improved. Although this controller is adequate for a laboratory environment, further testing and development will be required for this alternative wheelchair controller to evolve into a commercial product. The gesture-triggered controller was designed around the capabilities of the developer's hand, but the method outlined in this paper is transferable to any individual's hand size and, more importantly, the limitations of their hand gestures. The outcome of this thesis is a customised, non-invasive hand-gesture controller for a motorised wheelchair that can be fully tailored to a person's capability without losing its responsiveness or accuracy.
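A controller of this kind ultimately has to map a recognised hand pose to drive commands. The sketch below is a hypothetical illustration, not the thesis design: it maps a normalised palm position to forward-speed and turn-rate commands, with a dead zone so that small tremors do not move the chair (all parameter names and values are assumptions):

```python
def pose_to_drive_command(palm_x, palm_z, dead_zone=0.15, max_speed=1.0):
    """Map a normalised palm position (-1..1 on each axis) to a
    (forward_speed, turn_rate) pair. Displacements smaller than the
    dead zone are ignored so hand tremor does not move the chair."""
    def scaled(v):
        if abs(v) < dead_zone:
            return 0.0
        # Rescale the remaining range so full deflection gives max_speed.
        magnitude = max_speed * (abs(v) - dead_zone) / (1 - dead_zone)
        return magnitude if v > 0 else -magnitude
    # Forward/backward from the z axis, turning from the x axis.
    return scaled(palm_z), scaled(palm_x)
```

A safety layer would normally sit between this mapping and the motors (rate limiting, watchdog on lost tracking), which is omitted here.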

    Tactile and Touchless Sensors Printed on Flexible Textile Substrates for Gesture Recognition

    Full text link
    [Thesis by compendium of publications] [EN] The main objective of this thesis is the development of new sensors and actuators using Printed Electronics technology. For this, conductive, semiconducting and dielectric polymeric materials are used on flexible and/or elastic substrates. By means of suitable designs and application processes, it is possible to manufacture sensors capable of interacting with the environment. In this way, specific sensing functionalities can be incorporated into substrates such as textile fabrics. Additionally, it is necessary to include electronic systems capable of processing and recording the data obtained. In the development of these sensors and actuators, the physical properties of the different materials are precisely combined. For this, multilayer structures are designed in which the properties of some materials interact with those of others. The result is a sensor capable of capturing physical variations of the environment, converting them into signals that can be processed and finally transformed into data. On the one hand, a tactile sensor printed on a textile substrate for 2D gesture recognition was developed. This sensor consists of a matrix of small capacitive sensors based on a capacitor-type structure. These sensors were designed in such a way that, if a finger or other object with capacitive properties comes close enough, the sensor's behaviour varies measurably. The small sensors are arranged in the matrix as in a grid, each with a position determined by a row and a column. The capacitance of each small sensor is periodically measured in order to assess whether significant variations have occurred. For this, it is necessary to convert the sensor capacitance into a value that is subsequently processed digitally. On the other hand, to improve the effectiveness of the developed 2D touch sensors, the way to incorporate an actuator system was studied.
Thereby, the user receives feedback that the order or action was recognised. To achieve this, the capacitive sensor grid was complemented with an electroluminescent display, also screen-printed. The final prototype offers a solution that combines a 2D tactile sensor with an electroluminescent actuator on a printed textile substrate. Next, a 3D gesture sensor was developed using a combination of sensors, also printed on a textile substrate. In this type of 3D sensor, a signal is sent that generates an electric field over the sensors, by means of a transmission electrode located very close to them. The generated field is picked up by reception electrodes and converted into electrical signals. If a person places their hand within the emission area, the electric field lines are disturbed, because the intrinsic conductivity of the human body deviates field lines to ground. This disturbance affects the signals received by the electrodes. The variations captured by all the electrodes are processed together to determine the position and movement of the hand over the sensor surface. Finally, an improved 3D gesture sensor was developed. As in the previous development, the sensor allows contactless gesture detection, but with an increased detection range. In addition to printed-electronics technology, two other textile manufacturing technologies were evaluated.
    Ferri Pascual, J. (2020). Tactile and Touchless Sensors Printed on Flexible Textile Substrates for Gesture Recognition [Tesis doctoral no publicada]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/153075
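The periodic capacitance scan of the 2D sensor matrix described above can be illustrated with a minimal sketch. The grid layout, baseline values, and threshold are illustrative assumptions, not values from the thesis:

```python
def locate_touch(readings, baseline, threshold=5.0):
    """Scan a row x column grid of capacitance readings (e.g. in pF)
    and return the (row, col) of the strongest deviation from the idle
    baseline, or None if no cell exceeds the detection threshold."""
    best, best_delta = None, threshold
    for r, row in enumerate(readings):
        for c, value in enumerate(row):
            delta = value - baseline[r][c]  # approach raises capacitance
            if delta > best_delta:
                best, best_delta = (r, c), delta
    return best
```

A real firmware loop would re-run this scan periodically and also track the baseline slowly to compensate for temperature drift.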

    A Wearable Textile 3D Gesture Recognition Sensor Based on Screen-Printing Technology

    Full text link
    [EN] Research has developed various solutions for computers to recognise hand gestures in the context of human-machine interfaces (HMI). The design of a successful hand-gesture recognition system must address both functionality and usability. The gesture-recognition market has evolved from touchpads to touchless sensors, which do not need direct contact. Their applications in textiles range from medical environments to smart-home applications and the automotive industry. In this paper, a textile capacitive touchless sensor has been developed using screen-printing technology. Two different designs were developed to determine the best configuration, with good results in both cases. Finally, as a real application, a complete solution of the sensor with wireless communications is presented, used as an interface for a mobile phone.
    The work presented is funded by the Conselleria d'Economia Sostenible, Sectors Productius i Treball, through IVACE (Instituto Valenciano de Competitividad Empresarial) and co-funded by ERDF funding from the EU. Application No.: IMAMCI/2019/1. This work was also supported by the Spanish Government/FEDER funds (RTI2018-100910-B-C43) (MINECO/FEDER).
    Ferri Pascual, J.; Llinares Llopis, R.; Moreno Canton, J.; Ibáñez Civera, FJ.; Garcia-Breijo, E. (2019). A Wearable Textile 3D Gesture Recognition Sensor Based on Screen-Printing Technology. Sensors. 19(23):1-32. https://doi.org/10.3390/s19235068
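One simple way to turn the per-electrode signal drops of such a touchless capacitive sensor into a lateral position estimate is an attenuation-weighted centroid. This is an illustrative sketch under an assumed idle-level estimate, not the paper's algorithm:

```python
def estimate_hand_position(received, electrode_x):
    """Estimate the lateral hand position over a touchless sensor from
    the signal level on each receiver electrode. A hand shunts field
    lines to ground, so electrodes nearest the hand lose the most
    signal; the estimate is the attenuation-weighted centroid of the
    electrode positions."""
    idle = max(received)  # crude proxy for the undisturbed level (assumption)
    drops = [idle - v for v in received]
    total = sum(drops)
    if total == 0:
        return None  # no disturbance detected
    return sum(x * d for x, d in zip(electrode_x, drops)) / total
```

A production system would calibrate the idle level per electrode instead of using the maximum of the current frame.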

    Principal components analysis based control of a multi-dof underactuated prosthetic hand

    Get PDF
    Abstract. Background: Functionality, controllability and cosmetics are the key issues to be addressed in order to accomplish a successful functional substitution of the human hand by means of a prosthesis. Not only should the prosthesis duplicate the human hand in shape, functionality, sensorization, perception and sense of body-belonging, but it should also be controlled like the natural one, in the most intuitive and undemanding way. At present, prosthetic hands are controlled by means of non-invasive interfaces based on electromyography (EMG). Driving a multi-degree-of-freedom (DoF) hand to achieve dexterity implies selectively modulating many different EMG signals so that each joint moves independently, which can demand significant cognitive effort from the user. Methods: A Principal Components Analysis (PCA) based algorithm is used to drive a 16-DoF underactuated prosthetic hand prototype (called CyberHand) with a two-dimensional control input, in order to perform the three prehensile forms most used in Activities of Daily Living (ADLs). The Principal Components set was derived directly from the artificial hand by collecting its sensory data while performing 50 different grasps, and was subsequently used for control. Results: Trials have shown that two independent input signals can successfully control the posture of a real robotic hand and that correct grasps (in terms of involved fingers, stability and posture) can be achieved. Conclusions: This work demonstrates the effectiveness of a bio-inspired system that conjugates the advantages of an underactuated, anthropomorphic hand with a PCA-based control strategy, and opens up promising possibilities for the development of an intuitively controllable hand prosthesis.
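The PCA-based mapping from a two-dimensional control input to a full joint posture can be sketched as follows. This illustrates the general technique, not the CyberHand implementation; the function names and the synthetic grasp data are assumptions:

```python
import numpy as np

def fit_posture_pca(postures, n_components=2):
    """Fit a PCA basis to recorded hand postures (n_samples x n_joints)
    so that a low-dimensional control input can drive all joints.
    Returns the mean posture and the top principal components."""
    mean = postures.mean(axis=0)
    # SVD of the centred data; rows of vt are the principal directions.
    _, _, vt = np.linalg.svd(postures - mean, full_matrices=False)
    return mean, vt[:n_components]      # shapes: (n_joints,), (2, n_joints)

def control_to_posture(control, mean, components):
    """Map a low-dimensional control input (e.g. 2 values) back to a
    full joint posture by moving along the principal directions."""
    return mean + np.asarray(control) @ components
```

With two components, the user's 2D input sweeps the plane that captures most of the grasp variability, which is what makes single-channel-per-joint EMG control unnecessary.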

    A Review of Evaluation Practices of Gesture Generation in Embodied Conversational Agents

    Full text link
    Embodied Conversational Agents (ECAs) take on different forms, including virtual avatars and physical agents such as humanoid robots. ECAs are often designed to produce nonverbal behaviour to complement or enhance their verbal communication. One form of nonverbal behaviour is co-speech gesturing: movements that the agent makes with its arms and hands, paired with verbal communication. Co-speech gestures for ECAs can be created using different generation methods, such as rule-based and data-driven processes. However, reports on gesture-generation methods use a variety of evaluation measures, which hinders comparison. To address this, we conducted a systematic review of co-speech gesture-generation methods for iconic, metaphoric, deictic or beat gestures, including their evaluation methods. We reviewed 22 studies that had an ECA with a human-like upper body that used co-speech gesturing in a social human-agent interaction, including a user study to evaluate its performance. We found that most studies used a within-subject design and relied on a form of subjective evaluation, but lacked a systematic approach. Overall, methodological quality was low-to-moderate and few systematic conclusions could be drawn. We argue that the field requires rigorous and uniform tools for the evaluation of co-speech gesture systems. We propose recommendations for future empirical evaluation, including standardised phrases and test scenarios to test generative models, as well as a research checklist that can be used both to report information relevant to evaluating generative models and to evaluate co-speech gesture use.

    A large, crowdsourced evaluation of gesture generation systems on common data : the GENEA Challenge 2020

    Get PDF
    Co-speech gestures, gestures that accompany speech, play an important role in human communication. Automatic co-speech gesture generation is thus a key enabling technology for embodied conversational agents (ECAs), since humans expect ECAs to be capable of multi-modal communication. Research into gesture generation is rapidly gravitating towards data-driven methods. Unfortunately, individual research efforts in the field are difficult to compare: there are no established benchmarks, and each study tends to use its own dataset, motion visualisation, and evaluation methodology. To address this situation, we launched the GENEA Challenge, a gesture-generation challenge wherein participating teams built automatic gesture-generation systems on a common dataset, and the resulting systems were evaluated in parallel in a large, crowdsourced user study using the same motion-rendering pipeline. Since differences in evaluation outcomes between systems are now solely attributable to differences between the motion-generation methods, this enables benchmarking recent approaches against one another in order to get a better impression of the state of the art in the field. This paper reports on the purpose, design, results, and implications of our challenge. Part of proceedings: ISBN 978-1-4503-8017-1.

    Semi-automation of gesture annotation by machine learning and human collaboration

    Get PDF
    Gesture and multimodal communication researchers typically annotate video data manually, even though this can be a very time-consuming task. In the present work, a method to detect gestures is proposed as a fundamental step towards a semi-automatic gesture annotation tool. The proposed method can be applied to RGB videos and requires annotations of part of a video as input. The technique deploys a pose-estimation method and active learning. The experiments show that if about 27% of the video is annotated, the remaining parts can be annotated automatically with an F-score of at least 0.85. Users can run the tool with a small number of annotations first; if the predicted annotations for the remainder of the video are not satisfactory, they can add further annotations and run the tool again. The code has been released so that other researchers and practitioners can use the results of this research. The tool has been confirmed to work in conjunction with ELAN. Ienaga, Naoto; Cravotta, Alice; Terayama, Kei; Scotney, Bryan W.; Saito, Hideo; Busà, M. Grazia
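The F-score reported above can be computed at the frame level as in this minimal sketch; representing the annotation as per-frame 0/1 gesture labels is an assumption of the example, not necessarily how the paper scores its intervals:

```python
def f_score(predicted, actual):
    """Frame-level F-score of predicted gesture labels against a manual
    annotation, both given as equal-length sequences of 0/1 labels
    (1 = gesture frame)."""
    tp = sum(1 for p, a in zip(predicted, actual) if p and a)
    fp = sum(1 for p, a in zip(predicted, actual) if p and not a)
    fn = sum(1 for p, a in zip(predicted, actual) if not p and a)
    if tp == 0:
        return 0.0
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    # Harmonic mean of precision and recall.
    return 2 * precision * recall / (precision + recall)
```

In an active-learning loop, this score on a held-out annotated segment would decide whether the user needs to supply more annotations before re-running the tool.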

    A Weft Knit Data Glove

    Get PDF
    Rehabilitation of stroke survivors can be expedited by employing an exoskeleton. The exercises are designed such that both hands move in synergy; in this regard, motion-capture data from the healthy hand is often used to derive the control behaviour for the exoskeleton. Data gloves can therefore provide a low-cost solution for motion capture of the joints of the hand. However, current data gloves are bulky, inaccurate or inconsistent. These disadvantages are inherent in the conventional glove design, which involves an external attachment that degrades over time and causes inaccuracies. This paper presents a weft-knit data glove whose sensors and support structure are manufactured in the same fabrication process, removing the need for an external attachment. The glove is made by knitting a multifilament conductive yarn and an elastomeric yarn using WholeGarment technology. Furthermore, we present a detailed electromechanical model of the sensors alongside its experimental validation, and the reliability of the glove is verified experimentally. Lastly, machine learning algorithms are implemented for classifying hand posture on the basis of sensor-data histograms.
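Classifying hand posture from sensor-data histograms, as the final step above describes, might look like the following sketch. This uses nearest-centroid classification with illustrative bin settings; the paper's actual features and algorithms may differ:

```python
def histogram_features(samples, bins=8, lo=0.0, hi=1.0):
    """Summarise one sensor's normalised readings (in [lo, hi)) as a
    normalised histogram, a compact feature for posture classification."""
    counts = [0] * bins
    width = (hi - lo) / bins
    for s in samples:
        idx = min(int((s - lo) / width), bins - 1)  # clamp top edge
        counts[idx] += 1
    total = len(samples)
    return [c / total for c in counts]

def classify(features, centroids):
    """Nearest-centroid posture label; centroids maps label -> feature list."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda lbl: dist(features, centroids[lbl]))
```

The centroids would be learnt by averaging histogram features over labelled recordings of each posture.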