101,087 research outputs found

    GUI system for Elders/Patients in Intensive Care

    Full text link
    Elderly people suffering from specific diseases may need special care, since a stroke can occur even during a normal daily routine. Patients of any age who are unable to walk also need personal care, which today means either staying in a hospital or having someone such as a nurse attend to them. Both options are costly in money and manpower, since a person is needed for 24x7 care. To help with this, we propose a vision-based system that takes input from the patient and forwards the information to a designated person, who may not currently be in the patient's room. This reduces the manpower required, and continuous monitoring is no longer needed. The system uses the MS Kinect for accurate gesture detection and can easily be installed at home or in a hospital. It provides a GUI for simple usage and gives visual and audio feedback to the user. The system works on natural hand interaction, requires no training before use, and does not require the user to wear a glove or color strip.
    Comment: In proceedings of the 4th IEEE International Technology Management Conference, Chicago, IL USA, 12-15 June, 201
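    The abstract only outlines the system's event flow, so here is a minimal illustrative sketch of that loop in Python. detect_gesture() and notify_caregiver() are hypothetical stand-ins (simulated below) for the Kinect-based detector and the alert channel, neither of which the abstract specifies.

    import time
    import random

    # Hypothetical gesture vocabulary; the paper's actual set of
    # gestures is not given in the abstract.
    GESTURES = {"call_nurse": "Patient requests assistance",
                "need_water": "Patient requests water"}

    def detect_gesture():
        """Stand-in for the Kinect hand-gesture detector (simulated here)."""
        return random.choice([None, None, "call_nurse", "need_water"])

    def notify_caregiver(message):
        """Stand-in for the alert sent to a caregiver outside the room."""
        print(f"[ALERT] {message}")

    def monitoring_loop(poll_hz=2, cycles=10):
        # Poll the detector and forward recognized gestures as alerts,
        # so nobody has to watch the patient continuously.
        for _ in range(cycles):
            gesture = detect_gesture()
            if gesture in GESTURES:
                notify_caregiver(GESTURES[gesture])
            time.sleep(1.0 / poll_hz)

    if __name__ == "__main__":
        monitoring_loop()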

    A Wearable Textile 3D Gesture Recognition Sensor Based on Screen-Printing Technology

    Full text link
    Research has developed various solutions for computers to recognize hand gestures in the context of the human-machine interface (HMI). The design of a successful hand gesture recognition system must address functionality and usability. The gesture recognition market has evolved from touchpads to touchless sensors, which do not need direct contact. Their application in textiles ranges from medical environments to smart home applications and the automotive industry. In this paper, a textile capacitive touchless sensor has been developed using screen-printing technology. Two different designs were developed to find the best configuration, with good results in both cases. Finally, as a real application, a complete solution of the sensor with wireless communications is presented, to be used as an interface for a mobile phone.
    The work presented is funded by the Conselleria d'Economia Sostenible, Sectors Productius i Treball, through IVACE (Instituto Valenciano de Competitividad Empresarial) and cofunded by ERDF funding from the EU, Application No. IMAMCI/2019/1. This work was also supported by the Spanish Government/FEDER funds (RTI2018-100910-B-C43) (MINECO/FEDER).
    Ferri Pascual, J.; Llinares Llopis, R.; Moreno Canton, J.; Ibåñez Civera, F. J.; Garcia-Breijo, E. (2019). A Wearable Textile 3D Gesture Recognition Sensor Based on Screen-Printing Technology. Sensors, 19(23), 1-32. https://doi.org/10.3390/s19235068
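    To illustrate how a touchless capacitive sensor of this kind might turn raw readings into a gesture, here is a minimal sketch: a swipe direction is inferred from which electrode's capacitance rises first and which rises last. The electrode layout, threshold, and data format are assumptions made for illustration, not the paper's actual design.

    # A hand passing over the pad disturbs each electrode's capacitance
    # in turn; the order of the disturbances gives the swipe direction.
    def infer_swipe(samples, threshold=5.0):
        """samples: list of per-frame dicts {electrode: capacitance_delta}."""
        first_hit, last_hit = None, None
        for frame in samples:
            for electrode, delta in frame.items():
                if delta > threshold:
                    if first_hit is None:
                        first_hit = electrode
                    last_hit = electrode
        if first_hit and last_hit and first_hit != last_hit:
            return f"swipe {first_hit} -> {last_hit}"
        return "no gesture"

    # A hand moving left-to-right raises the left electrode's reading first.
    frames = [{"left": 8.0, "right": 0.5},
              {"left": 6.0, "right": 4.0},
              {"left": 1.0, "right": 9.0}]
    print(infer_swipe(frames))  # swipe left -> right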

    3-D Hand Pose Estimation from Kinect's Point Cloud Using Appearance Matching

    Full text link
    We present a novel appearance-based approach for pose estimation of a human hand using the point clouds provided by the low-cost Microsoft Kinect sensor. Both the free-hand case, in which the hand is isolated from the surrounding environment, and the hand-object case, in which the different types of interactions are classified, have been considered. The hand-object case is clearly the most challenging task, having to deal with multiple tracks. The approach proposed here belongs to the class of partial pose estimation, where the estimated pose in a frame is used for the initialization of the next one. The pose estimation is obtained by applying a modified version of the Iterative Closest Point (ICP) algorithm to synthetic models to obtain the rigid transformation that aligns each model with the input data. The proposed framework uses a "pure" point cloud as provided by the Kinect sensor, without any other information such as RGB values or normal vector components. For this reason, the proposed method can also be applied to data obtained from other types of depth sensors or RGB-D cameras.
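    For readers unfamiliar with ICP, the sketch below shows the textbook point-to-point variant in Python/NumPy: alternate between matching each model point to its nearest scene point and solving for the best rigid transform via SVD (the Kabsch solution). The paper uses a modified ICP on Kinect point clouds; this minimal version only illustrates the underlying principle, not the authors' modification.

    import numpy as np

    def best_rigid_transform(src, dst):
        """Least-squares rotation R and translation t mapping src onto dst
        (Kabsch/SVD solution for paired Nx3 point sets)."""
        src_c, dst_c = src.mean(0), dst.mean(0)
        H = (src - src_c).T @ (dst - dst_c)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:          # guard against reflections
            Vt[-1] *= -1
            R = Vt.T @ U.T
        return R, dst_c - R @ src_c

    def icp(model, scene, iters=20):
        """Align `model` to `scene` with plain point-to-point ICP.
        Brute-force nearest neighbours; fine for small clouds."""
        src = model.copy()
        for _ in range(iters):
            d = np.linalg.norm(src[:, None, :] - scene[None, :, :], axis=2)
            matched = scene[d.argmin(axis=1)]  # closest scene point per model point
            R, t = best_rigid_transform(src, matched)
            src = src @ R.T + t
        return src

    # Demo: recover a known rotation/translation of a synthetic cloud.
    rng = np.random.default_rng(0)
    scene = rng.standard_normal((100, 3))
    theta = np.radians(15)
    Rz = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1]])
    model = scene @ Rz.T + np.array([0.1, -0.2, 0.05])
    aligned = icp(model, scene)
    print(np.abs(aligned - scene).max())  # residual should be small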

    Interaction With Tilting Gestures In Ubiquitous Environments

    Full text link
    In this paper, we introduce a tilting interface that controls direction-based applications in ubiquitous environments. A tilt interface is useful for situations that require remote and quick interactions or that are executed in public spaces. We explored the proposed tilting interface with different application types and classified the tilting interaction techniques. Augmenting objects with sensors can potentially address the lack of intuitive and natural input devices in ubiquitous environments. We conducted an experiment to test the usability of the proposed tilting interface and to compare it with conventional input devices and hand gestures. The results showed that tilt gestures outperformed hand gestures in terms of speed, accuracy, and user satisfaction.
    Comment: 13 pages, 10 figures
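    As an illustration of the kind of mapping a tilt interface relies on, the sketch below converts raw accelerometer readings into a discrete direction command, with a dead zone so that small tilts are ignored. The axis conventions and thresholds are illustrative assumptions; the abstract does not specify the paper's sensor pipeline.

    import math

    def tilt_direction(ax, ay, az, dead_zone_deg=10.0):
        """Map raw accelerometer readings (a device at rest measures
        gravity) to a discrete direction command."""
        pitch = math.degrees(math.atan2(ax, math.sqrt(ay**2 + az**2)))
        roll = math.degrees(math.atan2(ay, math.sqrt(ax**2 + az**2)))
        if max(abs(pitch), abs(roll)) < dead_zone_deg:
            return "neutral"              # small tilts are ignored
        if abs(pitch) >= abs(roll):
            return "forward" if pitch > 0 else "back"
        return "right" if roll > 0 else "left"

    print(tilt_direction(0.0, 0.0, 9.81))  # neutral (device lying flat)
    print(tilt_direction(4.0, 0.0, 9.0))   # forward (tilted ~24 degrees)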

    Tangible user interfaces : past, present and future directions

    Get PDF
    In the last two decades, Tangible User Interfaces (TUIs) have emerged as a new interface type that interlinks the digital and physical worlds. Drawing upon users' knowledge and skills of interaction with the real non-digital world, TUIs show a potential to enhance the way in which people interact with and leverage digital information. However, TUI research is still in its infancy, and extensive research is required in order to fully understand the implications of tangible user interfaces, to develop technologies that further bridge the digital and the physical, and to guide TUI design with empirical knowledge. This paper examines the existing body of work on Tangible User Interfaces. We start by sketching the history of tangible user interfaces, examining the intellectual origins of this field. We then present TUIs in a broader context, survey application domains, and review frameworks and taxonomies. We also discuss conceptual foundations of TUIs, including perspectives from cognitive science, psychology, and philosophy. Methods and technologies for designing, building, and evaluating TUIs are also addressed. Finally, we discuss the strengths and limitations of TUIs and chart directions for future research
    • 
