305 research outputs found

    Multisensory Approaches From Interactive Art to Inclusive Design

    In interactive art and multimedia installations, the public plays a fundamental part. Visitors change the meaning and the appearance of an artwork according to their sensitivity and preferred way of interacting. For designers, this audience is the set of users on whom they should focus their projects. Among the most pervasive technologies are a variety of solutions for interacting with the environment, activated by gesture and movement sensors, voice interfaces, and a range of means of enabling people with different abilities. Many of these technologies originated as components of assistive devices, or are often used to give people with different kinds of impairments access to an artifact. There are many examples of how solutions designed for specific niches have over time been integrated into common use in private and public areas, and in recreational and cultural spaces. Through an analysis of the process that has given rise to this, it is possible to understand when and how designers should intervene in the creation of their projects to ensure the accessibility and usability of the resulting artifacts. In the empathizing and ideating design phases, it seems necessary to consider the various multisensory modes of interaction to guarantee the usability and scalability of the project. In this way, the outcome may become truly inclusive and accessible, and also a benchmark for human-centered design: starting from specific needs and incorporating them into everyday use so as to integrate small groups and minorities, rather than creating projects and devices that separate and divide them.

    Analyzing Interaction for Automated Adaptation – First Steps in the IAAA Project

    Because of an aging society and the relevance of computer-based systems in many areas of our lives, personalization of software systems is becoming more important by the day in order to prevent usage errors and create a good user experience. However, personalization is typically a time-consuming and costly process if it is done through manual configuration. Automated adaptation to specific users' needs is, therefore, a useful way to reduce the effort required. The IAAA project focuses on the analysis of user interaction capabilities and the implementation of automated adaptations based on them. However, the success of these endeavors relies strongly on a careful selection of interaction modalities as well as profound knowledge of the target group's general interaction behavior. Therefore, as a first step in the project, an extensive task-based user observation with thorough involvement of the actual target group was conducted in order to determine the input devices and modalities that would, in a second step, become the subject of the first prototype implementations. This paper discusses the general objectives of the IAAA project, describes the methodology and aims behind the user observation, and presents its results.

    Systematic literature review of hand gestures used in human computer interaction interfaces

    Gestures, widely accepted as humans' natural mode of interaction with their surroundings, have been considered for use in human-computer interfaces since the early 1980s. They have been explored and implemented, with a range of success and maturity levels, in a variety of fields, facilitated by a multitude of technologies. Underpinning gesture theory, however, focuses on gestures performed simultaneously with speech, and the majority of gesture-based interfaces are supported by other modes of interaction. This article reports the results of a systematic review undertaken to identify characteristics of touchless/in-air hand gestures used in interaction interfaces. 148 articles reporting on gesture-based interaction interfaces were reviewed, identified through searching engineering and science databases (Engineering Village, ProQuest, Science Direct, Scopus and Web of Science). The goal of the review was to map the field of gesture-based interfaces, investigate the patterns in gesture use, and identify common combinations of gestures for different combinations of applications and technologies. From the review, the community appears disparate, with little evidence of building upon prior work, and a fundamental framework of gesture-based interaction is not evident. However, the findings can help inform future developments and provide valuable information about the benefits and drawbacks of different approaches. It was further found that the nature and appropriateness of the gestures used was not a primary factor in gesture elicitation when designing gesture-based systems, and that ease of technology implementation often took precedence.

    Investigating Real-time Touchless Hand Interaction and Machine Learning Agents in Immersive Learning Environments

    The recent surge in the adoption of new technologies and innovations in connectivity, interaction technology, and artificial realities can fundamentally change the digital world. eXtended Reality (XR), with its potential to bridge the virtual and real environments, creates new possibilities to develop more engaging and productive learning experiences. Evidence is emerging that this sophisticated technology offers new ways to improve the learning process for better student interaction and engagement. Recently, immersive technology has garnered much attention as an interactive technology that facilitates direct interaction with virtual objects in the real world. Furthermore, these virtual objects can be surrogates for real-world teaching resources, allowing for virtual labs. Thus XR could enable learning experiences that would not be possible in impoverished educational systems worldwide. Interestingly, concepts such as virtual hand interaction and techniques such as machine learning are still not widely investigated in immersive learning. Hand interaction technologies in virtual environments can support the kinesthetic learning pedagogical approach, and the need for their touchless mode of interaction has increased exceptionally in the post-COVID world. By implementing and evaluating real-time hand interaction technology for kinesthetic learning and machine learning agents for self-guided learning, this research has addressed these underutilized technologies to demonstrate the efficiency of immersive learning. This thesis has explored different hand-tracking APIs and devices to integrate real-time hand interaction techniques. These hand interaction techniques and integrated machine learning agents using reinforcement learning are evaluated with different display devices to test compatibility. The proposed approach aims to provide self-guided, more productive, and interactive learning experiences. Further, this research has investigated ethics, privacy, and security issues in XR and covered the future of immersive learning in the Metaverse.
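The abstract does not name a specific hand-tracking API, but most such APIs report per-frame joint landmarks in normalised coordinates, from which simple touchless gestures can be recognised. As a minimal sketch of the idea, a pinch detector can compare the distance between the thumb tip and index fingertip against a threshold; the landmark names and the threshold value below are illustrative assumptions, not details from the thesis:

```python
import math

def is_pinch(landmarks, threshold=0.04):
    """Detect a pinch from hand landmarks (normalised x, y, z).

    landmarks: dict mapping joint names to (x, y, z) tuples, as a
    hand tracker might report each frame. A pinch is flagged when
    the thumb tip and index fingertip come closer than `threshold`
    (in the tracker's normalised units).
    """
    tx, ty, tz = landmarks["thumb_tip"]
    ix, iy, iz = landmarks["index_tip"]
    dist = math.sqrt((tx - ix) ** 2 + (ty - iy) ** 2 + (tz - iz) ** 2)
    return dist < threshold

# Fingertips almost touching: distance ≈ 0.014, below the threshold.
frame = {"thumb_tip": (0.50, 0.50, 0.0), "index_tip": (0.51, 0.51, 0.0)}
print(is_pinch(frame))  # → True
```

In practice the per-frame decision would be smoothed over several frames to avoid flicker, and the threshold tuned per device.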

    Proceedings of the 4th Workshop on Interacting with Smart Objects 2015

    These are the Proceedings of the 4th IUI Workshop on Interacting with Smart Objects. Objects that we use in our everyday lives are expanding beyond their restricted interaction capabilities and provide functionalities that go far beyond their original purpose. They feature computing capabilities and are thus able to capture, process, and store information and interact with their environments, turning them into smart objects.

    User-based gesture vocabulary for form creation during a product design process

    There are inconsistencies between the nature of conceptual design and the functionalities of the computational systems supporting it, which disrupt the designers' process by focusing on technology rather than designers' needs. A need was identified for the elicitation of hand gestures appropriate to the requirements of conceptual design, rather than gestures chosen arbitrarily or for ease of implementation. The aim of this thesis is to identify natural and intuitive hand gestures for conceptual design, performed by designers (3rd and 4th year product design engineering students and recent graduates) working on their own, without instruction and without limitations imposed by the facilitating technology. This was done via a user-centred study including 44 participants, from whom 1785 gestures were collected. Gestures were explored as the sole means for shape creation and manipulation in virtual 3D space. Gestures were identified, described in writing, sketched, coded based on the taxonomy used, and categorised based on hand form and the path travelled, and variants were identified. They were then statistically analysed to ascertain agreement rates between the participants, the significance of the agreement, and the likelihood of the number of repetitions in each category occurring by chance. The most frequently used and statistically significant gestures formed the consensus vocabulary for conceptual design. The effect of the shape of the manipulated object on the gesture performed was also observed, as was whether the sequence of gestures participants proposed differed from established CAD solid modelling practices. The vocabulary was evaluated by non-designer participants, both theoretically and in a VR environment, and the outcomes showed that the majority of gestures were appropriate and easy to perform. Participants selected their preferred gestures for each activity, and a variant of the vocabulary for conceptual design was created as an outcome. It aims to ensure that extensive training is not required, extending the ability to design beyond trained designers only.
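The abstract does not specify which agreement statistic the thesis uses, but a common choice in gesture elicitation studies is the Vatavu–Wobbrock agreement rate, which measures the fraction of participant pairs that proposed the same gesture for a referent. A minimal sketch (function name and gesture labels are illustrative):

```python
from collections import Counter

def agreement_rate(proposals):
    """Agreement rate AR(r) for one referent (Vatavu & Wobbrock style).

    proposals: list of gesture labels, one per participant.
    Returns a value in [0, 1]: 1 means every participant proposed
    the same gesture, 0 means no two participants agreed.
    """
    n = len(proposals)
    if n < 2:
        return 1.0  # trivially in agreement with itself
    groups = Counter(proposals)
    # number of ordered pairs of participants who agreed
    same_pairs = sum(k * (k - 1) for k in groups.values())
    all_pairs = n * (n - 1)
    return same_pairs / all_pairs

# Example: 6 participants proposing a gesture for "scale up".
# Four agree on "spread", so 4*3 of the 6*5 ordered pairs agree.
print(agreement_rate(["spread", "spread", "spread", "spread", "pull", "push"]))  # → 0.4
```

The per-referent rates can then be compared against what chance agreement would produce, as the thesis describes, to decide which gestures enter the consensus vocabulary.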

    Novel active sweat pores based liveness detection techniques for fingerprint biometrics

    This thesis was submitted for the degree of Doctor of Philosophy and awarded by Brunel University. Liveness detection in automatic fingerprint identification systems (AFIS) is an issue which still prevents their use in many unsupervised security applications. In the last decade, various hardware and software solutions for the detection of liveness from fingerprints have been proposed by academic research groups. However, the proposed methods have not yet been practically implemented with existing AFIS, and a large amount of research is needed before commercial AFIS can adopt them. In this research, novel active-pore-based liveness detection methods were proposed for AFIS. These methods are based on the detection of active pores on fingertip ridges, and on the measurement of ionic activity in the sweat fluid that appears at the openings of active pores. The literature is critically reviewed in terms of liveness detection issues; existing fingerprint technology and the hardware and software solutions proposed for liveness detection are also examined. A comparative study was completed on commercially available and specifically collected fingerprint databases, and it was concluded that the images in these datasets do not contain any visible evidence of liveness. They have been used to test various algorithms developed for liveness detection; however, to implement proper liveness detection in fingerprint systems, a new database with fine details of fingertips is needed. Therefore a new high-resolution Brunel Fingerprint Biometric Database (B-FBDB) was captured and collected for this liveness detection research. The first proposed liveness detection method is a High Pass Correlation Filtering Algorithm (HCFA). This image processing algorithm was developed in Matlab and tested on B-FBDB dataset images. The results of the HCFA algorithm proved the idea behind the research, as they successfully demonstrated the clear possibility of liveness detection by active pore detection from high-resolution images. The second liveness detection method is based on experimental evidence: it detects liveness by measuring ionic activity above a sample of ionic sweat fluid. A Micro Needle Electrode (MNE) based setup was used in this experiment to measure the ionic activities. Charges of 5.9 pC to 6.5 pC were detected at ten MNE positions (50 μm to 360 μm) above the surface of the ionic sweat fluid. These measurements are further proof of liveness from active fingertip pores, and this technique can be used in the future to implement liveness detection solutions. The interaction of the MNE and the ionic fluid was modelled in COMSOL Multiphysics, and the effect of electric field variations on the MNE was recorded at positions 5 μm to 360 μm above the ionic fluid. This study was funded by the University of Sindh, Jamshoro, Pakistan and the Higher Education Commission of Pakistan.
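The abstract does not detail the HCFA itself (the thesis implements it in Matlab). As a rough illustration of the underlying idea only — pores appear as tiny high-frequency spots on ridges, which high-pass filtering emphasises while cancelling the smooth ridge background — a generic 3×3 high-pass filter can be sketched as follows; the kernel and helper names are assumptions, not the thesis's algorithm:

```python
def high_pass(image):
    """Apply a 3x3 high-pass (Laplacian-style) kernel with edge clamping.

    image: 2D list of grey levels. The kernel's weights sum to zero,
    so flat regions give zero response, while isolated pore-like
    spots produce strong peaks.
    """
    kernel = [[-1, -1, -1],
              [-1,  8, -1],
              [-1, -1, -1]]
    h, w = len(image), len(image[0])
    out = [[0.0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            acc = 0.0
            for di in (-1, 0, 1):
                for dj in (-1, 0, 1):
                    # clamp neighbour indices at the image borders
                    y = min(max(i + di, 0), h - 1)
                    x = min(max(j + dj, 0), w - 1)
                    acc += kernel[di + 1][dj + 1] * image[y][x]
            out[i][j] = acc
    return out

# A flat patch gives zero response; an isolated bright pixel
# (a pore-like spot) gives a strong positive peak at its centre.
flat = [[1.0] * 5 for _ in range(5)]
spot = [row[:] for row in flat]
spot[2][2] = 2.0
print(high_pass(flat)[2][2], high_pass(spot)[2][2])  # → 0.0 8.0
```

A real pore detector would follow this filtering step with thresholding and connected-component analysis to count candidate pores, which is where high-resolution capture such as the B-FBDB becomes essential.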

    Workshop #3: Transportation and Sheltering Logistics During the 2020 Hurricane Season: After-Action Report (AAR)

    Participants in the CONVERGE COVID-19 Working Group's Workshop 3 on Logistics breakout sessions identified key issues that included population considerations, training needs, continuity of operations and resources still available, site planning (i.e., feeding, registration, shelter design, resources, family unity), facility requirements, and supplies needed during a hurricane evacuation. Operational safety measures and population considerations were mentioned throughout the workshop. This included the need to identify additional resources, facilities, and staffing to ensure safety remains a priority while accommodating social distancing recommendations and the needs of vulnerable populations and staff. Workshop participants emphasized the need to identify new partnerships for critical services and alternative site locations for sheltering to increase evacuation and sheltering capacity. This entailed reassessing existing transportation and sheltering contracts to ensure they are still operational and have the staffing and resources to support the logistical needs of evacuating vulnerable populations and the public to shelters. In addition to identifying what is needed for logistical planning, there is a need to understand the facility requirements, availability of buildings, and supplies needed for operating non-congregate and congregate shelters. Infrastructure and shelter design were discussed to provide context on how jurisdictions are supporting shelter operations while maintaining infection control measures and social distancing and keeping family units together. Staffing and training needs for both transportation and shelter operations were a concern, and various ideas were proposed as solutions. Questions were raised in the workshop about designing infrastructure in the future that would allow an increased number of facilities for sheltering during a hurricane, or that could serve as a refuge of last resort. What does the registration process look like during this hurricane season? How do you handle the demand for personal protective equipment (PPE) and other supplies needed to reduce the risk of contracting COVID-19? A few participants highlighted the need for logistical procedures and guidelines for pet evacuation and pet-friendly shelters, along with service animals.
