    Immersive 360° video for forensic education

    Training in crime scene investigation is a vital part of the overall training process within police academies and forensic programs worldwide. However, instructors can give trainee forensic officers only minimal exposure to real-life scenes, owing to the sensitive nature of the information those scenes contain and the overall difficulty of forensic investigations. Virtual Reality (VR) is a computer technology that uses headsets to produce lifelike images, sounds and other sensations, simulating physical presence in a virtual setting. The user can look around the virtual world and often interact with virtual landscapes or objects. VR headsets are head-mounted goggles with a screen in front of the eyes (Burdea & Coiffet 2003). The use of VR varies widely, from personal gaming to classroom learning, and includes computerised tools used solely online. Within forensic science, VR is already used in several capacities, including the training and examination of new forensic officers. However, there has been minimal review and validation of the effectiveness of VR for teaching forensic investigation. This is surprising, as VR has expanded rapidly into the education of many different fields over the past few years. Although VR could enhance forensic training by offering another, perhaps more versatile and engaging, way of learning, no dedicated VR application has yet been commercially implemented for forensic examination education. Research into VR is a fairly young field, but the technology and its use are still growing rapidly, and the improvement of interactive tools is inevitably having an impact on all facets of learning and teaching.

    A Haptic Study to Inclusively Aid Teaching and Learning in the Discipline of Design

    Designers are known to use a blend of manual and virtual processes to produce design prototype solutions. For modern designers, computer-aided design (CAD) tools are an essential requirement for developing design concept solutions. CAD, together with augmented reality (AR) systems, has altered the face of design practice: a designer can now change a 3D concept's shape, form, color, pattern and texture at the click of a button in minutes, rather than laboring on a physical model in the studio for hours in the classic approach. However, CAD can often limit a designer's experience of being 'hands-on' with materials and processes. The rise of machine haptic (MH) tools has afforded great potential for designers to feel more 'hands-on' with virtual modeling processes. Through the use of MH, product designers can control, virtually sculpt and manipulate virtual 3D objects on-screen. Design practitioners are well placed to make use of haptics to augment 3D concept creation, which is traditionally a highly tactile process. By similar reasoning, non-sighted and visually impaired (NS, VI) communities could also benefit from using MH tools to increase touch-based interaction, thereby creating better access for NS, VI designers. In spite of this, the use of MH within the design industry (specifically product design), or by the non-sighted community, is still in its infancy, and the full benefit of haptics in aiding non-sighted designers has not yet been realised. This thesis empirically investigates the use of multimodal MH as a step towards improving the virtual hands-on process, for the benefit of NS, VI and fully sighted (FS) Designer-Makers. The thesis comprises four experiments, embedded within four case studies (CS1-4). Case studies 1 and 2 worked with self-employed NS, VI art makers at Henshaws College for the Blind and Visually Impaired, and examined the effects of haptics on NS, VI users' evaluations of their experience. Case studies 3 and 4, featuring experiments 3 and 4, were designed to examine the effects of haptics on distance-learning design students at the Open University. The empirical results from all four case studies showed that NS, VI users were able to navigate and perceive virtual objects via the force from the haptically rendered objects on-screen. Moreover, they were aided by the full multimodal MH assistance, which in CS2 appeared to offer greater benefit to NS than to FS participants. In CS3 and CS4, MH and multimodal assistance offered equal support to NS, VI and FS participants, but haptics did not better the completion times recorded in the manual (M) conditions; however, the collision data showed little statistical difference between M and MH. The thesis shows that multimodal MH systems, specifically when used in kinesthetic mode, enable humans (non-disabled and disabled) to credibly judge objects within the virtual realm, and that multimodal augmented tooling can improve interaction and afford better access to the graphical user interface for a wider body of users.
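    A note on how the "force from the haptically rendered objects" is typically produced: most desktop haptic systems use penalty-based (spring) rendering, in which a restoring force proportional to the cursor's penetration into a virtual surface is sent to the device at a high update rate. The sketch below is a generic Python illustration of that standard technique, not the system used in the thesis; the sphere geometry, stiffness value and function names are assumed for the example.

```python
# Generic penalty-based (spring) haptic rendering sketch: when the haptic
# cursor penetrates a virtual object, a force proportional to the penetration
# depth pushes it back out, which is what lets a user "feel" the geometry.
# Illustrative only -- not the thesis's implementation.
import numpy as np

def contact_force(cursor_pos, sphere_centre, sphere_radius, stiffness=500.0):
    """Return the 3D force (N) for a cursor touching a virtual sphere.

    stiffness is in N/m; desktop haptic devices typically render a few
    hundred N/m before becoming unstable.
    """
    offset = np.asarray(cursor_pos, float) - np.asarray(sphere_centre, float)
    distance = np.linalg.norm(offset)
    penetration = sphere_radius - distance
    if penetration <= 0.0 or distance == 0.0:
        return np.zeros(3)                      # no contact: no force
    direction = offset / distance               # outward surface normal
    return stiffness * penetration * direction  # Hooke's law: F = k * depth

# Example: a cursor 1 cm inside a 5 cm sphere feels ~5 N pushing it outward.
print(contact_force([0.04, 0.0, 0.0], [0.0, 0.0, 0.0], 0.05))
```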

    Enhancing the E-Commerce Experience through Haptic Feedback Interaction

    The sense of touch is important in our everyday lives, and its absence makes it difficult to explore and manipulate everyday objects. Existing online shopping practice lacks the opportunity for the physical evaluation that people often use and value when making product choices. However, with recent advances in haptic research and technology, it is possible to simulate various physical properties such as heaviness, softness, deformation and temperature. The research described here investigates the use of haptic feedback interaction to enhance e-commerce product evaluation, particularly haptic weight and texture evaluation. While other properties are equally important, weight and texture are fundamental to the shopping experience for many online products and can be simulated using cost-effective devices. Two initial psychophysical experiments were conducted using free-motion haptic exploration, in order to more closely resemble conventional shopping: one measured weight force discrimination thresholds and the other texture force discrimination thresholds. These measurements provide a better understanding of haptic device limitations for online shopping, in terms of the range of stimuli available to represent physical products. The outcomes of the initial psychophysical experiments were then used to produce various absolute stimuli for a comparative experimental study evaluating the user experience of haptic product evaluation. Although free haptic exploration was used in both psychophysical experiments, the results were largely consistent with previous work on haptic discrimination. The threshold for weight force discrimination, represented as downward forces, was 10 percent. The threshold for texture force discrimination, represented as friction forces, was 14.1 percent when the dynamic coefficient of friction was varied, at any level of static coefficient of friction. The comparative study of user experience with haptic product information indicated that haptic product evaluation does not change user performance significantly: although the time taken to complete the task increased, the number of button-click actions tended to decrease. The results showed that haptic product evaluation can significantly increase confidence in the shopping decision. Nevertheless, the availability of haptic product evaluation does not necessarily lead to different product choices; rather, it complements other selection criteria such as price and appearance. The findings from this work are a first step towards exploring haptic interaction in e-commerce environments, and they not only lay the foundation for designing online haptic shopping but also provide empirical support for research in this direction.
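    To illustrate how the reported discrimination thresholds might be applied when designing stimuli, the sketch below uses the 10 percent weight-force threshold as a Weber-style fraction to space force levels so that adjacent levels remain distinguishable. It is an illustrative Python sketch, not the study's code; the device force range and function names are assumptions for the example.

```python
# Using a discrimination threshold (Weber-style fraction) to space haptic
# weight stimuli so that each step exceeds one just-noticeable difference.
# The 0.10 fraction is the reported weight-force threshold; the force range
# below is an assumed example, not a value from the study.

def distinguishable_levels(f_min, f_max, weber_fraction=0.10):
    """List of downward forces (N) where each step is just distinguishable."""
    levels = [f_min]
    while levels[-1] * (1 + weber_fraction) <= f_max:
        levels.append(levels[-1] * (1 + weber_fraction))
    return levels

def discriminable(f_a, f_b, weber_fraction=0.10):
    """True if two rendered forces differ by more than the threshold."""
    return abs(f_a - f_b) / min(f_a, f_b) > weber_fraction

forces = distinguishable_levels(0.5, 3.0)      # example device range, newtons
print(len(forces), "distinguishable weight levels")
print(discriminable(1.0, 1.05))                # False: 5% is below threshold
print(discriminable(1.0, 1.20))                # True: 20% is above threshold
```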

    Smart Computing and Sensing Technologies for Animal Welfare: A Systematic Review

    Animals play a profoundly important and intricate role in our lives today. Dogs have been human companions for thousands of years, but they now also work closely with us to assist the disabled and in combat and search-and-rescue situations. Farm animals are a critical part of the global food supply chain, and there is increasing consumer interest in organically fed and humanely raised livestock and in how it affects our health and environmental footprint. Wild animals are threatened with extinction by human-induced factors and by shrinking, compromised habitats. This review sets out to systematically survey the existing literature on smart computing and sensing technologies for domestic, farm and wild animal welfare. We use the notion of "animal welfare" in broad terms, to review technologies for assessing whether animals are healthy, free of pain and suffering, and positively stimulated in their environment. The notion of "smart computing and sensing" is likewise used in broad terms, referring to computing and sensing systems that are not isolated but interconnected with communication networks and capable of remote data collection, processing, exchange and analysis. We review smart technologies for domestic animals, for indoor and outdoor animal farming, and for animals in the wild and in zoos. The findings of this review are expected to motivate future research and to contribute to data, information and communication management, as well as policy, for animal welfare.

    Multimodal feedback cues on manual lifting in virtual environments

    Improper manipulation of real-world objects increases the risk of developing work-related back injuries. In an effort to reduce such risk and encourage appropriate lifting and moving methods, a Virtual Environment (VE) was employed; virtual simulations can be used for ergonomic analysis. In this work, the VEs made use of multiple feedback techniques to allow a person to estimate the forces acting on their lower back. A person's head and hand movements were tracked in real time while they manipulated an object, and the NIOSH lifting equation was used to calculate the Lifting Index, with the results conveyed in real time. Visual display feedback techniques were designed, and the effect of these cues on user performance was experimentally evaluated. The feedback cues provide the user with information about the forces acting on their lower back as they perform manual lifting tasks in VEs. Four methods were compared and contrasted: No Feedback, Text, Colour, and Combined Colour and Text. This work also investigated various auditory feedback techniques to support object manipulation in VEs. Auditory feedback has been demonstrated to convey information effectively in computer applications, but little work has been reported on the efficacy of such techniques, particularly for ergonomic design. Four methods were compared and contrasted: No Feedback, White Noise, Pitch and Tempo. A combined Audio-Visual (AV) technique, mixing both senses, was also examined, as was the effect of Tactile Augmentation: three different real weights were used, and the results were compared with those from the experiment using virtual weights, in order to evaluate whether the presence of a real weighted object enhanced people's sense of realism. The goals of this study were to explore feedback techniques for several senses (visual, auditory and tactile), to compare the performance characteristics of each technique, and to understand their relative advantages and drawbacks.
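    The revised NIOSH lifting equation mentioned above has a standard closed form: the Recommended Weight Limit (RWL) is a product of multipliers derived from the lift geometry, and the Lifting Index (LI) is the ratio of the load to that limit. The sketch below shows how the LI could be computed from tracked lift geometry; it is a simplified illustration, not the code used in this work, and the frequency and coupling multipliers (normally table lookups) are assumed to be 1.0.

```python
# Simplified revised NIOSH lifting equation (metric form): the Recommended
# Weight Limit (RWL) is a product of geometry-based multipliers, and the
# Lifting Index (LI) is the load divided by the RWL.
# Frequency and coupling multipliers are fixed at 1.0 here for brevity.

def recommended_weight_limit(h_cm, v_cm, d_cm, a_deg, fm=1.0, cm=1.0):
    """RWL in kg from the horizontal distance, vertical height, travel
    distance and asymmetry angle of the lift (assumed within the
    equation's valid range)."""
    lc = 23.0                                # load constant, kg
    hm = min(1.0, 25.0 / h_cm)               # horizontal multiplier
    vm = 1.0 - 0.003 * abs(v_cm - 75.0)      # vertical multiplier
    dm = min(1.0, 0.82 + 4.5 / d_cm)         # distance multiplier
    am = 1.0 - 0.0032 * a_deg                # asymmetry multiplier
    return lc * hm * vm * dm * am * fm * cm

def lifting_index(load_kg, rwl_kg):
    """LI greater than 1.0 indicates increased risk of low-back injury."""
    return load_kg / rwl_kg

# Example: a 10 kg load lifted from knee height with the hands 40 cm out.
rwl = recommended_weight_limit(h_cm=40, v_cm=50, d_cm=40, a_deg=0)
print(f"RWL = {rwl:.1f} kg, LI = {lifting_index(10.0, rwl):.2f}")
```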

    Virtual Physiology: A Tool for the 21st Century

    Veterinary physiology is a core curricular unit in every course within the veterinary field. It is essential to understand how the animal body works, and what to expect of a healthy body, in order to recognize any dysfunction and be able to treat it. Classic physiology teaching involves wet labs, much equipment, many reagents, some animals and a lot of time. But times are changing: in the 21st century, the teaching and learning process is expected to be more active and attractive, motivating students to learn better. It is necessary to understand what students like and to introduce novelties into the school routine. Game-based learning that uses “new” technologies to create virtual experiences and labs, reducing the costs of reagents and equipment and, above all, reducing the use of animals, will be the future of physiology teaching.

    On the critical role of the sensorimotor loop on the design of interaction techniques and interactive devices

    People interact with their environment thanks to their perceptual and motor skills; this is how they both use the objects around them and perceive the world. Interactive systems are examples of such objects, and to design them we must understand how people perceive and manipulate them. For example, haptics relates both to the human sense of touch and to what I call motor ability. I address a number of research questions related to the design and implementation of haptic, gestural and touch interfaces, and present examples of contributions on these topics. More interestingly, perception, cognition and action are not separate processes but an integrated combination of them called the sensorimotor loop. Interactive systems follow the same overall scheme, with differences that make humans and machines complementary. The interaction phenomenon is a set of connections between human sensorimotor loops and interactive systems' execution loops: it connects inputs with outputs, users with systems, and the physical world with cognition and computing, in what I call the Human-System loop. This model provides a complete overview of the interaction phenomenon and helps to identify the limiting factors of interaction that we can address to improve the design of interaction techniques and interactive devices.
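    As a concrete reading of the Human-System loop described above, the sketch below shows the minimal execution loop of an interactive system closing on the user's sensorimotor loop: sense the user's action, update the system state, render the output back to the user. It is an illustrative Python sketch with assumed names, callbacks and timings, not a model taken from the manuscript; the end-to-end latency around such a loop is one of the limiting factors the text refers to.

```python
# Minimal interactive-system execution loop closing on the human sensorimotor
# loop: motor action -> system input -> computation -> system output ->
# perception. Illustrative only; names, callbacks and timings are assumed.
import time

def human_system_loop(read_input, update_state, render_output,
                      period_s=0.01, steps=100):
    """Run the sense-compute-render cycle at roughly 1/period_s Hz."""
    state = None
    for _ in range(steps):
        t0 = time.perf_counter()
        action = read_input()                 # human motor output -> input
        state = update_state(state, action)   # application logic
        render_output(state)                  # output -> human perception
        # Hold the loop period; the total latency around the loop (device,
        # computation, display, human reaction) bounds interaction quality.
        time.sleep(max(0.0, period_s - (time.perf_counter() - t0)))

# Example with trivial stand-in callbacks.
human_system_loop(lambda: 1,
                  lambda s, a: (s or 0) + a,
                  lambda s: None,
                  steps=5)
```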