Stars in their eyes: What eye-tracking reveals about multimedia perceptual quality
Perceptual multimedia quality is of paramount importance to the continued take-up and proliferation of multimedia applications: users will not use and pay for applications that they perceive to be of low quality. Whilst distributed multimedia quality has traditionally been characterised by Quality of Service (QoS) parameters, these neglect the user's perspective on quality. To redress this shortcoming, we characterise the user perspective using the Quality of Perception (QoP) metric, which encompasses not only a user's satisfaction with the quality of a multimedia presentation, but also his/her ability to analyse, synthesise and assimilate its informational content. Recognising that monitoring eye movements offers insights into visual perception, as well as the associated attention mechanisms and cognitive processes, this paper reports the results of a study investigating the impact of differing multimedia presentation frame rates on user QoP and eye-path data. Our results show that higher frame rates, usually assumed to provide better multimedia presentation quality, do not significantly affect the median coordinate value of eye-path data. Moreover, higher frame rates do not significantly increase the level of participant information assimilation, although they do significantly improve overall user enjoyment and quality perception of the multimedia content being shown.
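The median-coordinate comparison described above can be sketched in a few lines. This is an illustrative example only: the data format and the sample eye paths below are hypothetical, not taken from the cited study.

```python
# Sketch: comparing the median (x, y) coordinate of eye paths recorded
# at different frame rates. Gaze samples are hypothetical screen pixels.
from statistics import median

def median_gaze_point(fixations):
    """Return the median (x, y) coordinate of an eye path.

    `fixations` is a list of (x, y) gaze samples in screen pixels.
    """
    xs = [x for x, _ in fixations]
    ys = [y for _, y in fixations]
    return median(xs), median(ys)

# Hypothetical eye paths for the same clip shown at two frame rates.
path_5fps = [(410, 300), (420, 310), (400, 290), (415, 305)]
path_25fps = [(412, 298), (418, 312), (405, 288), (414, 306)]

print(median_gaze_point(path_5fps))   # similar medians despite the
print(median_gaze_point(path_25fps))  # different frame rates
```

Comparing the two medians per clip, rather than raw sample-by-sample paths, is one simple way to test whether frame rate shifts where viewers look overall.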
Virtual Reality-Assisted Physiotherapy for Visuospatial Neglect Rehabilitation: A Proof-of-Concept Study
This study explores a VR-based intervention for visuospatial neglect (VSN), a post-stroke condition. It aims to develop a VR task using interactive visual-audio cues to improve sensory-motor training and to assess its impact on VSN patients' engagement and performance. Designed collaboratively with physiotherapists, the VR task uses directional and auditory stimuli to alert and direct patients, and was tested over 12 sessions with two individuals. Results show a consistent decrease in task-completion variability and positive patient feedback, highlighting the VR task's potential for enhancing engagement and suggesting its feasibility in rehabilitation. The study underlines the significance of collaborative design in healthcare technology and advocates further research with a larger sample size to confirm the benefits of VR in VSN treatment, as well as its applicability to other multimodal disorders.
Gaze-tracking-based interface for robotic chair guidance
This research focuses on solutions to enhance the quality of life of wheelchair users, specifically by applying a gaze-tracking-based interface to the guidance of a robotised wheelchair. The interface was applied in two different approaches to the wheelchair control system. The first was an assisted control in which the user was continuously involved in controlling the movement of the wheelchair through the environment, and the inclination of the different parts of the seat, using gaze and eye blinks captured by the interface. The second approach took the first steps towards an autonomous wheelchair control in which the wheelchair moves autonomously, avoiding collisions, towards a position defined by the user. To this end, this project developed the basis for obtaining the gaze position relative to the wheelchair and for object detection, so that the optimal route for the wheelchair can be computed in the future. The integration of a robotic arm into the wheelchair to manipulate objects was also considered: this work identifies, among the detected objects, the object of interest indicated by the user's gaze, so that in the future the robotic arm could select and pick up the object the user wants to manipulate. In addition to the two approaches, an attempt was made to estimate the user's gaze without the software interface, obtaining gaze from pupil-detection libraries, a calibration procedure and a mathematical model that relates pupil positions to gaze. The results of these implementations are analysed in this work, including some limitations encountered, and future improvements are proposed with the aim of increasing the independence of wheelchair users.
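The calibration step mentioned above, relating pupil positions to gaze, is commonly done by fitting a low-order polynomial per screen axis to a handful of calibration fixations. The sketch below shows one such fit; the model form and the calibration points are illustrative assumptions, not the model used in the cited work.

```python
# Sketch: least-squares calibration mapping normalised pupil coordinates
# (px, py) to screen gaze coordinates, one polynomial model per axis.
import numpy as np

def design_matrix(pupil):
    """Rows of [1, px, py, px*py, px^2, py^2] for a 2nd-order model."""
    px, py = pupil[:, 0], pupil[:, 1]
    return np.column_stack([np.ones_like(px), px, py, px * py, px**2, py**2])

def fit_axis(pupil, target):
    """Fit gaze_axis = a + b*px + c*py + d*px*py + e*px^2 + f*py^2."""
    coef, *_ = np.linalg.lstsq(design_matrix(pupil), target, rcond=None)
    return coef

def predict_axis(coef, pupil):
    return design_matrix(pupil) @ coef

# Calibration: the user fixates known screen points while pupil centres
# are recorded (all values illustrative).
pupil = np.array([[0.2, 0.3], [0.8, 0.3], [0.2, 0.7], [0.8, 0.7], [0.5, 0.5]])
screen_x = np.array([100.0, 900.0, 100.0, 900.0, 500.0])
screen_y = np.array([100.0, 100.0, 600.0, 600.0, 350.0])

cx = fit_axis(pupil, screen_x)
cy = fit_axis(pupil, screen_y)

# Predicted gaze for a new pupil sample.
sample = np.array([[0.5, 0.5]])
gaze = (predict_axis(cx, sample)[0], predict_axis(cy, sample)[0])
```

With more calibration points than model terms the fit becomes a genuine least-squares regression; in practice nine or more points spread across the screen are typical.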
Recommended from our members
Mobile robot teleoperation through eye-gaze (telegaze)
In most teleoperation applications the human operator is required to monitor the status of the robot, as well as issue controlling commands, for the whole duration of the operation. With a vision-based feedback system, monitoring the robot requires the operator to look at a continuous stream of images displayed on an interaction screen. The eyes of the operator are therefore fully engaged in monitoring and the hands in controlling. Since the eyes are engaged in monitoring anyway, inputs from the operator's gaze can also be used to aid controlling. This frees the hands, either partially or fully, from controlling, so that they can be used to perform other necessary tasks. The challenge, however, lies in distinguishing gaze inputs intended for controlling from those used for monitoring. In mobile robot teleoperation, controlling mainly consists of issuing locomotion commands to drive the robot, while monitoring consists of looking where the robot is going and watching for obstacles on the route. Interestingly, there exists a strong correlation between humans' gazing behaviours and their moving intentions. This correlation is exploited in this thesis to investigate novel means of mobile robot teleoperation through eye-gaze, named TeleGaze for short.
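One simple way to separate controlling from monitoring gaze, as the abstract above discusses, is to divide the video feed into command regions and require the gaze to dwell in a region before issuing a locomotion command. The region layout, dwell threshold and command names below are illustrative assumptions, not the TeleGaze design itself.

```python
# Sketch: region-plus-dwell gaze interpretation for mobile robot
# teleoperation. Short glances anywhere count as monitoring; only a
# sustained fixation in a border region becomes a command.
DWELL_FRAMES = 15  # ~0.5 s at 30 Hz (assumed sampling rate)

def region_of(x, y, width=640, height=480):
    """Map a gaze point on the video feed to a candidate command region."""
    if y < height * 0.2:
        return "forward"
    if y > height * 0.8:
        return "backward"
    if x < width * 0.2:
        return "turn_left"
    if x > width * 0.8:
        return "turn_right"
    return None  # central area: pure monitoring, no command

def command_from_gaze(samples):
    """Issue a command only if gaze dwells in one region for DWELL_FRAMES."""
    streak, last = 0, None
    for x, y in samples:
        r = region_of(x, y)
        if r is not None and r == last:
            streak += 1
            if streak >= DWELL_FRAMES:
                return r
        else:
            streak, last = 1, r
    return None
```

The central monitoring region and the dwell threshold together address the "Midas touch" problem: the operator can inspect the scene freely without every glance being interpreted as a drive command.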
A Scenario Analysis of Wearable Interface Technology Foresight
Although the importance and value of wearable interfaces have gradually been realised, wearable-interface-related technologies, and the priority of adopting them, have so far not been clearly identified. To fill this gap, this paper focuses on the technology-planning strategy of organisations that have an interest in developing or adopting wearable-interface-related technologies. Based on the scenario-analysis approach, a technology-planning strategy is proposed. In this analysis, thirty wearable interface technologies are classified into six categories, and the importance and risk factors of these categories are then evaluated under two possible scenarios. The main research findings are that most brain-based wearable interface technologies are rated high-to-medium importance and high risk in all scenarios, and that scenario changes have less impact on voice-based and gesture-based wearable interface technologies. These results provide a reference for organisations and vendors interested in adopting or developing wearable interface technologies.
Electroencephalography (EEG), electromyography (EMG) and eye-tracking for astronaut training and space exploration
The ongoing push to send humans back to the Moon and to Mars is giving rise
to a wide range of novel technical solutions in support of prospective
astronaut expeditions. Against this backdrop, the European Space Agency (ESA)
has recently launched an investigation into unobtrusive interface technologies
as a potential answer to such challenges. Three particular technologies have
shown promise in this regard: EEG-based brain-computer interfaces (BCI) provide
a non-invasive method of utilizing recorded electrical activity of a user's
brain, electromyography (EMG) enables monitoring of electrical signals
generated by the user's muscle contractions, and finally, eye tracking enables,
for instance, the tracking of a user's gaze direction via camera recordings to
convey commands. Beyond simply improving the usability of prospective technical
solutions, our findings indicate that EMG, EEG, and eye-tracking could also
serve to monitor and assess a variety of cognitive states, including attention,
cognitive load, and mental fatigue of the user, while EMG could furthermore
also be utilized to monitor the physical state of the astronaut. In this paper,
we elaborate on the key strengths and challenges of these three enabling
technologies, and in light of ESA's latest findings, we reflect on their
applicability in the context of human space flight. Furthermore, a timeline of
technological readiness is provided. In so doing, this paper feeds into the
growing discourse on emerging technology and its role in paving the way for a
human return to the Moon and expeditions beyond the Earth's orbit
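The cognitive-state monitoring mentioned above is often based on EEG band powers. The sketch below computes theta, alpha and beta power from a short signal window and combines them into the classic beta / (alpha + theta) engagement index; the sampling rate, synthetic signals and thresholds are illustrative assumptions, not ESA's method.

```python
# Sketch: EEG band-power extraction and a simple engagement index,
# one way a cognitive-state monitor could be fed. All signals synthetic.
import numpy as np

FS = 256  # sampling rate in Hz (assumed)

def band_power(signal, low, high, fs=FS):
    """Mean spectral power of `signal` in the [low, high] Hz band."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    power = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= low) & (freqs <= high)
    return power[mask].mean()

def engagement_index(signal, fs=FS):
    """beta / (alpha + theta): higher values suggest higher engagement."""
    theta = band_power(signal, 4, 8, fs)
    alpha = band_power(signal, 8, 13, fs)
    beta = band_power(signal, 13, 30, fs)
    return beta / (alpha + theta)

# Two synthetic 2-second windows: one alpha-dominated (10 Hz), as in a
# relaxed state, and one beta-dominated (20 Hz), as under load. A small
# 6 Hz theta component keeps the denominator well-defined.
t = np.arange(2 * FS) / FS
relaxed = np.sin(2 * np.pi * 10 * t) + 0.1 * np.sin(2 * np.pi * 6 * t)
busy = np.sin(2 * np.pi * 20 * t) + 0.1 * np.sin(2 * np.pi * 6 * t)
```

In a real system the index would be computed on sliding windows of artifact-cleaned, per-user-baselined EEG; raw FFT power on a single channel, as here, only illustrates the principle.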
Behavioural attentiveness patterns analysis – detecting distraction behaviours
The capacity to remain focused on a task can be crucial in some circumstances. In general, this ability is intrinsic to human social interaction and is naturally used in any social context. Nevertheless, some individuals have difficulty remaining concentrated on an activity, resulting in a short attention span. Children with Autism Spectrum Disorder (ASD) are a notable example of such individuals. ASD is a group of complex developmental disorders of the brain. Individuals affected by this disorder are characterised by repetitive patterns of behaviour, restricted activities or interests, and impairments in social communication. The use of robots has already been shown to encourage the development of the social interaction skills lacking in children with ASD. However, most of these systems are controlled remotely and cannot adapt automatically to the situation, and even the more autonomous ones still cannot perceive whether or not the user is paying attention to the instructions and actions of the robot.
Following this trend, this dissertation is part of a research project that has been under development for some years. In this project, the robot ZECA (Zeno Engaging Children with Autism) from Hanson Robotics is used to promote interaction with children with ASD, helping them to recognise emotions and to acquire new knowledge in order to promote social interaction and communication with others.
The main purpose of this dissertation is to determine whether the user is distracted during an activity. In the future, the objective is to interface this system with ZECA so that it can adapt its behaviour according to the individual's affective state during an emotion-imitation activity. In order to recognise human distraction behaviours and capture the user's attention, several patterns of distraction, as well as systems to detect them automatically, have been developed. One of the most widely used distraction-pattern detection methods is based on measuring head pose and eye gaze. The present dissertation proposes a system based on a Red Green Blue (RGB) camera, capable of detecting the distraction patterns (head pose, eye gaze, blink frequency and the user's position relative to the camera) during an activity, and then classifying the user's state using a machine-learning algorithm.
Finally, the proposed system is evaluated in a controlled laboratory environment in order to verify whether it is capable of detecting the patterns of distraction. The results of these preliminary tests revealed some system constraints and validated the system's adequacy for later use in an intervention setting.
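The classification stage described in the abstract above, mapping per-window features such as head pose, gaze offset and blink frequency to an attentional state, can be sketched with a minimal nearest-centroid classifier. The features, training samples and labels below are illustrative assumptions; the dissertation's actual algorithm and data are not reproduced here.

```python
# Sketch: classifying attentiveness from camera-derived features.
# Each sample: (head_yaw_deg, gaze_offset_norm, blinks_per_min), label.
import math

TRAINING = [
    ((2.0, 0.05, 12.0), "attentive"),
    ((5.0, 0.10, 15.0), "attentive"),
    ((35.0, 0.60, 30.0), "distracted"),
    ((40.0, 0.70, 28.0), "distracted"),
]

def centroids(samples):
    """Mean feature vector per label."""
    sums, counts = {}, {}
    for feats, label in samples:
        acc = sums.setdefault(label, [0.0] * len(feats))
        for i, f in enumerate(feats):
            acc[i] += f
        counts[label] = counts.get(label, 0) + 1
    return {lbl: tuple(s / counts[lbl] for s in acc) for lbl, acc in sums.items()}

def classify(feats, cents):
    """Assign the label of the nearest centroid (Euclidean distance)."""
    return min(cents, key=lambda lbl: math.dist(feats, cents[lbl]))

CENTROIDS = centroids(TRAINING)
```

In practice the features would be normalised to comparable scales and a richer model trained on labelled session data; nearest-centroid merely stands in for the unspecified machine-learning algorithm.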