Integrating Haptic Feedback into Mobile Location Based Services
Haptics is a feedback technology that takes advantage of the human sense of touch by
applying forces, vibrations, and/or motions to a haptic-enabled device such as a mobile
phone. Historically, human-computer interaction has been visual: text and images on the screen. Haptic feedback can be an important additional channel, especially in Mobile Location Based Services such as knowledge discovery, pedestrian navigation, and notification systems. The Haptic GeoWand is a low-interaction knowledge discovery system that allows users to query geo-tagged data around them by using
a point-and-scan technique with their mobile device. Haptic Pedestrian is a navigation
system for walkers. Four prototypes have been developed, classified according to
the user’s guidance requirements, the user type (based on spatial skills), and overall
system complexity. Haptic Transit is a notification system that provides spatial information
to the users of public transport. In all these systems, haptic feedback is used
to convey information about location, orientation, density, and distance through vibration patterns of varying frequency, helping users understand the physical environment. Trials elicited positive responses from users, who see benefit in being
provided with a “heads up” approach to mobile navigation. Results from a memory recall
test show that the users of haptic feedback for navigation had better memory recall
of the region traversed than the users of landmark images. Haptics integrated into a
multi-modal navigation system provides more usable, less distracting, and more effective
interaction than conventional systems. Enhancements to the current work could include
integration of contextual information, detailed large-scale user trials and the exploration
of using haptics within confined indoor spaces.
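The abstract describes conveying distance by varying the frequency and pattern of the vibration alarm. A minimal sketch of such a mapping follows; the thresholds, durations, and the function name are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch of varying vibration cadence with distance to a target.
# All thresholds and durations are hypothetical.

def vibration_pattern(distance_m: float) -> list:
    """Return alternating on/off durations (ms): closer targets pulse faster."""
    if distance_m < 10:        # very close: rapid short pulses
        on_ms, off_ms, pulses = 80, 80, 5
    elif distance_m < 50:      # nearby: slower cadence
        on_ms, off_ms, pulses = 150, 200, 3
    else:                      # far away: one long pulse
        on_ms, off_ms, pulses = 400, 0, 1
    pattern = []
    for _ in range(pulses):
        pattern.extend([on_ms, off_ms])
    if pattern and pattern[-1] == 0:   # drop a trailing zero-length pause
        pattern.pop()
    return pattern
```

A pattern list in this on/off shape can be fed to a platform vibration API (for example, waveform-style vibration calls on mobile operating systems).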
Wearable Tools for Affective Remote Collaboration
Affective computing is the study and development of systems
that can recognize human emotions and feelings. Emotion remains an active research topic, and researchers today are developing systems that can recognize, interpret, and process emotions from human physiological and neural changes in support of well-being.
As the market for wearable devices expands, it opens new opportunities for research into emotion sharing with a remote person. This Master's thesis investigates the possibility of using wearable devices for affective remote collaboration. Previous research on affective computing, affective communication, and remote collaboration using wearable devices is reviewed before the design process begins. Three wearable devices were developed, evaluated, and discussed: two for emotion sharing between remote people, and the third for preliminary research exploring whether eye-gaze information can increase co-presence in remote collaboration. Conclusions and future work are discussed based on the results of the research evaluation.
VR Storytelling
The question of cinematic VR production has been on the table for several years. This is due to the peculiarity of VR language which, even if it is defined by an image that surrounds and immerses the viewer rather than placing them, as in the classic cinematic situation, in front of a screen, relies decisively on an audiovisual basis that cannot help but refer to cinematic practices of constructing visual and auditory experience. Despite this, it would be extremely reductive to consider VR as the mere transposition of elements of cinematic language. The VR medium is endowed with its own specificity, which inevitably impacts its forms of narration. We thus need to investigate the narrative forms it uses that are probably related to cinematic language, and draw their strength from the same basis, drink from the same well, but develop according to different trajectories, thus displaying different links and affinities.
Facial and expression recognition for the blind using computer vision
Gemstone Team FACE
The majority of human communication consists of nonverbal cues. Visual cues, such as expressions and nodding, are not readily accessible to the blind. Our team has developed an assistive device based on computer vision that relays facial-recognition and expression information to a blind user; it is designed to store images of people the user frequently interacts with and to analyze faces for expressions. The control a user has over his or her surroundings
while receiving real-time feedback makes the device unique. In order to design a device best suited to the blind user's needs, we engaged sighted and blind
participants in surveys and interviews to understand their views and preferences
regarding the availability of a computer vision system that, in real time, can provide information about the identity and expressions of humans. This thesis discusses
the development methodology, the selection of algorithms for recognizing faces
and expressions, the physical designs of the device, and the results of subject tests with blind participants to gauge the effectiveness of our design.
Non-Verbal Communication with Physiological Sensors: The Aesthetic Domain of Wearables and Neural Networks
Historically, communication implies the transfer of information between bodies, yet this
phenomenon is constantly adapting to new technological and cultural standards. In a
digital context, it’s commonplace to envision systems that revolve around verbal modalities.
However, behavioural analysis grounded in psychology research calls attention to
the emotional information disclosed by non-verbal social cues, in particular, actions that
are involuntary. This notion has circulated widely through various interdisciplinary computing research fields, from which multiple studies have arisen correlating non-verbal activity with socio-affective inferences. These are often derived from some form of motion capture and other wearable sensors, measuring the ‘invisible’ bioelectrical changes that occur inside the body.
This thesis proposes a motivation and methodology for using physiological sensory
data as an expressive resource for technology-mediated interactions. It opens with a thorough discussion of state-of-the-art technologies and established design principles on this topic, which is then applied to a novel approach, complemented by a selection of practice works. We advocate for aesthetic experience, experimenting with abstract representations. Unlike prevailing Affective Computing systems, the
intention is not to infer or classify emotion but rather to create new opportunities for rich
gestural exchange, unconfined to the verbal domain.
Given the preliminary proposition of non-representation, we justify a correspondence
with modern Machine Learning and multimedia interaction strategies, applying an iterative,
human-centred approach to improve personalisation without compromising the emotional potential of bodily gesture. Where related studies in the past have successfully
provoked strong design concepts through innovative fabrications, these are typically limited
to simple linear, one-to-one mappings and often neglect multi-user environments;
we foresee a vast potential. In our use cases, we adopt neural network architectures to
generate highly granular biofeedback from low-dimensional input data.
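The idea of using a neural network to turn low-dimensional sensor input into granular biofeedback can be sketched with a tiny dense autoencoder trained on a synthetic breathing-like trace. The thesis's actual models and data are not reproduced here; every dimension, hyperparameter, and variable name below is an assumption for illustration.

```python
# Minimal sketch: a one-hidden-layer autoencoder (NumPy) compressing windows
# of a synthetic "breathing" signal to a 2-D latent code and decoding back.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 8 * np.pi, 2000)
signal = np.sin(t) + 0.05 * rng.standard_normal(t.size)  # breathing-like trace

win = 20  # samples per window
X = np.stack([signal[i:i + win] for i in range(0, signal.size - win, win)])

W1 = rng.standard_normal((win, 2)) * 0.1   # encoder weights (20 -> 2)
W2 = rng.standard_normal((2, win)) * 0.1   # decoder weights (2 -> 20)

lr = 0.01
first_loss = None
for step in range(500):
    z = np.tanh(X @ W1)          # latent "bodily state" code
    Xh = z @ W2                  # linear reconstruction
    err = Xh - X
    loss = (err ** 2).mean()
    if first_loss is None:
        first_loss = loss
    # manual backprop through the two layers
    gW2 = z.T @ err / len(X)
    gz = err @ W2.T * (1 - z ** 2)
    gW1 = X.T @ gz / len(X)
    W1 -= lr * gW1
    W2 -= lr * gW2

print(f"reconstruction loss: {first_loss:.4f} -> {loss:.4f}")
```

In a real-time setting like the Latent Steps piece described below, the 2-D latent trajectory, rather than the reconstruction, would be the expressive signal driving feedback.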
We present the following proof-of-concepts: Breathing Correspondence, a wearable
biofeedback system inspired by Somaesthetic design principles; Latent Steps, a real-time auto-encoder to represent bodily experiences from sensor data, designed for dance performance;
and Anti-Social Distancing Ensemble, an installation for public space interventions,
analysing physical distance to generate a collective soundscape. Key findings are
extracted from the individual reports to formulate an extensive technical and theoretical
framework around this topic. The projects first aim to embrace some alternative perspectives
already established within Affective Computing research. From here, these concepts evolve further, bridging theories from contemporary creative and technical practices with
the advancement of biomedical technologies.
The development of a process for the production of textiles with fully embedded electronics
Attempts to combine electronics and textiles have been made for many years: first with the introduction of conductive wires, then with sensors and more complex circuits on everyday garments. The next evolutionary step in combining these seemingly different fields is to integrate the electronics inside the textile structure itself, so that both worlds are implemented seamlessly in everyday life. Advances in microelectronics and in mechanical, electrical, computing, and chemical engineering over recent years make this feasible today. Because the electronic components must be of minuscule dimensions, so that they can be integrated inside the thin-by-nature yarn, and because the overall structure must remain flexible and bendable, the task is far from small and has no precedent. This thesis provides the backbone of an innovative technique to achieve this goal in an automated or semi-automated, accurate, repeatable, reliable, and time- and cost-effective way, combining all the required procedures, outlining the issues, and proposing solutions to many of them.
This research's outcome, after both manual and automated implementation of the microelectronic-component encapsulation concept, proves that automation of the process is feasible with further research and funding. Because the technique is innovative and challenging to implement, given the tiny dimensions of the electronic components, more testing and physical implementation must be conducted with the contribution of a team from different disciplines in order to finalise it and produce the first linear, continuous version of the machine that can automatically produce electronic yarns, i.e. yarn with electronic components inside its core.
The importance of this thesis is that it sets the foundations, guidelines, and requirements for the future development of an all-new manufacturing procedure and the creation of a new machine, the Electronic Yarn Machine (EYM).
Haptics: Science, Technology, Applications
This open access book constitutes the proceedings of the 12th International Conference on Human Haptic Sensing and Touch Enabled Computer Applications, EuroHaptics 2020, held in Leiden, The Netherlands, in September 2020. The 60 papers presented in this volume were carefully reviewed and selected from 111 submissions. They were organized in topical sections on haptic science, haptic technology, and haptic applications. This year's focus is on accessibility.
Enhanced Living Environments
This open access book was prepared as a Final Publication of the COST Action IC1303 “Algorithms, Architectures and Platforms for Enhanced Living Environments (AAPELE)”. The concept of Enhanced Living Environments (ELE) refers to the area of Ambient Assisted Living (AAL) that is most related to Information and Communication Technologies (ICT). Effective ELE solutions require appropriate ICT algorithms, architectures, platforms, and systems, in view of the advance of science and technology in this area and the development of new and innovative solutions that can improve the quality of life of people in their homes and reduce the financial burden on the budgets of healthcare providers. The aim of this book is to become a state-of-the-art reference, discussing progress made as well as prompting future directions on theories, practices, standards, and strategies related to the ELE area. The book contains 12 chapters and can serve as a valuable reference for undergraduate students, post-graduate students, educators, faculty members, researchers, engineers, medical doctors, healthcare organizations, insurance companies, and research strategists working in this area.