Interactive Spaces Natural interfaces supporting gestures and manipulations in interactive spaces
This doctoral dissertation focuses on the development of interactive spaces through the use of natural interfaces based on gestures and manipulative actions. In the real world, people use their senses to perceive the external environment, and they use manipulations and gestures to explore the world around them, communicate, and interact with other individuals. From this perspective, the use of natural interfaces that exploit human sensorial and explorative abilities helps fill the gap between the physical and digital worlds.
In the first part of this thesis we describe the work done to improve interfaces and devices for tangible, multi-touch, and free-hand interactions. The idea is to design devices that can work even in uncontrolled environments, and in situations where control is mostly physical, so that even the least experienced users can express their manipulative exploration and gesture communication abilities.
We also analyze how these techniques can be combined to create an interactive space specifically designed for teamwork, in which the natural interfaces are distributed in order to encourage collaboration.
We then give some examples of how these interactive scenarios can host various types of applications facilitating, for instance, the exploration of 3D models, the enjoyment of multimedia content, and social interaction.
Finally, we discuss our results and put them in a wider context, focusing particularly on how the proposed interfaces actually improve people's lives and activities, and on how the interactive spaces become places of aggregation where we can pursue objectives that are both personal and shared with others.
Gestures and cooperation: considering non-verbal communication in the design of interactive spaces
This dissertation explores the role of gestures in computer supported collaboration. People make extensive use of non-verbal forms of communication when they interact with each other in everyday life: of these, gestures are relatively easy to observe and quantify. However, the role of gestures in human computer interaction has so far focused mainly on using conventional signs like visible commands, rather than on exploiting all nuances of such a natural human skill. We propose a perspective on natural interaction that builds on recent advances in tangible interaction, embodiment and computer supported collaborative work. We consider the social and cognitive aspects of gestures and manipulations to support our claim of a primacy of tangible and multi-touch interfaces, and describe our experiences focused on assessing the suitability of such interface paradigms to traditional application scenarios. We describe our design and prototype of an interactive space for group-work, in which natural interfaces, such as tangible user interfaces and multi-touch screens, are deployed so as to foster and encourage collaboration. We show that these interfaces can lead to an improvement in performance, and that such improvements appear related to an increase in the gestures performed by the users. We also describe the advances over the state of the art that were necessary to implement such tools on commodity hardware and deploy them in a relatively uncontrolled environment. Finally, we discuss our findings and frame them in the broader context of embodied interaction, drawing useful implications for interaction design, with emphasis on how to enhance the activity of people in their workplace, home, school, etc., supported in their individual and collaborative tasks by natural interfaces.
Expanding tangible tabletop interfaces beyond the display
The rising popularity of interactive tabletops and surfaces is spawning research and innovation in a wide variety of areas, including hardware and software technologies, interaction design and novel interaction techniques, all of which seek to promote richer, more powerful and more natural interaction modalities. Among these modalities, combined interaction on and above the surface, both with gestures and with tangible objects, is a very promising area. This dissertation is about expanding tangible tabletop surfaces beyond the display by exploring and developing a system from three different perspectives: hardware, software, and interaction design. It studies and summarizes the distinctive affordances of conventional 2D tabletop devices, with an extensive literature review and some additional use cases developed by the author to support these findings, and subsequently explores the novel, not yet unveiled potential affordances of 3D-augmented tabletops. It overviews the existing hardware solutions for conceiving such a device, and applies the needed hardware modifications to an existing prototype developed and lent to us by Microsoft Research Cambridge. To accomplish the interaction goals, a computer vision system for 3D interaction is developed that extends conventional 2D tabletop tracking to the tracking of hand gestures, 6DoF markers, and on-surface finger interaction. The dissertation then conceives a complete software framework for the development of applications that can benefit from these novel 3D interaction techniques, and implements and tests several software prototypes as proofs of concept using this framework. With these findings, it concludes by presenting continuous tangible interaction gestures and proposing a novel classification for 3D tangible and tabletop gestures, covering both conventional 2D interaction and the extended above-surface interaction developed in the thesis.
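The extended tracking described above distinguishes conventional on-surface input from interaction in the volume above the table. A minimal sketch of that layering, assuming a depth camera reports each tracked point's height over the table plane (the threshold values and the `TrackedPoint` type are illustrative assumptions, not values from the dissertation):

```python
# Sketch: classifying tracked points as on-surface touches vs above-surface
# interaction by height over the table plane. Thresholds are assumed values.
from dataclasses import dataclass

TOUCH_MAX_MM = 4.0    # at or below: treat as surface contact (assumed)
HOVER_MAX_MM = 150.0  # up to this height: above-surface interaction (assumed)

@dataclass
class TrackedPoint:
    x_mm: float
    y_mm: float
    z_mm: float  # height above the table plane, e.g. from a depth camera

def classify(p: TrackedPoint) -> str:
    """Map a tracked fingertip or marker to an interaction layer."""
    if p.z_mm <= TOUCH_MAX_MM:
        return "touch"        # conventional 2D tabletop input
    if p.z_mm <= HOVER_MAX_MM:
        return "above"        # extended on-and-above interaction volume
    return "out_of_range"     # ignore points far from the surface

print(classify(TrackedPoint(120.0, 80.0, 2.0)))   # touch
print(classify(TrackedPoint(120.0, 80.0, 60.0)))  # above
```

A real system would also smooth the depth signal over time before classifying, since raw per-frame heights near the touch threshold are noisy.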
Evaluation of Physical Finger Input Properties for Precise Target Selection
The multitouch tabletop display provides a collaborative workspace for multiple users around a table. Users can perform direct and natural multitouch interaction to select target elements using their bare fingers. However, the physical size of the fingertip varies from one person to another, which generally introduces the fat finger problem. Consequently, it causes imprecise selection of small target elements during direct multitouch input. In this respect, an attempt is made to evaluate the physical finger input properties, i.e. contact area and shape, in the context of imprecise selection.
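The two properties under evaluation, contact area and contact shape, can be computed from the patch of sensor pixels a fingertip covers. A minimal sketch, assuming the touch sensor exposes the contact as a list of pixel coordinates (the blob representation and the aspect-ratio shape measure are assumptions for the example, not the study's method):

```python
# Illustrative computation of fingertip contact area, centroid, and a crude
# shape descriptor from a binary contact blob. Representation is assumed.
def contact_properties(blob):
    """blob: list of (x, y) sensor pixels registered as fingertip contact."""
    n = len(blob)
    cx = sum(x for x, _ in blob) / n          # centroid x
    cy = sum(y for _, y in blob) / n          # centroid y
    xs = [x for x, _ in blob]
    ys = [y for _, y in blob]
    w = max(xs) - min(xs) + 1                 # bounding-box width
    h = max(ys) - min(ys) + 1                 # bounding-box height
    return {
        "area": n,                            # contact area in pixels
        "centroid": (cx, cy),                 # candidate selection point
        "aspect": w / h,                      # crude shape descriptor
    }

# A 3x2 rectangular contact patch:
props = contact_properties([(0, 0), (1, 0), (2, 0), (0, 1), (1, 1), (2, 1)])
print(props["area"], props["centroid"], props["aspect"])
# 6 (1.0, 0.5) 1.5
```

Using the blob centroid rather than a single reported pixel is one common way to make small-target selection less sensitive to fingertip size.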
Multi-touch Detection and Semantic Response on Non-parametric Rear-projection Surfaces
The ability of human beings to physically touch our surroundings has had a profound impact on our daily lives. Young children learn to explore their world by touch; likewise, many simulation and training applications benefit from natural touch interactivity. As a result, modern interfaces supporting touch input are ubiquitous. Typically, such interfaces are implemented on integrated touch-display surfaces with simple geometry that can be mathematically parameterized, such as planar surfaces and spheres; for more complicated non-parametric surfaces, such parameterizations are not available. In this dissertation, we introduce a method for generalizable optical multi-touch detection and semantic response on uninstrumented non-parametric rear-projection surfaces using an infrared-light-based multi-camera multi-projector platform. In this paradigm, touch input allows users to manipulate complex virtual 3D content that is registered to and displayed on a physical 3D object. Detected touches trigger responses with specific semantic meaning in the context of the virtual content, such as animations or audio responses. The broad problem of touch detection and response can be decomposed into three major components: determining if a touch has occurred, determining where a detected touch has occurred, and determining how to respond to a detected touch. Our fundamental contribution is the design and implementation of a relational lookup table architecture that addresses these challenges through the encoding of coordinate relationships among the cameras, the projectors, the physical surface, and the virtual content. Detecting the presence of touch input primarily involves distinguishing between touches (actual contact events) and hovers (near-contact proximity events). We present and evaluate two algorithms for touch detection and localization utilizing the lookup table architecture. 
One of the algorithms, a bounded plane sweep, is additionally able to estimate hover-surface distances, which we explore for interactions above surfaces. The proposed method is designed to operate with low latency and to be generalizable. We demonstrate touch-based interactions on several physical parametric and non-parametric surfaces, and we evaluate both system accuracy and the accuracy of typical users in touching desired targets on these surfaces. In a formative human-subject study, we examine how touch interactions are used in the context of healthcare and present an exploratory application of this method in patient simulation. A second study highlights the advantages of touch input on content-matched physical surfaces achieved by the proposed approach, such as decreases in induced cognitive load, increases in system usability, and increases in user touch performance. In this experiment, novice users were nearly as accurate when touching targets on a 3D head-shaped surface as when touching targets on a flat surface, and their self-perception of their accuracy was higher.
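The relational lookup table at the core of this approach relates a per-camera pixel observation to a surface location and to the virtual content registered there, so that a detected contact can trigger a semantic response. A minimal sketch of that decomposition, with hypothetical table contents, region names, and thresholds (the real architecture encodes full camera-projector-surface-content relationships):

```python
# Minimal sketch of a relational lookup-table touch pipeline: resolve a
# camera observation to a surface region, classify touch vs hover by a
# distance threshold, and emit a semantic response. All data is hypothetical.
TOUCH_THRESHOLD_MM = 5.0  # assumed contact/hover cutoff

# (camera_id, px, py) -> (surface point, semantic region of virtual content)
lookup = {
    ("cam0", 10, 20): ((0.12, 0.34, 0.00), "cheek"),
    ("cam0", 11, 20): ((0.13, 0.34, 0.00), "cheek"),
}

# semantic responses tied to virtual content regions (assumed)
responses = {"cheek": "play_flinch_animation"}

def process(camera_id, px, py, depth_to_surface_mm):
    """Resolve one camera observation to a touch/hover event."""
    entry = lookup.get((camera_id, px, py))
    if entry is None:
        return None                                     # pixel misses surface
    _surface_pt, region = entry
    if depth_to_surface_mm > TOUCH_THRESHOLD_MM:
        return ("hover", region, depth_to_surface_mm)   # near-contact event
    return ("touch", region, responses[region])         # contact -> response

print(process("cam0", 10, 20, 1.0))   # touch on "cheek"
print(process("cam0", 10, 20, 30.0))  # hover above "cheek"
```

The appeal of precomputing such a table is that the expensive geometry (calibrating cameras and projectors against a non-parametric surface) is paid once offline, leaving only constant-time lookups on the low-latency path.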
Gestures and Interaction: exploiting natural abilities in the design of interactive systems
Collana seminari interni 2012, Number 20120606. This talk explores the role of gestures in computer supported collaboration. People make extensive use of non-verbal forms of communication when they interact with each other in everyday life: of these, gestures are relatively easy to observe and quantify. However, the role of gestures in human computer interaction has so far focused mainly on using conventional signs like visible commands, rather than on exploiting all nuances of such a natural human skill. We propose a perspective on natural interaction that builds on recent advances in tangible interaction, embodiment and computer supported collaborative work. We consider the social and cognitive aspects of gestures and manipulations to support our claim of a primacy of tangible and multi-touch interfaces, and describe our experiences focused on assessing the suitability of such interface paradigms to traditional application scenarios.
Light on horizontal interactive surfaces: Input space for tabletop computing
In the last 25 years we have witnessed the rise and growth of interactive tabletop research, both in academic and in industrial settings. The rising demand for the digital support of human activities motivated the need to bring computational power to table surfaces. In this article, we review the state of the art of tabletop computing, highlighting core aspects that frame the input space of interactive tabletops: (a) developments in hardware technologies that have caused the proliferation of interactive horizontal surfaces and (b) issues related to new classes of interaction modalities (multitouch, tangible, and touchless). A classification is presented that aims to give a detailed view of the current development of this research area and define opportunities and challenges for novel touch- and gesture-based interactions between the human and the surrounding computational environment. © 2014 ACM. This work has been funded by the Integra (Amper Sistemas and CDTI, Spanish Ministry of Science and Innovation) and TIPEx (TIN2010-19859-C03-01) projects and by the Programa de Becas y Ayudas para la Realización de Estudios Oficiales de Máster y Doctorado en la Universidad Carlos III de Madrid, 2010.