
    Choreographic and Somatic Approaches for the Development of Expressive Robotic Systems

    As robotic systems are moved out of factory work cells and into human-facing environments, questions of choreography become central to their design, placement, and application. With a human viewer or counterpart present, a system will automatically be interpreted by human beings as an animate element of their environment, through its context, style of movement, and form factor. This interpretation by the human counterpart is critical to the success of the system's integration: knobs on the system need to make sense to a human counterpart; an artificial agent should have a way of notifying a human counterpart of a change in system state, possibly through motion profiles; and the motion of a human counterpart may carry important contextual clues for task completion. Thus, professional choreographers, dance practitioners, and movement analysts are critical to research in robotics. They have design methods for movement that align with human audience perception, can identify simplified features of movement for human-robot interaction goals, and have detailed knowledge of the capacity of human movement. This article provides approaches employed by one research lab, specific impacts on technical and artistic projects within it, and principles that may guide future such work. The background section reports on choreography, somatic perspectives, improvisation, the Laban/Bartenieff Movement System, and robotics. From this context, methods including embodied exercises, writing prompts, and community-building activities have been developed to facilitate interdisciplinary research. The results of this work are presented as an overview of a selection of projects in areas like high-level motion planning, software development for rapid prototyping of movement, artistic output, and user studies that help understand how people interpret movement.
Finally, guiding principles for other groups to adopt are posited. Comment: Under review at MDPI Arts Special Issue "The Machine as Artist (for the 21st Century)" http://www.mdpi.com/journal/arts/special_issues/Machine_Artis

    Performance monitoring during action observation and auditory lexical decisions

    How does the brain monitor performances? Does expertise modulate this process? How does an observer's error-related activity differ from a performer's own error-related activity? How does ambiguity change the markers of error monitoring? In this thesis, I present two EEG studies and a commentary that sought to answer these questions. Both empirical studies concern performance monitoring in two different contexts and from two different personal perspectives, i.e. investigating the effects of expertise on electroencephalographic (EEG) neuromarkers of performance monitoring, and monitoring one's own and others' errors during actions and language processing. My first study focused on characterizing the electrophysiological responses in experts and control individuals while they observed domain-specific actions in wheelchair basketball with correct and wrong outcomes (Chapter II). The aim of the commentary in the following chapter was to highlight the role of Virtual Reality approaches to error prediction during one's own actions (Chapter III). The fourth chapter hypothesised that error-monitoring markers are present both during one's own performance errors in a lexical decision task and during the observation of others' performance errors (Chapter IV); however, the results suggested a further modulation by the uncertainty created by our task design. The final chapter presents a general discussion that provides an overview of the results of my PhD work (Chapter V). The present chapter consists of a literature review of the leading frameworks of performance monitoring, action observation, visuo-motor expertise, and language processing.

    Dagstuhl News January - December 2008

    "Dagstuhl News" is a publication edited especially for the members of the Foundation "Informatikzentrum Schloss Dagstuhl" to thank them for their support. The News give a summary of the scientific work being done in Dagstuhl. Each Dagstuhl Seminar is presented by a small abstract describing the contents and scientific highlights of the seminar as well as the perspectives or challenges of the research topic

    Virtual reality visual feedback and its effect on brain excitability

    This dissertation examines manipulation of visual feedback in virtual reality (VR) to increase excitability of distinct neural networks in the sensorimotor cortex. The objective is to explore neural responses to visual feedback of motor activities performed in complex virtual environments during functional magnetic resonance imaging (fMRI), and to identify sensory manipulations that could further optimize VR rehabilitation of persons with hemiparesis. In addition, the effects of VR therapy on brain reorganization are investigated. An MRI-compatible VR system is used to provide subjects with online visual feedback of their hand movement. First, the author develops a protocol to analyze variability in movement kinematics between experimental sessions and conditions and its possible effect on modulating neural activity. Second, brain reorganization after 2 weeks of robot-assisted VR therapy is examined in 10 chronic stroke subjects in terms of change in extent of activation, interhemispheric dominance, the connectivity network of ipsilesional primary motor cortex (iM1), and the interhemispheric interaction between iM1 and contralesional M1 (cM1). After training, brain activity during a simple paretic hand movement is re-localized in terms of bilateral change in activity or a shift of interhemispheric dominance (re-lateralization) toward the ipsilesional hemisphere that is positively correlated with improvement in clinical scores. Dynamic causal modeling (DCM) shows that interhemispheric coupling between the bilateral motor cortices tends to decrease after training and to correlate negatively with improvement in scores for clinical scales, and with the amount of re-lateralization. Third, the dissertation studies whether visual discordance in VR of finger movement would facilitate activity in select brain networks. In a study of 12 healthy subjects, the amplitude of finger movement is manipulated (hypometric feedback), resulting in higher activation of contralateral M1.
In a group of 11 stroke subjects, bidirectional (hypometric and hypermetric) VR visual discordance is used. Both feedback conditions cause a small increase in activity of the iM1 contralateral to movement and stronger recruitment of both posterior parietal cortices and the ipsilesional fusiform gyrus (iFBA). Fourth, the effect of mirrored visual feedback on the activity of the ipsilesional sensorimotor cortex of stroke subjects is examined. While subjects move the non-paretic hand during the fMRI experiment, they receive either veridical feedback of the movement or a mirrored feedback. The results show recruitment of iM1 and both posterior parietal cortices during the mirrored feedback. Effective connectivity analysis shows increased correlation of iM1 and contralesional SPL (cSPL) with iFBA, suggesting a role of the latter in the evaluation of feedback and in visuomotor processing. DCM analysis shows increased modulation of iM1 by the cSPL area during the mirrored feedback, an observation that demonstrates the influence of visual feedback on modulating primary motor cortex activation. This dissertation provides evidence that it is possible to enhance brain excitability through manipulation of virtual reality feedback and that brain reorganization can result from just two weeks of VR training. These findings should be exploited in the design of neuroscience-based rehabilitation protocols that could enhance brain reorganization and motor recovery.
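The feedback manipulations described in this abstract amount to transforming tracked movement before it is rendered on the virtual hand. A minimal sketch of that idea, assuming a simple multiplicative gain on tracked finger-joint angles (the function name, API, and parameters are illustrative assumptions, not the dissertation's actual VR system):

```python
def apply_visual_gain(tracked_angles, gain):
    """Scale a sequence of tracked finger-joint angles before they
    are rendered on the virtual hand.

    gain < 1  -> hypometric feedback (rendered movement is smaller)
    gain > 1  -> hypermetric feedback (rendered movement is larger)
    gain == 1 -> veridical feedback

    Illustrative sketch only; names are assumptions, not taken
    from the dissertation's code.
    """
    return [gain * angle for angle in tracked_angles]


# Example: hypometric feedback halves the rendered amplitude.
rendered = apply_visual_gain([10.0, 20.0, 30.0], gain=0.5)
# rendered == [5.0, 10.0, 15.0]
```

Mirrored feedback, by contrast, keeps the amplitude veridical but drives the contralateral virtual hand with the tracked movement of the non-paretic hand.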

    Eye quietness and quiet eye in expert and novice golf performance: an electrooculographic analysis

    Quiet eye (QE) is the final ocular fixation on the target of an action (e.g., the ball in golf putting). Camera-based eye-tracking studies have consistently found longer QE durations in experts than novices; however, the mechanisms underlying QE are not known. To offer a new perspective, we examined the feasibility of measuring the QE using electrooculography (EOG) and developed an index to assess ocular activity across time: eye quietness (EQ). Ten expert and ten novice golfers putted 60 balls to a 2.4 m distant hole. Horizontal EOG (2 ms resolution) was recorded from two electrodes placed on the outer sides of the eyes. QE duration was measured using an EOG voltage threshold and comprised the sum of the pre-movement and post-movement initiation components. EQ was computed as the standard deviation of the EOG in 0.5 s bins from –4 to +2 s, relative to backswing initiation: lower values indicate less movement of the eyes, hence greater quietness. Finally, we measured club-ball address and swing durations. T-tests showed that total QE did not differ between groups (p = .31); however, experts had marginally shorter pre-movement QE (p = .08) and longer post-movement QE (p < .001) than novices. A group × time ANOVA revealed that experts had less EQ before backswing initiation and greater EQ after backswing initiation (p = .002). QE durations were inversely correlated with EQ from –1.5 to 1 s (rs = –.48 to –.90, ps = .03 to .001). Experts had longer swing durations than novices (p = .01) and, importantly, swing durations correlated positively with post-movement QE (r = .52, p = .02) and negatively with EQ from 0.5 to 1 s (r = –.63, p = .003). This study demonstrates the feasibility of measuring ocular activity using EOG and validates EQ as an index of ocular activity. Its findings challenge the dominant perspective on QE and provide new evidence that expert-novice differences in ocular activity may reflect differences in the kinematics of how experts and novices execute skills.
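The EQ index as described (standard deviation of the horizontal EOG in 0.5 s bins over the –4 to +2 s window around backswing initiation) can be sketched as follows. A minimal sketch, assuming a 1-D EOG array whose first sample corresponds to –4 s; the 500 Hz sampling rate follows from the stated 2 ms resolution, and the function and variable names are illustrative, not from the study's own code:

```python
import numpy as np

def eye_quietness(eog, fs=500, t_start=-4.0, t_end=2.0, bin_s=0.5):
    """Eye quietness (EQ): SD of the horizontal EOG in 0.5 s bins.

    Lower values indicate less eye movement, hence greater quietness.
    `eog` is assumed to be a 1-D array aligned so that index 0 falls
    at `t_start` seconds relative to backswing initiation.
    (Illustrative sketch; names are assumptions, not the study's code.)
    """
    samples_per_bin = int(bin_s * fs)
    n_bins = int(round((t_end - t_start) / bin_s))  # 12 bins for -4..+2 s
    eq = np.empty(n_bins)
    for i in range(n_bins):
        segment = eog[i * samples_per_bin:(i + 1) * samples_per_bin]
        eq[i] = np.std(segment)  # one EQ value per 0.5 s bin
    return eq
```

On this definition, a bin containing a saccade (large voltage swing) yields a high EQ value, while steady fixation yields a low one, which is how the index separates pre- and post-backswing ocular activity in the analysis above.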

    Measuring, analysing and artificially generating head nodding signals in dyadic social interaction

    Social interaction involves rich and complex behaviours where verbal and non-verbal signals are exchanged in dynamic patterns. The aim of this thesis is to explore new ways of measuring and analysing interpersonal coordination as it naturally occurs in social interactions. Specifically, we want to understand what different types of head nods mean in different social contexts, how they are used during face-to-face dyadic conversation, and whether they relate to memory and learning. Many current methods are limited by time-consuming and low-resolution data, which cannot capture the full richness of a dyadic social interaction. This thesis explores ways to demonstrate how high-resolution data in this area can give new insights into the study of social interaction. Furthermore, we also want to demonstrate the benefit of using virtual reality to artificially generate interpersonal coordination to test our hypotheses about the meaning of head nodding as a communicative signal. The first study aims to capture two patterns of head nodding signals – fast nods and slow nods – and determine what they mean and how they are used across different conversational contexts. We find that fast nodding signals receiving new information and has a different meaning than slow nods. The second study aims to investigate a link between memory and head nodding behaviour. This exploratory study provided initial hints that there might be a relationship, though further analyses were less clear. In the third study, we aim to test if interactive head nodding in virtual agents can be used to measure how much we like the virtual agent, and whether we learn better from virtual agents that we like. We find no causal link between memory performance and interactivity. In the fourth study, we perform a cross-experimental analysis of how the level of interactivity in different contexts (i.e., real, virtual, and video) impacts memory, and find clear differences between them.