
    Co-adaptive multimodal interface guided by real-time multisensory stochastic feedback

    In this work, we present new data types, analytics, and human-computer interfaces as a platform enabling a new type of co-adaptive behavioural analysis to track neuroplasticity. We present seven works, each a step towards an interface that "collaborates" in a closed loop with the sensory-motor system in order to augment existing sensations or substitute lost ones. Such interfaces are beneficial because they can adapt and evolve according to each participant's rate of adaptation and preferences, ultimately steering the system towards favorable regimes.

    We started by addressing the question: "how does our sensory-motor system learn and adapt to novel changes?". In a pointing task, subjects had to discover and learn a repeating sequence of points presented on the screen and familiarise themselves with an unpredicted event that occurred occasionally. In this first study, we examined the learnability of the motor system across seven individuals and investigated each individual's learning patterns.

    We then explored how other bodily signals, such as temperature, affect movement, conducting two studies. In the first, we examined the impact of temperature range on the quality of the performed movement in 40 individuals: 20 patients with schizophrenia, a population known to have temperature irregularities, and 20 controls. We identified differences between the two populations in their temperature ranges and in the stochastic signatures of their kinematic data (see the sketch below). To look more closely at the relation between movement and temperature, we conducted a second study using data from a pre-professional ballet student recorded during her 6-hour training session and her follow-up sleep. For this study, we designed a new data type that lets us examine movement as a function of temperature and see how each degree of temperature impacts the fluctuations in movement; this data structure could be used to integrate any bodily signal.

    Next, we identified the need for visualization tools that can display, in real time, sensory information extracted from the analyses in a form that is informative to the participant. Such tools could be used in a vision-driven co-adaptive interface. For this reason, we designed a MATLAB-based avatar that lets us color-code sensory information onto the corresponding body parts of the participant.

    In our next study, we examined two college-age individuals (a control and an individual with Asperger syndrome) across sensory modalities and preferences. We built methods to extract from the motor stream each individual's preferred sensory modality (selectivity) and the settings of that modality that motivate the system to perform at its best (preferability). These two parameters were critical to finally closing the loop by letting the system decide based on individual preferences.

    We therefore moved from the open-loop approach, to which all the studies described so far belong, to the closed-loop approach. First, we studied a natural closed-loop interface established by the dyadic interaction of two ballet dancers while rehearsing. In this natural paradigm, closed-loop co-adaptation happens through the touches and pushes the dancers apply to each other in order to coordinate: a kinesthetic adaptation.
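Several of the analyses above rely on "stochastic signatures" of kinematic fluctuations. Below is a minimal sketch of one common way such signatures are estimated, assuming the micro-movement approach of normalizing velocity peaks and fitting a Gamma distribution; the function names and the 100-sample local window are illustrative, not the thesis's actual pipeline.

```python
import numpy as np
from scipy.signal import find_peaks
from scipy.stats import gamma

def micro_movement_spikes(speed):
    """Normalize each speed peak to a unitless spike in (0, 1)."""
    peaks, _ = find_peaks(speed)
    spikes = []
    for p in peaks:
        lo, hi = max(0, p - 50), min(len(speed), p + 50)  # local window (assumed)
        local_avg = speed[lo:hi].mean()
        spikes.append(speed[p] / (speed[p] + local_avg))
    return np.asarray(spikes)

def stochastic_signature(speed):
    """Fit a Gamma distribution to the spikes; (shape, scale) is the signature."""
    shape, _, scale = gamma.fit(micro_movement_spikes(speed), floc=0)
    return shape, scale
```

Tracking how the fitted (shape, scale) point drifts over time, or across conditions such as temperature, is what allows the comparisons between populations described above.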
    To quantify this kinesthetic co-adaptation, we applied network connectivity metrics and extracted information such as underlying synergies and leading and lagging body parts, to name a few. Such tools could be used in a vision-driven co-adaptive interface to evaluate the interaction between the participant and a displayed avatar.

    Finally, we built an artificial audio-driven co-adaptive interface that can track the adaptation and progress of the individual and intelligently steer the system towards the participant's preferred and motivating conditions. In this study, we used the heart rate of a salsa dancer to adjust the tempo of the music. The study showed that such a system can steer the stochastic signatures even of the heart (an autonomic signal), providing strong evidence that we can guide the human system towards desired regimes.

    Ph.D. thesis; includes bibliographical references. By Vilelmini Kalampratsidou.
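The audio-driven loop described in this thesis could be illustrated with a simple proportional controller; the target heart rate, gain, and tempo bounds below are hypothetical placeholders, not values from the study.

```python
import numpy as np

def update_tempo(tempo_bpm, recent_hr_bpm, target_hr=120.0, gain=0.05):
    """Nudge the music tempo so the dancer's heart rate tracks a target:
    slow the music when the heart races, speed it up when the rate falls."""
    error = target_hr - np.mean(recent_hr_bpm)  # e.g. last ~30 s of heart rate
    return float(np.clip(tempo_bpm + gain * error, 80.0, 200.0))

# Usage: call once per musical bar, e.g. tempo = update_tempo(tempo, hr_window)
```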

    Peripheral Network Connectivity Analyses for the Real-Time Tracking of Coupled Bodies in Motion

    Dyadic interactions are ubiquitous in our lives, yet they are highly challenging to study. Many subtle aspects of the coupled bodily dynamics continuously unfolding during such exchanges have not been empirically parameterized. As such, we have no formal statistical methods to describe the spontaneously self-emerging coordinating synergies within each actor’s body and across the dyad. Such cohesive motion patterns self-emerge and dissolve largely beneath the awareness of the actors and the observers. Consequently, hand-coding methods may miss latent aspects of the phenomena. The present paper addresses this gap and provides new methods to quantify the moment-by-moment evolution of self-emerging cohesiveness during highly complex ballet routines. We use weighted directed graphs to represent the dyads as dynamically coupled networks unfolding in real time, with activities captured by a grid of wearable sensors distributed across the dancers’ bodies. We introduce new visualization tools, signal parameterizations, and a statistical platform that integrates connectivity metrics with stochastic analyses to automatically detect coordination patterns and self-emerging cohesive coupling as they unfold in real time. Potential applications of these new techniques are discussed in the context of personalized medicine, basic research, and the performing arts.
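A minimal sketch of the kind of network representation described above, assuming each wearable sensor yields a 1-D activity time series of equal length; the lead/lag convention taken from the peak of the cross-correlation, the sensor names, and the sampling rate are illustrative, not the paper's exact parameterization.

```python
import numpy as np
import networkx as nx

def coupled_body_network(signals, fs=128.0):
    """Build a weighted directed graph from sensor time series.

    signals: dict mapping sensor name (e.g. 'dancerA_left_wrist') to a
    1-D numpy array; edges point from the leading to the lagging sensor.
    """
    G = nx.DiGraph()
    names = list(signals)
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            x = signals[a] - signals[a].mean()
            y = signals[b] - signals[b].mean()
            xc = np.correlate(x, y, mode="full")
            lag = int(np.argmax(np.abs(xc))) - (len(y) - 1)  # lag < 0: a precedes b
            weight = float(np.max(np.abs(xc)) / (x.std() * y.std() * len(x)))
            src, dst = (a, b) if lag <= 0 else (b, a)
            G.add_edge(src, dst, weight=weight, lag_s=abs(lag) / fs)
    return G

# Leading body parts can then be read off as the nodes with the largest
# weighted out-degree:
# leaders = sorted(G.out_degree(weight="weight"), key=lambda t: -t[1])
```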