    Stochastic signatures of involuntary head micro-movements can be used to classify females of ABIDE into different subtypes of neurodevelopmental disorders.

    © 2017 Torres, Mistry, Caballero and Whyatt. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY).

    Background: The approximate 5:1 male-to-female ratio in clinical detection of Autism Spectrum Disorder (ASD) prevents research from characterizing the female phenotype. Current open-access repositories [such as those in the Autism Brain Imaging Data Exchange (ABIDE I-II)] contain large numbers of females to help begin providing a new characterization of females on the autistic spectrum. Here we introduce new methods to integrate, in a scale-free manner, data from continuous biophysical rhythms of the nervous systems and from discrete (ordinal) observational scores.

    Methods: New data-types derived from image-based involuntary head motions and a personalized statistical platform were combined with a data-driven approach to unveil sub-groups within the female cohort. Further, to help refine the clinical DSM-based ASD vs. Asperger's Syndrome (AS) criteria, distributional analyses of ordinal score data from Autism Diagnostic Observation Schedule (ADOS)-based criteria were applied to both the female and the male phenotypes.

    Results: Separate clusters were automatically uncovered in the female cohort, corresponding to differential levels of severity. Specifically, the AS-subgroup emerged as the most severely affected, with an excess level of noise and randomness in the involuntary head micro-movements. Extending the methods to the males of ABIDE revealed ASD-males to be more affected than AS-males. A thorough study of ADOS-2 and ADOS-G scores produced contradictory results for the ASD vs. AS male comparison: the ADOS-2 rendered the AS-phenotype worse off than the ASD-phenotype, while the ADOS-G flipped the results. Females with AS scored higher on severity than ASD-females in all ADOS test versions, and their scores provided evidence of significantly higher severity than in males. However, the statistical landscapes underlying the female and male scores appeared disparate. As such, further interpretation of the ADOS data seems problematic, suggesting instead the critical need to develop an entirely new metric to measure social behavior in females.

    Conclusions: According to the outcome of objective, data-driven analyses and subjective clinical observation, these results support the proposition that the female phenotype is different. Consequently, the "social behavioral male ruler" will continue to mask the female autistic phenotype. It is our proposition that new observational behavioral tests ought to contain normative scales, be statistically sound, and be combined with objective data-driven approaches to better characterize females across the human lifespan.

    Peer reviewed
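The "stochastic signatures" of head micro-movements are described here only at a high level. In the authors' related work such signatures are commonly obtained by normalizing speed fluctuations into unitless "micro-movement spikes" and fitting a continuous Gamma distribution, whose parameters then separate sub-groups. The sketch below assumes that approach; the function names, the normalization window, and the synthetic data are illustrative, not taken from the paper:

```python
import numpy as np
from scipy import stats

def micro_movement_spikes(speed):
    """Normalize local speed peaks relative to surrounding activity
    (a hypothetical simplification of the micro-movement-spike idea)."""
    peaks = [i for i in range(1, len(speed) - 1)
             if speed[i] > speed[i - 1] and speed[i] > speed[i + 1]]
    spikes = []
    for i in peaks:
        local_avg = speed[max(0, i - 5):i + 6].mean()
        spikes.append(speed[i] / (speed[i] + local_avg))  # bounded in (0, 1)
    return np.array(spikes)

def gamma_signature(spikes):
    """Fit a Gamma distribution to the spikes; the (shape, scale) pair is
    the 'stochastic signature'.  Lower shape values indicate a more skewed,
    random regime; scale tracks the noise level."""
    shape, _, scale = stats.gamma.fit(spikes, floc=0)
    return shape, scale

rng = np.random.default_rng(0)
speed = np.abs(rng.normal(1.0, 0.3, 2000))  # synthetic head-speed trace
shape, scale = gamma_signature(micro_movement_spikes(speed))
```

Sub-groups such as those reported for the female cohort would then emerge by clustering individuals' (shape, scale) points on the Gamma parameter plane.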

    Co-adaptive multimodal interface guided by real-time multisensory stochastic feedback

    In this work, we present new data-types, analytics, and human-computer interfaces as a platform enabling a new type of co-adaptive behavioural analysis to track neuroplasticity. We present seven different works, all of which are steps towards an interface that "collaborates" in a closed loop with the sensory-motor system in order to augment existing sensations or substitute lost ones. Such interfaces are beneficial because they adapt and evolve based on the participant's rate of adaptation and preferences, ultimately steering the system towards favorable regimes.

    We started by addressing the question: "how does our sensory-motor system learn and adapt to changes?" In a pointing task, subjects had to discover and learn the sequence of points presented on the screen (which was repetitive) and familiarise themselves with a non-predicted event (which occurred occasionally). In this first study, we examined the learnability of the motor system across seven individuals and investigated the learning patterns of each one.

    We then explored how other bodily signals, such as temperature, affect movement, and conducted two studies. In the first, we examined the impact of temperature range on the quality of the performed movement in 40 individuals: 20 patients with schizophrenia, known to have temperature irregularities, and 20 controls. We identified differences between the two populations in their temperature ranges and in the stochastic signatures of their kinematic data. To look more closely at the relation between movement and temperature, we conducted a second study using data from a pre-professional ballet student recorded during her 6-h training session and her follow-up sleep.
    For this study, we designed a new data-type that allows us to examine movement as a function of temperature and to see how each degree of temperature impacts the fluctuations in movement. This new data structure could be used to integrate any bodily signal.

    Next, we identified the need for visualization tools that could picture, in real time, sensory information extracted from the analysis in a way that is informative to the participant. Such tools could be used in a vision-driven co-adaptive interface. For this reason, we designed a MATLAB avatar that lets us color-code sensory information onto the corresponding body parts of the participant.

    In our next study, we examined two college-age individuals (a control and an individual with Asperger's syndrome) across sensory modalities and preferences. We built methods to extract from the motor stream each individual's preferred sensory modality ("selectivity") and the preferences within that modality that motivate the system to perform at its best ("preferability"). These two parameters were critical to finally closing the loop by letting the system decide upon the individual's preferences.

    We therefore moved from the open-loop approach, to which all the studies described so far belong, to the closed-loop approach. First, we studied a natural closed-loop interface established by the dyadic interaction of two ballet dancers while rehearsing. In this natural paradigm, closed-loop co-adaptation happens through the touches and pushes that the dancers apply to each other in order to coordinate (kinesthetic adaptation). We applied network-connectivity metrics and extracted information such as underlying synergies and leading and lagging body parts, to name a few. Such tools could be used in vision-driven co-adaptive interfaces to evaluate the interaction between the participant and a displayed avatar.
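The thesis does not spell out the structure of the movement-versus-temperature data-type. One plausible reading is a structure that bins movement fluctuations by integer degree of temperature, so each degree can be summarized separately; a minimal sketch under that assumption (the fluctuation measure, smoothing window, and synthetic data are all hypothetical):

```python
import numpy as np
from collections import defaultdict

def movement_by_temperature(temps, speeds):
    """Group movement-speed fluctuations by integer degree of temperature.
    Fluctuations are taken as deviations from a running-mean speed (a
    hypothetical stand-in for the thesis's fluctuation measure)."""
    running_mean = np.convolve(speeds, np.ones(25) / 25, mode="same")
    fluctuations = speeds - running_mean
    bins = defaultdict(list)
    for t, f in zip(temps, fluctuations):
        bins[int(round(t))].append(f)
    # summarize each degree by the spread of its fluctuations
    return {deg: float(np.std(vals)) for deg, vals in sorted(bins.items())}

rng = np.random.default_rng(1)
temps = 36.0 + np.cumsum(rng.normal(0, 0.01, 5000))  # slow temperature drift
speeds = np.abs(rng.normal(1.0, 0.2, 5000))          # synthetic speed trace
profile = movement_by_temperature(temps, speeds)
```

Because the binning key is just a sampled bodily signal, the same container would serve for integrating heart rate, respiration, or any other channel, as the text suggests.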
    Finally, we built an artificial audio-driven co-adaptive interface that can track the adaptation and progress of the individual and intelligently steer the system towards the participant's preferred and motivating conditions. For this study, we used the heart rate of a salsa dancer to adjust the tempo of the music. The study showed that such a system can steer the stochastic signatures even of the heart (an autonomic signal), providing strong evidence that we can guide the human system towards desired regimes.

    Ph.D. thesis. Includes bibliographical references. By Vilelmini Kalampratsido
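The audio-driven closed loop, in which heart rate adjusts the music tempo, could in its simplest form be a proportional controller. The thesis does not specify the control law, so the sketch below is purely illustrative: the gain, tempo bounds, target heart rate, and the toy physiological response are all assumptions.

```python
def adjust_tempo(tempo_bpm, heart_rate, target_hr, gain=0.1,
                 lo=80.0, hi=180.0):
    """One step of a proportional controller: nudge the music tempo toward
    whatever drives heart rate to the target zone (hypothetical sketch of
    the audio-driven co-adaptive loop; gain and bounds are illustrative)."""
    error = target_hr - heart_rate
    tempo_bpm += gain * error           # HR below target -> speed music up
    return min(hi, max(lo, tempo_bpm))  # keep tempo in a danceable range

# simulate the loop: heart rate lags tempo, and the controller steers
# the dancer's heart rate toward the target zone
tempo, hr, target = 120.0, 95.0, 130.0
for _ in range(200):
    tempo = adjust_tempo(tempo, hr, target)
    hr += 0.05 * (0.9 * tempo - hr)     # toy physiological response
```

A real implementation would replace the toy response with live heart-rate samples from a wearable sensor, and could adapt the target itself using the "preferability" parameter described earlier.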

    Peripheral Network Connectivity Analyses for the Real-Time Tracking of Coupled Bodies in Motion

    Dyadic interactions are ubiquitous in our lives, yet they are highly challenging to study. Many subtle aspects of the coupled bodily dynamics continuously unfolding during such exchanges have not been empirically parameterized. As such, we have no formal statistical methods to describe the spontaneously self-emerging coordinating synergies within each actor’s body and across the dyad. Such cohesive motion patterns self-emerge and dissolve largely beneath the awareness of the actors and the observers. Consequently, hand-coding methods may miss latent aspects of the phenomena. The present paper addresses this gap and provides new methods to quantify the moment-by-moment evolution of self-emerging cohesiveness during highly complex ballet routines. We use weighted directed graphs to represent the dyads as dynamically coupled networks unfolding in real time, with activities captured by a grid of wearable sensors distributed across the dancers’ bodies. We introduce new visualization tools, signal parameterizations, and a statistical platform that integrates connectivity metrics with stochastic analyses to automatically detect coordination patterns and self-emerging cohesive coupling as they unfold in real time. Potential applications of these new techniques are discussed in the context of personalized medicine, basic research, and the performing arts.
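One common way to build such weighted directed graphs is from lagged cross-correlations between sensor signals, with edge direction encoding which body part leads. The paper's actual connectivity metric is not given in this abstract, so the sketch below is an assumption, and all names and the two-sensor example are illustrative:

```python
import numpy as np

def coupling_graph(signals, max_lag=10):
    """Weighted directed adjacency matrix between body sensors.
    Edge i -> j is the peak lagged cross-correlation where i leads j
    (a minimal sketch; the paper's metric may differ)."""
    n = len(signals)
    A = np.zeros((n, n))
    z = [(s - s.mean()) / s.std() for s in signals]  # z-score each channel
    T = len(z[0])
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            # correlate i at time t with j at time t + lag (i leads j)
            A[i, j] = max(
                np.dot(z[i][: T - lag], z[j][lag:]) / (T - lag)
                for lag in range(1, max_lag + 1)
            )
    return A

def leading_score(A):
    """Out-strength minus in-strength: positive means the node tends to lead."""
    return A.sum(axis=1) - A.sum(axis=0)

rng = np.random.default_rng(2)
src = rng.normal(size=500)
follower = np.roll(src, 3) + 0.1 * rng.normal(size=500)  # lags src by 3 samples
A = coupling_graph([src, follower])
```

Run over a sliding window of the wearable-sensor grid, the per-window adjacency matrices would give the moment-by-moment picture of leading and lagging body parts that the abstract describes.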