1,021 research outputs found

    Spontaneous Intrapersonal Synchrony and the Effect of Cognitive Load

    Spontaneous intrapersonal synchronization is the spontaneous synchronization of periodic behaviors produced within a single individual. It has been investigated less than spontaneous interpersonal synchronization, in which periodic behaviors produced by different individuals synchronize spontaneously because the exchange of sensory feedback couples the individuals and integrates them into a single system. We therefore hypothesized that periodic behaviors produced by one individual, who is a single system by default, would be more synchronous when produced simultaneously than when produced separately, owing to the exchange of sensory feedback, coupling, and integration within the individual. Based on the postulate that spontaneous interpersonal synchronization is a resource-conserving strategy of the brain, which predicts that individuals under high cognitive load will spontaneously synchronize their behaviors with others to conserve resources, we further hypothesized that spontaneous intrapersonal synchronization would increase under additional cognitive load. We tested these hypotheses in two experiments, each using a different pair of periodic tasks and a different cognitive load task. In each experiment we compared the phase coherence of two periodic tasks, tapping-walking or tapping-ticking, when produced by an individual simultaneously versus separately, and when produced simultaneously with versus without an additional cognitive load. Ticking was a periodic task in which the word "tick" was uttered repetitively; counting backwards and visual pattern matching served as cognitive load tasks. Results showed that spontaneous intrapersonal synchronization between the periodic tasks was higher when they were produced simultaneously than separately, and lower with the additional cognitive load than without it.
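
    The phase-coherence comparison described above can be illustrated with a short sketch. The snippet below is a hypothetical Python example rather than the authors' analysis code: it extracts instantaneous phase from two periodic signals with a Hilbert transform and computes the mean resultant length of their relative phase, a standard coherence measure ranging from 0 (no phase locking) to 1 (perfect locking).

```python
import numpy as np
from scipy.signal import hilbert

def phase_coherence(x, y):
    """Mean resultant length of the relative phase between two signals:
    ~0 for unrelated phases, ~1 for tight phase locking."""
    phase_x = np.angle(hilbert(x - np.mean(x)))
    phase_y = np.angle(hilbert(y - np.mean(y)))
    return np.abs(np.mean(np.exp(1j * (phase_x - phase_y))))

# Toy example: two ~2 Hz movements, the second with slowly drifting phase.
fs = 100
t = np.arange(0, 30, 1 / fs)
tapping = np.sin(2 * np.pi * 2.0 * t)
drift = 0.05 * np.cumsum(np.random.randn(t.size))
walking = np.sin(2 * np.pi * 2.0 * t + drift)
print(f"phase coherence: {phase_coherence(tapping, walking):.2f}")
```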

    Analysing multi-person timing in music and movement : event based methods

    Accurate timing of movement in the hundreds-of-milliseconds range is a hallmark of human activities such as music and dance. Its study requires accurate measurement of the times of events (often called responses) based on the movement or acoustic record. This chapter provides a comprehensive overview of methods developed to capture, process, analyse, and model individual and group timing [...] The chapter is structured in five main sections, as follows. We start with a review of data capture methods, working in turn through a low-cost system for researching simple tapping, complex movements, the use of video, inertial measurement units, and dedicated sensorimotor synchronisation software. This is followed by a section on music performance, which covers the selection of music materials, sound recording, and system latency. The identification of events in the data stream can be challenging, and this topic is treated in the next section, first for movement and then for music. Finally, we cover methods of analysis, including alignment of the channels, computation of between-channel asynchrony errors, and modelling of the data set.
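
    As a rough illustration of the event-based analysis described in the chapter, the sketch below (an assumed example, not code from the chapter) aligns two channels of event times by nearest-neighbour matching and reports the mean and standard deviation of the between-channel asynchronies.

```python
import numpy as np

def asynchronies(events_a, events_b, max_lag=0.25):
    """Pair each event in channel A with the nearest event in channel B
    and return the signed asynchronies (A minus B) in seconds.
    Pairs further apart than max_lag are treated as missed events."""
    events_b = np.asarray(events_b)
    out = []
    for t in events_a:
        nearest = events_b[np.argmin(np.abs(events_b - t))]
        if abs(t - nearest) <= max_lag:
            out.append(t - nearest)
    return np.array(out)

# Toy example: a metronome at 0.5 s intervals and slightly anticipatory taps.
metronome = np.arange(0, 10, 0.5)
taps = metronome - 0.03 + 0.01 * np.random.randn(metronome.size)
asyncs = asynchronies(taps, metronome)
print(f"mean asynchrony: {asyncs.mean() * 1000:.1f} ms, SD: {asyncs.std() * 1000:.1f} ms")
```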

    Songs Search Using Human Humming Voice

    The system is developed to find songs stored in a database using a human humming voice: a sample of the humming voice is compared against the songs stored in the system. The main function of the system is to find a song simply by humming its melody. The scope of the project covers the human humming voice, voice capture in WAV format, the songs database, and a MIDI file comparison algorithm. The methodology follows system analysis and design, comprising planning, analysis, design, and implementation. The Java programming language is used to build the system. The system provides humming voice recording and comparison algorithms that match the humming voice against the song files in the system to find the right song. The intended result is to display the titles of the matching songs together with the similarity percentage between the hummed melody and the songs in the system.
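
    The comparison step can be sketched in a few lines. The example below is hypothetical, and is in Python rather than the Java used by the system described above: it reduces both the hummed query and a stored melody to a pitch-interval contour and scores similarity with a normalised edit distance, a common approach in query-by-humming systems.

```python
def contour(midi_pitches):
    """Reduce a pitch sequence to up/down/same interval symbols."""
    return ["U" if b > a else "D" if b < a else "S"
            for a, b in zip(midi_pitches, midi_pitches[1:])]

def edit_distance(a, b):
    """Classic Levenshtein distance between two symbol sequences."""
    d = [[i + j if i * j == 0 else 0 for j in range(len(b) + 1)]
         for i in range(len(a) + 1)]
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            d[i][j] = min(d[i - 1][j] + 1, d[i][j - 1] + 1,
                          d[i - 1][j - 1] + (a[i - 1] != b[j - 1]))
    return d[len(a)][len(b)]

def similarity(query_pitches, song_pitches):
    """Similarity percentage between a hummed query and a stored melody."""
    q, s = contour(query_pitches), contour(song_pitches)
    return 100.0 * (1 - edit_distance(q, s) / max(len(q), len(s), 1))

# Toy example: a hummed query with one mis-sung note against a stored melody.
song = [60, 62, 64, 65, 67, 65, 64, 62, 60]
hum = [60, 62, 64, 66, 67, 65, 64, 62, 60]
print(f"similarity: {similarity(hum, song):.0f}%")
```

    Note that contour matching deliberately ignores interval size, so a slightly mis-sung note that preserves the melodic direction still yields a high similarity percentage.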

    Care-Chair: Opportunistic health assessment with smart sensing on chair backrest

    A vast majority of the population spend most of their time in a sedentary position, which potentially makes a chair a huge source of information about a person's daily activity. This information, which is often ignored, can reveal important health data, but the overhead and time needed to track a person's daily activity are a major hurdle. Considering this, a simple and cost-efficient sensory system named Care-Chair, with four square force-sensitive resistors on the backrest of a chair, has been designed to collect the activity details and breathing rate of its users. The Care-Chair system is an opportunistic environmental sensor that can track the activity of its occupant without any human intervention. It is specifically designed and tested for elderly people and people with sedentary jobs. The system was tested with data from 5 users for sedentary activity classification and successfully classified 18 activities in a laboratory environment with 86% accuracy. In another experiment on breathing rate detection with data from 19 users, Care-Chair produced results with only slight variance from the ground-truth breathing rate. Care-Chair also yields contextually good results when tested in an uncontrolled environment on single-user data collected during long hours of study.
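
    One plausible way to implement the breathing-rate part of such a system (a sketch under assumed parameters, not the Care-Chair algorithm itself) is to band-pass the summed force-sensor signal around typical respiration frequencies and pick the dominant spectral peak:

```python
import numpy as np
from scipy.signal import butter, filtfilt

def breathing_rate_bpm(fsr_sum, fs):
    """Estimate breathing rate (breaths per minute) from the summed
    backrest force signal, assuming respiration lies in 0.1-0.5 Hz."""
    b, a = butter(2, [0.1, 0.5], btype="bandpass", fs=fs)
    filtered = filtfilt(b, a, fsr_sum - np.mean(fsr_sum))
    spectrum = np.abs(np.fft.rfft(filtered))
    freqs = np.fft.rfftfreq(filtered.size, d=1 / fs)
    band = (freqs >= 0.1) & (freqs <= 0.5)
    return 60.0 * freqs[band][np.argmax(spectrum[band])]

# Toy example: a 0.25 Hz (15 breaths/min) respiration component plus noise.
fs = 20.0
t = np.arange(0, 120, 1 / fs)
signal = np.sin(2 * np.pi * 0.25 * t) + 0.3 * np.random.randn(t.size)
print(f"estimated breathing rate: {breathing_rate_bpm(signal, fs):.1f} bpm")
```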

    Advanced and natural interaction system for motion-impaired users

    Human-computer interaction is an important area that searches for better and more comfortable systems to promote communication between humans and machines. Vision-based interfaces can offer a more natural and appealing way of communicating, and they can help with the e-accessibility component of e-inclusion. The aim is to develop a usable system, that is, one whose end-users consider it effective, efficient, and satisfactory. The research's main contribution is SINA, a hands-free interface based on computer vision techniques for motion-impaired users. This interface does not require the user to move the upper limbs, as only nose motion is considered. Beyond the technical aspect, user satisfaction with an interface is a critical issue; the approach we have adopted is to integrate usability evaluation at relevant points of the software development.
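
    A minimal sketch of the underlying idea (hypothetical, not the SINA implementation): track a single point assumed to lie on the nose across webcam frames with pyramidal Lucas-Kanade optical flow and report its displacement, which a real interface would map onto cursor movement.

```python
import cv2
import numpy as np

# Hypothetical sketch: track one point assumed to start on the nose.
cap = cv2.VideoCapture(0)
ok, prev = cap.read()
if not ok:
    raise SystemExit("no camera available")
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
h, w = prev_gray.shape
point = np.array([[[w / 2, h / 2]]], dtype=np.float32)  # frame centre

for _ in range(300):  # a few seconds of tracking
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    new_point, status, _err = cv2.calcOpticalFlowPyrLK(prev_gray, gray, point, None)
    if status[0][0] == 1:
        dx, dy = (new_point - point).ravel()
        # A real interface would translate (dx, dy) into relative cursor motion.
        print(f"cursor delta: ({dx:+.1f}, {dy:+.1f}) px")
        point, prev_gray = new_point, gray

cap.release()
```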

    Energy flows in gesture-speech physics: The respiratory-vocal system and its coupling with hand gestures

    Expressive moments in communicative hand gestures often align with emphatic stress in speech. It has recently been found that acoustic markers of emphatic stress arise naturally during steady-state phonation when upper-limb movements impart physical impulses on the body, most likely affecting acoustics via respiratory activity. In this confirmatory study, participants (N = 29) repeatedly uttered consonant-vowel (/pa/) mono-syllables while moving in particular phase relations with speech, or while not moving the upper limbs. The study shows that respiration-related activity is affected by (especially high-impulse) gesturing when vocalizations occur near peaks in physical impulse. It further shows that gesture-induced moments of bodily impulse increase the amplitude envelope of speech, while not similarly affecting the fundamental frequency (F0). Finally, tight relations between respiration-related activity and vocalization were observed even in the absence of movement, but more so when upper-limb movement was present. The current findings expand a developing line of research showing that speech is modulated by functional biomechanical linkages between hand gestures and the respiratory system. This identification of gesture-speech biomechanics promises to provide an alternative phylogenetic, ontogenetic, and mechanistic explanatory route for why communicative upper-limb movements co-occur with speech in humans.
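
    The acoustic measures mentioned above, the amplitude envelope and F0, can be extracted with standard signal-processing tools. The following is a hedged sketch rather than the authors' pipeline: the envelope comes from the smoothed magnitude of the analytic signal, and F0 from an autocorrelation peak within a typical speech pitch range.

```python
import numpy as np
from scipy.signal import hilbert

def amplitude_envelope(x, fs, smooth_hz=10.0):
    """Smoothed magnitude of the analytic signal (moving-average low-pass)."""
    env = np.abs(hilbert(x))
    win = max(1, int(fs / smooth_hz))
    return np.convolve(env, np.ones(win) / win, mode="same")

def estimate_f0(x, fs, fmin=75.0, fmax=400.0):
    """Crude autocorrelation-based F0 estimate within a speech pitch range."""
    x = x - np.mean(x)
    ac = np.correlate(x, x, mode="full")[x.size - 1:]
    lo, hi = int(fs / fmax), int(fs / fmin)
    lag = lo + np.argmax(ac[lo:hi])
    return fs / lag

# Toy example: a 150 Hz "phonation" with a slow amplitude modulation.
fs = 16000
t = np.arange(0, 1.0, 1 / fs)
voice = (1 + 0.5 * np.sin(2 * np.pi * 3 * t)) * np.sin(2 * np.pi * 150 * t)
env = amplitude_envelope(voice, fs)
print(f"estimated F0: {estimate_f0(voice, fs):.1f} Hz")
print(f"envelope range: {env.min():.2f} to {env.max():.2f}")
```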

    Facilitating joint attention with salient pointing in interactions involving children with autism spectrum disorder

    Children with autism spectrum disorder (ASD) reportedly have difficulties in responding to bids for joint attention, notably in following pointing gestures. Previous studies have predominantly built on structured observation measures and predefined coding categories to measure children's responsiveness to gestures. However, how these gestures are designed and what detailed interactional work they can accomplish have received less attention. In this paper, we use a multimodal approach to conversation analysis (CA) to investigate how educators design their use of pointing in interactions involving school-aged children with ASD or autistic features. The analysis shows that pointing had specific sequential implications for the children beyond mere attention sharing. Occasionally, the co-occurring talk and pointing led to ambiguities when a child was interpreting their interactional connotations, specifically when the pointing gesture lacked salience. The study demonstrates that the CA approach can increase understanding of how to facilitate the establishment of joint attention.

    Temporal expectancies driven by self- and externally generated rhythms

    The dynamic attending theory proposes that rhythms entrain periodic fluctuations of attention which modulate the gain of sensory input. However, temporal expectancies can also be driven by the mere passage of time (the foreperiod effect). It is currently unknown how these two types of temporal expectancy relate to each other, i.e. whether they work in parallel and have distinguishable neural signatures. The current research addresses this issue. Participants either tapped a 1 Hz rhythm (active task) or were passively presented with the same rhythm through tactile stimulators (passive task). Based on this rhythm, an auditory target was then presented early, in synchrony, or late. Behavioural results were in line with the dynamic attending theory, as reaction times were faster for in-synchrony than for out-of-synchrony targets. Electrophysiological results suggested that self-generated and externally induced rhythms entrain neural oscillations in the delta frequency band. Auditory ERPs showed evidence of two distinct temporal expectancy processes: both tasks demonstrated a pattern following a linear foreperiod effect, but in the active task we also observed an ERP effect consistent with the dynamic attending theory. This study shows that temporal expectancies generated by a rhythm and expectancies generated by the mere passage of time can work in parallel, and it sheds light on how these mechanisms are implemented in the brain.
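
    Delta-band entrainment of the kind reported here is commonly quantified by band-pass filtering the EEG around the stimulation rate and measuring phase consistency across trials. The snippet below is a speculative sketch of such an analysis (inter-trial phase coherence around 1 Hz), not the authors' pipeline; the trial structure and parameters are assumed.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def intertrial_phase_coherence(trials, fs, band=(0.5, 2.0)):
    """Inter-trial phase coherence of delta-band activity.

    trials: array of shape (n_trials, n_samples) for one EEG channel.
    Returns ITC per sample, between 0 (random phase) and 1 (identical phase).
    """
    b, a = butter(2, band, btype="bandpass", fs=fs)
    filtered = filtfilt(b, a, trials, axis=1)
    phases = np.angle(hilbert(filtered, axis=1))
    return np.abs(np.mean(np.exp(1j * phases), axis=0))

# Toy example: 40 trials of a phase-consistent 1 Hz component plus noise.
fs, n_trials = 250, 40
t = np.arange(0, 4.0, 1 / fs)
trials = np.array([np.cos(2 * np.pi * 1.0 * t) + np.random.randn(t.size)
                   for _ in range(n_trials)])
itc = intertrial_phase_coherence(trials, fs)
print(f"mean delta-band ITC: {itc.mean():.2f}")
```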
