
    Frequency shifting approach towards textual transcription of heartbeat sounds

    Auscultation is an approach for diagnosing many cardiovascular problems. Automatic analysis of heartbeat sounds and extraction of their audio features can assist physicians in diagnosing disease. Textual transcription allows a continuous heart-sound stream to be recorded in a text format that requires very little memory compared with audio formats. In addition, text-based data allows indexing and searching techniques to be applied for access to critical events. Hence, transcribed heartbeat sounds provide useful information for monitoring a patient's condition over long periods of time. This paper proposes a frequency shifting method to improve transcription performance. The main objective of this study is to transfer heartbeat sounds into the music domain. The proposed technique is tested on 100 samples recorded from different heart-disease categories. The observed results show that the proposed shifting method significantly improves transcription performance.
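    The abstract does not specify the shifting algorithm, so the sketch below is only one plausible reading: single-sideband frequency shifting via the analytic signal, moving low-frequency heart-sound energy up into a musical pitch range. The 40 Hz test tone, the 400 Hz shift, and the sampling rate are hypothetical illustration values, not figures from the paper.

    ```python
    import numpy as np

    def shift_frequency(x, shift_hz, fs):
        """Shift the spectrum of x up by shift_hz using the analytic signal,
        so no mirror-image (difference-frequency) component appears."""
        n = len(x)
        # Build the analytic signal via FFT (same result as scipy.signal.hilbert).
        X = np.fft.fft(x)
        h = np.zeros(n)
        h[0] = 1.0
        if n % 2 == 0:
            h[n // 2] = 1.0
            h[1:n // 2] = 2.0
        else:
            h[1:(n + 1) // 2] = 2.0
        analytic = np.fft.ifft(X * h)
        # Multiply by a complex exponential to translate the whole spectrum up.
        t = np.arange(n) / fs
        return (analytic * np.exp(2j * np.pi * shift_hz * t)).real

    fs = 2000                                   # Hz, hypothetical sampling rate
    t = np.arange(fs) / fs                      # one second of signal
    heart = np.sin(2 * np.pi * 40 * t)          # 40 Hz tone standing in for S1 energy
    musical = shift_frequency(heart, 400, fs)   # dominant energy now at 440 Hz
    ```

    After shifting, the dominant component sits at 440 Hz (a musical pitch), which is the kind of relocation into the music domain that the transcription stage would then operate on.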

    Short-segment heart sound classification using an ensemble of deep convolutional neural networks

    This paper proposes a framework based on deep convolutional neural networks (CNNs) for automatic heart sound classification using short segments of individual heart beats. We design a 1D-CNN that directly learns features from raw heart-sound signals, and a 2D-CNN that takes as input two-dimensional time-frequency feature maps based on Mel-frequency cepstral coefficients (MFCC). We further develop a time-frequency CNN ensemble (TF-ECNN) combining the 1D-CNN and 2D-CNN through score-level fusion of the class probabilities. On the large PhysioNet/CinC Challenge 2016 database, the proposed CNN models outperformed traditional classifiers based on support vector machines and hidden Markov models with various hand-crafted time- and frequency-domain features. The best classification scores on the test set were 89.22% accuracy and 89.94% sensitivity, achieved by the TF-ECNN, and 91.55% specificity and 88.82% modified accuracy, achieved by the 2D-CNN alone.
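    Score-level fusion here means combining the per-class probability vectors of the two networks rather than their internal features. A common form is a weighted average followed by renormalisation; the sketch below assumes equal weights, since the abstract does not state the paper's exact fusion rule, and the example probabilities are invented.

    ```python
    import numpy as np

    def score_level_fusion(p_1d, p_2d, w=0.5):
        """Fuse class-probability vectors from the 1D-CNN and 2D-CNN by a
        weighted average (one common score-level fusion rule; the weight w
        is an assumption, not a value from the paper)."""
        p_1d = np.asarray(p_1d, dtype=float)
        p_2d = np.asarray(p_2d, dtype=float)
        fused = w * p_1d + (1.0 - w) * p_2d
        # Renormalise so the fused scores remain a probability distribution.
        return fused / fused.sum(axis=-1, keepdims=True)

    # Hypothetical per-beat probabilities for (normal, abnormal):
    p_time = [0.30, 0.70]   # from the raw-waveform 1D-CNN
    p_freq = [0.60, 0.40]   # from the MFCC-map 2D-CNN
    fused = score_level_fusion(p_time, p_freq)
    ```

    With equal weights the fused vector is the element-wise mean of the two model outputs, so a beat is labelled abnormal only when the combined evidence from both the time-domain and time-frequency views favours it.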

    Bipedal steps in the development of rhythmic behavior in humans

    We contrast two related hypotheses of the evolution of dance: H1: Maternal bipedal walking influenced the fetal experience of sound and associated movement patterns; H2: The human transition to bipedal gait produced more isochronous/predictable locomotion sound, resulting in early music-like behavior associated with the acoustic advantages conferred by moving bipedally in pace. The cadence of walking is around 120 beats per minute, similar to the tempo of dance and music. Human walking displays long-term constancies. Dyads often subconsciously synchronize steps. The major amplitude component of the step is a distinctly produced beat. Human locomotion influences, and interacts with, emotions, and passive listening to music activates brain motor areas. Across dance genres, the footwork is most often performed in time to the musical beat. Brain development is largely shaped by early sensory experience, with hearing developed from week 18 of gestation. Newborns react to sounds, melodies, and rhythmic poems to which they have been exposed in utero. If the sound and vibrations produced by the footfalls of a walking mother are transmitted to the fetus in coordination with the cadence of the motion, a connection between isochronous sound and rhythmical movement may be developed. The rhythmical sounds of human maternal locomotion differ substantially from those of nonhuman primates, while the maternal heartbeat is likely to have a similar isochronous character across primates, suggesting a relatively more influential role of footfall in the development of rhythmic/musical abilities in humans. Associations of gait, music, and dance are numerous. The apparent absence of musical and rhythmic abilities in nonhuman primates, which display little bipedal locomotion, corroborates that bipedal gait may be linked to the development of rhythmic abilities in humans. Bipedal stimuli in utero may primarily boost the ontogenetic development; the acoustical advantage hypothesis proposes a mechanism in the phylogenetic development.

    Coming from the heart: heart rate synchronization through sound


    Spatial audio in small display screen devices

    Our work addresses the problem of (visual) clutter in mobile device interfaces. The solution we propose involves translating a technique from the graphical to the audio domain for exploiting space in information representation. This article presents an illustrative example in the form of a spatialised audio progress bar. In usability tests, participants performed background monitoring tasks significantly more accurately using this spatialised audio (as compared with a conventional visual) progress bar. Moreover, their performance in a simultaneously running, visually demanding foreground task was significantly improved in the eyes-free monitoring condition. These results have important implications for the design of multi-tasking interfaces for mobile devices.

    Resonating Experiences of Self and Others enabled by a Tangible Somaesthetic Design

    Digitalization is penetrating every aspect of everyday life, including a human's heartbeat, which can easily be sensed by wearable sensors and displayed for others to see, feel, and potentially "bodily resonate" with. Previous work studying human interactions and interaction designs with physiological data, such as a heart's pulse rate, has argued that feeding it back to users may, for example, support users' mindfulness and self-awareness during various everyday activities and ultimately support their wellbeing. Inspired by Somaesthetics as a discipline, which focuses on an appreciation of the living body's role in all our experiences, we designed and explored mobile tangible heartbeat displays, which enable rich forms of bodily experiencing oneself and others in social proximity. In this paper, we first report on the design process of tangible heart displays and then present the results of a field study with 30 pairs of participants. Participants were asked to use the tangible heart displays while watching movies together and to report their experience in three different heart display conditions (i.e., displaying their own heartbeat, their partner's heartbeat, and watching a movie without a heart display). We found, for example, that participants reported significant effects on experienced sensory immersion when they felt their own heartbeats compared to the condition without any heartbeat display, and that feeling their partner's heartbeats resulted in significant effects on social experience. We refer to resonance theory to discuss the results, highlighting the potential of ubiquitous technology to utilize physiological data to provide resonance in a modern society facing social acceleration.