
    Multimodal Closed-loop Human Machine Interaction

    Großhauser T, Hermann T. Multimodal Closed-loop Human Machine Interaction. In: Bresin R, ed. Proceedings of the 3rd International Workshop on Interactive Sonification. KTH, Stockholm, Sweden: KTH School of Computer Science and Communication (CSC); 2010: 59-63.
    The paper presents a multimodal approach to tightly closing the interaction loop between a human user and any tool in operation. Every activity of a human being generates multimodal feedback, related more or less to the eyes (visual), the skin (tactile), the nose (olfactory), and the ears (auditory). Here we show the useful augmentation, or complete creation, of feedback that is nonexistent or scarcely available. Examples include the performance of drilling tasks, line-drawing tasks, and the complex task of bowing a violin.

    Loss of the candidate tumor suppressor ZEB1 (TCF8, ZFHX1A) in Sézary syndrome

    Cutaneous T-cell lymphoma is a group of incurable extranodal non-Hodgkin lymphomas that develop from skin-homing CD4+ T cells. Mycosis fungoides and Sézary syndrome are the most common histological subtypes. Although next-generation sequencing data have provided significant advances in understanding the genetic basis of this lymphoma, there is no uniform consensus on the identity and prevalence of putative driver genes for this heterogeneous group of tumors. Additional studies may increase our knowledge of the complex genetic etiology characterizing this lymphoma. We used SNP6 arrays and the GISTIC algorithm to prioritize a list of focal somatic copy-number alterations in a dataset of multiple sequential samples from 21 Sézary syndrome patients. Our results confirmed a prevalence of significant focal deletions over amplifications: single well-known tumor suppressors, such as TP53, PTEN, and RB1, are targeted by these aberrations. In our cohort, ZEB1 (TCF8, ZFHX1A) spans the deletion with the highest level of significance. In a larger group of 43 patients, we found that ZEB1 is affected by deletions and somatic inactivating mutations in 46.5% of cases; we also found potentially relevant ZEB1 germline variants. Survival analysis shows a worse clinical course for patients with ZEB1 biallelic inactivation. Multiple abnormal expression signatures were found to be associated with ZEB1 depletion in Sézary patients, and we verified that ZEB1 plays a role in the oxidative response of Sézary cells. Our data confirm the importance of deletions in the pathogenesis of cutaneous T-cell lymphoma. The characterization of ZEB1 abnormalities in Sézary syndrome fulfils the criteria of a canonical tumor suppressor gene. Although additional confirmation is needed, our findings suggest, for the first time, that ZEB1 germline variants might contribute to the risk of developing this disease. We also provide evidence that ZEB1 activity in Sézary cells, by influencing reactive oxygen species production, affects cell viability and apoptosis.

    NordicSMC: A Nordic University Hub on Sound and Music Computing

    Sound and music computing (SMC) is still an emerging field in many institutions, and the challenge is often to gain critical mass for developing study programs and undertaking more ambitious research projects. We report on how a long-term collaboration between small and medium-sized SMC groups has led to an ambitious undertaking in the form of the Nordic Sound and Music Computing Network (NordicSMC), funded by the Nordic Research Council and institutions from all five Nordic countries (Denmark, Finland, Iceland, Norway, and Sweden). The constellation is unique in that it covers the field of sound and music from the “soft” to the “hard,” including the arts and humanities, the social and natural sciences, and engineering. This paper describes the goals, activities, and expected results of the network, with the aim of inspiring the creation of other joint efforts within the SMC community.

    Sonification of the self vs. sonification of the other: Differences in the sonification of performed vs. observed simple hand movements

    Existing work on interactive sonification of movements, i.e., the translation of human movement qualities from the physical to the auditory domain, usually adopts a predetermined approach: the way in which movement features modulate the characteristics of sound is fixed. In our work we go one step further and demonstrate that the user's role can influence the tuning of the mapping between movement cues and sound parameters. Here, we aim to verify if and how the mapping changes when the user is either the performer or the observer of a series of body movements (tracing a square or an infinity shape with the hand in the air). We asked participants to tune the movement sonification either while directly performing the sonified movement or while watching another person perform the movement and listening to its sonification. Results show that the tuning of the sonification chosen by participants is influenced by three variables: the role of the user (performer vs. observer), movement quality (the amount of smoothness and directness in the movement), and physical parameters of the movements (velocity and acceleration). Performers focused more on the quality of their movement, while observers focused more on the sonic rendering, making it more expressive and more connected to low-level physical features.
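    The tunable mapping described in this abstract can be pictured with a minimal sketch. The abstract does not specify any implementation, so all function names, parameter ranges, and the particular cue-to-parameter assignments below are hypothetical; the adjustable gains stand in for the "tuning" a performer or observer would set.

```python
# Hypothetical sketch of a tunable movement-to-sound mapping, in the spirit
# of the study above. Cue-to-parameter assignments and ranges are illustrative
# only; the gains represent the user-adjustable "tuning" of the mapping.

def sonify(velocity: float, smoothness: float,
           pitch_gain: float = 100.0, loudness_gain: float = 0.5) -> dict:
    """Map movement cues (normalized to [0, 1]) to sound parameters."""
    return {
        "pitch_hz": 220.0 + pitch_gain * velocity,       # faster -> higher pitch
        "loudness": min(1.0, loudness_gain + 0.5 * velocity),
        "brightness": 1.0 - smoothness,                  # jerkier -> brighter timbre
    }

# An "observer" tuning might make the rendering more expressive, e.g. by
# choosing a larger pitch gain than a performer would:
params = sonify(velocity=0.8, smoothness=0.3, pitch_gain=200.0)
```

    The point of the sketch is only that the mapping function is fixed while its gains are left open: the study's finding is that performers and observers settle on different gains, not different mappings.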

    Blood and skin-derived Sezary cells: differences in proliferation-index, activation of PI3K/AKT/mTORC1 pathway and its prognostic relevance

    Sézary syndrome (SS) is a rare and aggressive variant of cutaneous T-cell lymphoma characterized by a neoplastic distribution mainly involving blood, skin, and lymph nodes. Although a role of the skin microenvironment in SS pathogenesis has long been hypothesized, its function in vivo is poorly characterized. To investigate this aspect, we compared skin-derived to blood-derived SS cells concurrently obtained from SS patients, highlighting a greater proliferation index and a higher activation level of the PI3K/AKT/mTORC1 pathway, particularly of the mTOR protein, in skin-derived SS cells. We proved that the SDF-1 and CCL21 chemokines, both overexpressed in SS tissues, induce mTORC1 signaling activation, cell proliferation, and Ki67 up-regulation in an SS-derived cell line and in primary SS cells. In a cohort of 43 SS cases, we observed recurrent copy-number variations (CNVs) of members of this cascade, namely losses of LKB1 (48%), PTEN (39%), and PDCD4 (35%) and gains of P70S6K (30%). These alterations represent druggable targets, suggesting new therapeutic treatments such as metformin, here evaluated in vitro. Moreover, CNVs of PTEN, PDCD4, and P70S6K, evaluated individually or in combination, are associated with reduced survival of SS patients. These data shed light on the in vivo effects of skin–SS cell interaction, underlying the prognostic and therapeutic relevance of the mTORC1 pathway in SS.

    Virtual virtuosity

    This dissertation presents research in the field of automatic music performance, with a special focus on piano. A system for automatic music performance based on artificial neural networks (ANNs) is proposed. A complex, ecological-predictive ANN was designed that listens to the last played note, predicts the performance of the next note, looks three notes ahead in the score, and plays the current tone. This system was able to learn a professional pianist's performance style at the structural micro-level. In a listening test, performances by the ANN were judged clearly better than deadpan performances and slightly better than performances obtained with generative rules. The behavior of an ANN was compared with that of a symbolic rule system with respect to musical punctuation at the micro-level. The rule system mostly gave better results, but some segmentation principles of an expert musician were only generalized by the ANN. Measurements of professional pianists' performances revealed interesting properties in the articulation of notes marked staccato and legato in the score. Performances were recorded on a grand piano connected to a computer. Staccato was realized by a micropause of about 60% of the inter-onset interval (IOI), while legato was realized by keeping two keys depressed simultaneously; the relative key-overlap time depended on the IOI: the larger the IOI, the shorter the relative overlap. The magnitudes of these effects changed with the pianists' coloring of their performances and with the pitch contour. These regularities were modeled in a set of rules for articulation in automatic piano music performance. Emotional coloring of performances was realized by means of macro-rules implemented in the Director Musices performance system. These macro-rules are groups of rules combined so as to reflect previous observations on the musical expression of specific emotions. Six emotions were simulated. A listening test revealed that listeners were able to recognize the intended emotional colorings. In addition, some possible future applications are discussed in the fields of automatic music performance, music education, automatic music analysis, virtual reality, and sound synthesis.
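    The articulation regularities summarized in this abstract can be sketched as toy rules. The ~60% staccato micropause figure comes from the text; the dissertation does not state the exact form of the legato overlap dependence, so the linear model and its coefficients below are purely illustrative assumptions.

```python
# Toy articulation rules inspired by the measurements summarized above.
# The 60% staccato micropause is taken from the text; the legato model
# (relative overlap shrinking linearly with IOI) is a hypothetical sketch.

def staccato_duration(ioi_ms: float) -> float:
    """Key-down time for a staccato note: the IOI minus a ~60% micropause."""
    return ioi_ms * (1.0 - 0.60)

def legato_overlap(ioi_ms: float, k: float = 0.0002, max_rel: float = 0.4) -> float:
    """Key-overlap time for legato: relative overlap decreases as the IOI
    grows (illustrative linear model, clamped to stay non-negative)."""
    rel = max(0.0, max_rel - k * ioi_ms)
    return ioi_ms * rel

print(staccato_duration(500.0))  # 200.0 ms of key-down time for a 500 ms IOI
```

    The qualitative behavior matches the finding quoted above: the larger the IOI, the shorter the relative overlap, so legato overlap does not grow in proportion to note spacing.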


    What is the color of that music performance?

    The representation of expressivity in music is still a fairly unexplored field. Alternative ways of representing musical information are necessary when providing feedback on emotion expression in music, such as in real-time tools for music education or in the display of large music databases. One possible solution could be a graphical, non-verbal representation of expressivity in music performance using color as an index of emotion. To determine which colors are most suitable for an emotional expression, a test was run. Subjects rated how well each of 8 colors, and their 3 nuances, corresponded to each of 12 music performances expressing different emotions. Performances were played by professional musicians on 3 instruments: saxophone, guitar, and piano. Results show that subjects associated different hues with different emotions. Also, dark colors were associated with music in minor tonality and light colors with music in major tonality. A correspondence between spectral energy and color hue is preliminarily discussed.
