
Emotionally expressive music based interaction language for social robots

By Aladdin Ayesh
Year: 2009
OAI identifier: oai:www.dora.dmu.ac.uk:2086/1126

Citations

  1. (2006). A full-body gesture database for automatic gesture recognition. doi
  2. (1999). A gesture description model based on synthesizing fundamental gestures. doi
  3. (1987). A Guide to Musical Analysis. doi
  4. (2008). A Millenson-based approach to emotion modelling. doi
  5. (2004). A proposal for the integration of symbolic music notation into multimedia frameworks. In Web Delivering of Music, doi
  6. (2000). A sound propagation model for interagents communication. doi
  7. (1982). Acoustic Communication in Birds. doi
  8. Active and dynamic information fusion for facial expression understanding from image sequences. doi
  9. (2006). An application of neural network for extracting Arabic word roots. doi
  10. (1991). An Introduction to Formal Specification and Z. doi
  11. (2006). Automatic mood detection and tracking of music audio signals. Audio, Speech and Language Processing, doi
  12. (1994). Communication in reactive multiagent robotic systems. doi
  13. (2004). Computational analysis of mannerism gestures. doi
  14. (2005). Computational musicology: An artificial life approach. doi
  15. (2000). Cue utilization in communication of emotion in music performance: Relating performance to perception. Journal of Experimental Psychology: Human Perception and Performance, doi
  16. Description and recognition of human gestures based on the transition of curvature from motion images. doi
  17. (2008). EAVA: A 3D emotive audiovisual avatar. doi
  18. (2007). Eliciting requirements for a robotic toy for children with autism - results from user panels. doi
  19. (2007). Emotion interaction system for a service robot. doi
  20. (2004). Emotional analysis of facial expressions. In Systems, Man and Cybernetics, doi
  21. (2004). Emotionally motivated reinforcement learning based controller. In Systems, Man and Cybernetics, doi
  22. (2005). Exploring the use of structured musical stimuli to communicate simple diagrams: the role of context. doi
  23. (1995). Expression of emotion in voice and music. doi
  24. (2006). Facial expression recognition using kernel canonical correlation analysis (KCCA). 17(1):233–238, doi
  25. (2005). Genetic approaches for evolving form in musical composition.
  26. Gesture-based interaction and communication: automated classification of hand gesture contours. doi
  27. (2004). Grammar based music composition.
  28. (2008). Hand gesture recognition system using standard fuzzy c-means algorithm for recognizing hand gesture with angle variations for unsupervised users. In doi
  29. (2003). Animatronics and emotional face displays of robots.
  30. (2008). Humanoid audio and visual avatar with emotive text-to-speech synthesis. 10(6):969–981, doi
  31. (2004). Individualization of music similarity perception via feature subset selection. doi
  32. (1991). Intelligence without reason. doi
  33. (1998). Issues in the design of emotional agents.
  34. (2003). Learning behavior-selection by emotions and cognition in a multi-goal robot task.
  35. (2005). Mirela: a musical robot. doi
  36. (2002). Musical program auralisation: a structured approach to motif design. doi
  37. (2006). Perceptual rhythm determination of music signal for emotion-based classification. doi
  38. (2006). Precise pitch profile feature extraction from musical audio for key detection. Multimedia, doi
  39. (2004). Real-time music generation for a virtual environment.
  40. (1998). Realtime gesture recognition under the multi-layered parallel recognition framework of qvips. doi
  41. (2004). Recognition of arm gestures using multiple orientation sensors: gesture classification. doi
  42. (1999). Reinforcement Learning in Autonomous Robots: An Empirical Investigation of the Role of Emotions.
  43. (1997). Saxex: a case-based reasoning system for generating expressive musical performances. doi
  44. (1995). Self reference in AI.
  45. (2005). Social interaction between robots, avatars & humans. doi
  46. (2006). Social robots – emotional agents: Some remarks on naturalizing man-machine interaction.
  47. (1987). Structural Functions in Music. doi
  48. (1991). The animat path to AI.
  49. (2000). The art of rendering sounds from emergent behaviour: cellular automata granular synthesis. doi
  50. (2000). The Psychology of Emotion: Theories of Emotion in Perspective. doi
  51. (2006). The role of the experimenter in HRI research: A case study evaluation of children with autism interacting with a robotic toy. doi
  52. (2006). Toward intelligent music information retrieval. doi
  53. (2006). Towards an evolution model of expressive music performance. doi
  54. (2004). User-adaptive music emotion recognition. doi
  55. (2002). Using music to communicate computing information. Interacting with Computers, doi