Presented at the 15th International Conference on Auditory Display (ICAD2009), Copenhagen, Denmark, May 18-22, 2009

The human aural system is arguably one of the most refined sensors we possess. It is sensitive to highly complex stimuli such as conversations or musical pieces. Be it a speaking voice or a band playing live, we easily perceive relaxed or agitated states in an auditory stream. In turn, our own state of agitation can now be detected via electroencephalography. In this paper we propose to explore both ideas in the form of a framework for conscious learning of relaxation through sonic feedback. After presenting the general paradigm of neurofeedback, we describe a set of tools to analyze electroencephalogram (EEG) data in real time, and we introduce a carefully designed, perceptually grounded interactive music feedback system that helps the listener keep track of and modulate her agitation state as measured by EEG.