9 research outputs found

    Sonification Mapping Configurations: Pairings Of Real-Time Exhibits And Sound

    Get PDF
    Presented at the 19th International Conference on Auditory Display (ICAD2013), July 6-9, 2013, in Lodz, Poland. Visitors to aquariums typically rely on their vision to interact with live exhibits that convey rich descriptive and aesthetic visual information. However, some visitors may prefer or need an alternative interpretation of the exhibit's visual scene to improve their experience. Musical sonification has been explored as an interpretive strategy for this purpose, and related work provides some guidance for sonification design, yet more empirical work is needed to develop and validate the music-to-visual-scene mappings. This paper discusses work to validate mappings that were developed through an investigation of musician performances for two specific live animal exhibits at the Georgia Aquarium. In the proposed study, participants will provide feedback on musical mapping examples, which will help inform the design of a real-time sonification system for aquarium exhibits. Here, we describe our motivation, methods, and expected contributions.

    Comprehension of Sonified Weather Data Across Multiple Auditory Streams

    Get PDF
    Presented at the 20th International Conference on Auditory Display (ICAD2014), June 22-25, 2014, in New York, NY. Weather data has been one of the mainstays of sonification research: it is readily available, and every listener has presumably had some experience with meteorological events to draw from. When we want to use this type of complex data in a scenario such as a classroom, we need to be sure that listeners are able to correctly comprehend the intended information. The current study proposes a method for evaluating the usability of complex sonifications that contain multiple data sets, especially for tasks that require inferences to be made through comparisons across multiple data streams. This extended abstract outlines a study that will address this issue by asking participants to listen to sonifications and then describe their general understanding of which variables changed and how those changes would be physically represented by real weather conditions.

    A sonification of Kepler space telescope star data

    Get PDF
    Presented at the 18th International Conference on Auditory Display (ICAD2012), June 18-21, 2012, in Atlanta, Georgia. Reprinted by permission of the International Community for Auditory Display, http://www.icad.org. A performing artist group interested in including a sonification of star data from NASA’s Kepler space telescope on their next album release approached the Georgia Tech Sonification Lab for assistance in the process. The artists imposed few constraints other than wanting the end product to be true to the data and to have a musically appealing, “heavenly” sound. Several sonifications of the data were created using various techniques, each resulting in a different-sounding representation of the Kepler data. The details of this process are discussed in this poster. Ultimately, the researchers were able to produce the desired sounds via sound synthesis, and the artists plan to incorporate them into their next album release.
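The kind of data-to-sound conversion this entry describes is often done by parameter mapping: each brightness sample in a star's light curve is mapped onto an audible quantity such as pitch. The sketch below illustrates that general technique only; the flux values and frequency range are made up for illustration and are not the actual Kepler data or the Sonification Lab's mapping.

```python
def brightness_to_frequencies(brightness, f_min=220.0, f_max=880.0):
    """Linearly map brightness samples onto frequencies in [f_min, f_max] Hz."""
    lo, hi = min(brightness), max(brightness)
    span = (hi - lo) or 1.0  # guard against a perfectly flat light curve
    return [f_min + (b - lo) / span * (f_max - f_min) for b in brightness]

# Illustrative relative-flux samples (not real Kepler data); the brief dip
# is the kind of feature a planetary transit would produce.
flux = [1.000, 0.998, 0.965, 0.998, 1.000]
freqs = brightness_to_frequencies(flux)
```

Each resulting frequency could then drive an oscillator, so the transit dip is heard as a momentary drop in pitch.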

    Aquarium fugue: interactive sonification for children and visually impaired audience in informal learning environments

    Get PDF
    Presented at the 18th International Conference on Auditory Display (ICAD2012), June 18-21, 2012, in Atlanta, Georgia. Reprinted by permission of the International Community for Auditory Display, http://www.icad.org. In response to the need for more accessible Informal Learning Environments (ILEs), the Georgia Tech Accessible Aquarium Project has been studying sonification for use in live exhibit interpretation in aquariums. The present work adds more interactivity [1] to the project’s existing sonification work, which is expected to lead to more accessible learning opportunities for visitors, particularly people with vision impairments as well as children. In this interactive sonification phase, visitors can actively experience an exhibit by using tangible objects to mimic the movement of animals. Sonifications corresponding to the moving tangible objects can be paired with real-time interpretive sonifications produced by the existing Accessible Aquarium system to generate a cooperative fugue. Here, we describe the system configuration, pilot test results, and future work. Implications are discussed in terms of embodied interaction and interactive learning.

    Designing interactive sonification for live aquarium exhibits

    No full text
    In response to the need for more accessible and engaging informal learning environments (ILEs), researchers have studied sonification for use in the interpretation of live aquarium exhibits. The present work introduces more interactivity to the project’s existing sonification work, which is expected to lead to more accessible and interactive learning opportunities for visitors, including children and people with vision impairment. In this interactive sonification environment, visitors can actively experience an exhibit by using tangible objects to mimic the movement of animals. Sonifications corresponding to their movement can be paired with real-time animal-based sonifications produced by the existing system to generate a musical fugue. In the current paper, we describe the system configurations, experimental results on optimal sonification parameters and interaction levels, and implications in terms of embodied interaction and interactive learning.

    New and forthcoming reference books from Gale Research company

    No full text

    Cytology, Cytogenetics and Plant Breeding

    No full text