Web-Based Visualization of Very Large Scientific Astronomy Imagery
Visualizing and navigating through large astronomy images from a remote
location with current astronomy display tools can be a frustrating experience
in terms of speed and ergonomics, especially on mobile devices. In this paper,
we present a high performance, versatile and robust client-server system for
remote visualization and analysis of extremely large scientific images.
Applications of this work include survey image quality control, interactive
data query and exploration, citizen science, as well as public outreach. The
proposed software is entirely open source and is designed to be generic and
applicable to a variety of datasets. It provides access to floating point data
at terabyte scales, with the ability to precisely adjust image settings in
real-time. The proposed clients are light-weight, platform-independent web
applications built on standard HTML5 web technologies and compatible with both
touch and mouse-based devices. We assess the system's performance and show
that a single server can comfortably handle more than one hundred
simultaneous users accessing full-precision 32-bit astronomy data.
Comment: Published in Astronomy & Computing. IIPImage server available from
http://iipimage.sourceforge.net . Visiomatic code and demos available from
http://www.visiomatic.org
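The system described above serves large images as small tiles that the web client fetches on demand. As a minimal sketch of how such a client-server exchange can look, the snippet below composes a tile-request URL in the style of the IIP protocol spoken by iipsrv (its `FIF` and `JTL` query parameters); the server path and image name are hypothetical, and real deployments may expose different endpoints.

```python
# Minimal sketch: composing a tile request for an IIPImage-style server.
# FIF names the image on the server; JTL asks for one JPEG tile, given a
# pyramid resolution level and a row-major tile index at that level.
# The base URL and image path below are hypothetical examples.
from urllib.parse import urlencode

def iip_tile_url(base_url, image_path, resolution, tile_index):
    """Build a URL requesting one tile of a large pyramidal image.

    resolution: pyramid level (0 = most zoomed out)
    tile_index: row-major index of the tile at that level
    """
    params = urlencode({"FIF": image_path,
                        "JTL": f"{resolution},{tile_index}"})
    return f"{base_url}?{params}"

# Example: tile 0 at pyramid level 2 of a (hypothetical) survey image
url = iip_tile_url("http://example.org/fcgi-bin/iipsrv.fcgi",
                   "/data/survey_field.tif", 2, 0)
```

A lightweight HTML5 client can issue many such requests in parallel as the user pans and zooms, which is what keeps the interface responsive even for terabyte-scale data.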
MINDtouch embodied ephemeral transference: Mobile media performance research
This is the post-print version of the final published article that is available from the link below. Copyright @ Intellect Ltd 2011. The aim of the author's media art research has been to uncover new understandings of the sensations of liveness and presence that may emerge in participatory networked performance, using mobile phones and physiological wearable devices. To investigate these concepts in practice, a mobile media performance series called MINDtouch was created. The MINDtouch project proposed that the mobile videophone become a new way to communicate non-verbally, visually and sensually across space. It explored notions of ephemeral transference, distance collaboration and participant-as-performer to study the presence and liveness emerging from the use of wireless mobile technologies within real-time, mobile performance contexts. Through the participation of in-person and remote interactors creating mobile video-streamed mixes, the project interweaves and embodies a daisy chain of technologies through the network space. As part of practice-based Ph.D. research conducted at the SMARTlab Digital Media Institute at the University of East London, MINDtouch has been under the direction of Professor Lizbeth Goodman and sponsored by BBC R&D. The aim of this article is to discuss the project research, recently completed for submission, in terms of the technical and aesthetic developments from 2008 to the present, as well as the final phase of staging events from July 2009 to February 2010. This piece builds on Baker (2008), which focused on the outcomes of phase 1 of the research project and the initial developments in phase 2. The outcomes of phases 2 and 3 of the project are discussed in this article.
Biosignal and context monitoring: Distributed multimedia applications of body area networks in healthcare
We are investigating the use of Body Area Networks (BANs), wearable sensors and wireless communications for the measurement, processing, transmission, interpretation and display of biosignals. The goal is to provide telemonitoring and teletreatment services for patients. The remote health professional can view a multimedia display which includes graphical and numerical representations of patients' biosignals. The addition of feedback control enables teletreatment services; teletreatment can be delivered to the patient via multiple modalities, including tactile, text, auditory and visual. We describe the health BAN, a generic mobile health service platform and two context-aware applications. The epilepsy application illustrates the processing and interpretation of multi-source, multimedia BAN data. The chronic pain application illustrates multi-modal feedback and treatment, with patients able to view their own biosignals on their handheld device.
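The abstract above mentions delivering teletreatment feedback across several modalities at once. As a minimal illustrative sketch (all function and message names here are hypothetical, not from the described platform), one treatment cue can be fanned out to each modality the patient's device supports:

```python
# Hypothetical sketch: fan one teletreatment cue out to the feedback
# modalities listed in the abstract -- tactile, text, auditory, visual.
def deliver_feedback(cue, modalities):
    """Return the action taken for each requested, supported modality."""
    handlers = {
        "tactile":  lambda m: f"vibrate: {m}",
        "text":     lambda m: f"show text: {m}",
        "auditory": lambda m: f"play tone: {m}",
        "visual":   lambda m: f"draw chart: {m}",
    }
    # Unsupported modality names are silently skipped.
    return [handlers[m](cue) for m in modalities if m in handlers]

# Example: a chronic-pain cue delivered as text plus a tactile prompt
actions = deliver_feedback("slow your breathing", ["text", "tactile"])
```

A real platform would replace the string-returning handlers with device drivers, but the dispatch structure captures the multi-modal idea.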
MINDtouch: Embodied mobile media ephemeral transference
Copyright @ 2013 ISAST. This article reviews discoveries that emerged from the author's MINDtouch media research project, in which a mobile device was repurposed for visual and non-verbal communication through gestural and visual mobile expressivity. The work revealed new insights from emerging mobile media and participatory performance practices. The author contextualizes her media research on mobile video and networked performance alongside relevant discourse on presence and the embodiment of technology. From the research, an intimate, phenomenological and visual form of mobile expression has emerged. This form has reconfigured the communication device from a voice- and text/SMS-only tool into a visual and synesthetic mode of deeper expression.
- …