Using visual analytics to develop situation awareness in astrophysics
We present a novel collaborative visual analytics application for cognitively overloaded users in the astrophysics domain. The system was developed for scientists who need to analyze heterogeneous, complex data under time pressure, and make predictions and time-critical decisions rapidly and correctly under a constant influx of changing data. The Sunfall Data Taking system utilizes several novel visualization and analysis techniques to enable a team of geographically distributed domain specialists to effectively and remotely maneuver a custom-built instrument under challenging operational conditions. Sunfall Data Taking has been in production use for two years by a major international astrophysics collaboration (the largest data volume supernova search currently in operation), and has substantially improved the operational efficiency of its users. We describe the system design process by an interdisciplinary team, the system architecture, and the results of an informal usability evaluation of the production system by domain experts in the context of Endsley's three levels of situation awareness.
Analyzing Visual Mappings of Traditional and Alternative Music Notation
In this paper, we postulate that combining the domains of information
visualization and music studies paves the way for a more structured analysis
of the design space of music notation, enabling the creation of alternative
music notations that are tailored to different users and their tasks. Hence, we
discuss the instantiation of a design and visualization pipeline for music
notation that follows a structured approach, based on the fundamental concepts
of information and data visualization. This enables practitioners and
researchers of digital humanities and information visualization alike to
conceptualize, create, and analyze novel music notation methods. Based on the
analysis of relevant stakeholders and their usage of music notation as a means
of communication, we identify a set of relevant features typically encoded in
different annotations and encodings, as used by interpreters, performers, and
readers of music. We analyze the visual mappings of musical dimensions for
varying notation methods to highlight gaps and frequent usages of encodings,
visual channels, and Gestalt laws. This detailed analysis leads us to the
conclusion that such an under-researched area in information visualization
holds the potential for fundamental research. This paper discusses possible
research opportunities, open challenges, and arguments that can be pursued in
the process of analyzing, improving, or rethinking existing music notation
systems and techniques.
Comment: 5 pages including references, 3rd Workshop on Visualization for the
Digital Humanities, Vis4DH, IEEE Vis 201
Wrangling environmental exposure data: guidance for getting the best information from your laboratory measurements.
BACKGROUND: Environmental health and exposure researchers can improve the quality and interpretation of their chemical measurement data, avoid spurious results, and improve analytical protocols for new chemicals by closely examining lab and field quality control (QC) data. Reporting QC data along with chemical measurements in biological and environmental samples allows readers to evaluate data quality and appropriate uses of the data (e.g., for comparison to other exposure studies, association with health outcomes, or use in regulatory decision-making). However, many studies do not adequately describe or interpret QC assessments in publications, leaving readers uncertain about the level of confidence in the reported data. One potential barrier to both QC implementation and reporting is that guidance on how to integrate and interpret QC assessments is often fragmented and difficult to find, with no centralized repository or summary. In addition, existing documents are typically written for regulatory scientists rather than environmental health researchers, who may have little or no experience in analytical chemistry.
OBJECTIVES: We discuss approaches for implementing quality assurance/quality control (QA/QC) in environmental exposure measurement projects and describe our process for interpreting QC results and drawing conclusions about data validity.
DISCUSSION: Our methods build upon existing guidance and years of practical experience collecting exposure data and analyzing it in collaboration with contract and university laboratories, as well as the Centers for Disease Control and Prevention. With real examples from our data, we demonstrate problems that would not have come to light had we not engaged with our QC data and incorporated field QC samples in our study design. Our approach focuses on descriptive analyses and data visualizations that have been compatible with diverse exposure studies with sample sizes ranging from tens to hundreds of samples. Future work could incorporate additional statistically grounded methods for larger datasets with more QC samples.
CONCLUSIONS: This guidance, along with example table shells, graphics, and some sample R code, provides a useful set of tools for getting the best information from valuable environmental exposure datasets and enabling valid comparison and synthesis of exposure data across studies.
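Two of the descriptive QC checks this abstract alludes to, screening measurements against variability in field blanks and checking agreement between duplicate samples, can be sketched in a few lines. This is an illustrative outline under common conventions (a mean + 3·SD blank-based detection limit, and relative percent difference for duplicates); the function names and the data are made up for illustration and are not the paper's sample R code.

```python
import statistics

def detection_limit(blank_values, k=3):
    """Estimate a limit of detection as mean(blanks) + k * sd(blanks).

    k = 3 is one common convention; a given study protocol may differ.
    Returns the mean alone if the blanks show no variability.
    """
    mean = statistics.mean(blank_values)
    sd = statistics.stdev(blank_values) if len(blank_values) > 1 else 0.0
    return mean + k * sd

def relative_percent_difference(a, b):
    """RPD between duplicate measurements, as a percentage of their mean."""
    return abs(a - b) / ((a + b) / 2) * 100

# Hypothetical field-blank concentrations and one duplicate pair (units arbitrary).
blanks = [0.10, 0.12, 0.08, 0.11, 0.09]
lod = detection_limit(blanks)                      # values below this are suspect
dup_rpd = relative_percent_difference(4.8, 5.2)    # duplicate agreement, in percent
```

A typical rule of thumb would then flag sample results below `lod` as non-detects and duplicate pairs with RPD above a protocol-defined threshold (often 20 to 30 percent) for review.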
Design Fiction Diegetic Prototyping: A Research Framework for Visualizing Service Innovations
Purpose: This paper presents a design fiction diegetic prototyping methodology and research framework for investigating service innovations that reflect future uses of new and emerging technologies.
Design/methodology/approach: Drawing on speculative fiction, we propose a methodology that positions service innovations within a six-stage research development framework. We begin by reviewing and critiquing designerly approaches that have traditionally been associated with service innovations and futures literature. In presenting our framework, we provide an example of its application to the Internet of Things (IoT), illustrating the central tenets proposed and key issues identified.
Findings: The research framework advances a methodology for visualizing future experiential service innovations, considering how realism may be integrated into a designerly approach.
Research limitations/implications: Design fiction diegetic prototyping enables researchers to express a range of ‘what if’ or ‘what can it be’ research questions within service innovation contexts. However, the process encompasses degrees of subjectivity and relies on knowledge, judgment and projection.
Practical implications: The paper presents an approach to devising future service scenarios incorporating new and emergent technologies in service contexts. The proposed framework may be used as part of a range of research designs, including qualitative, quantitative and mixed method investigations.
Originality: Operationalizing an approach that generates and visualizes service futures from an experiential perspective contributes to the advancement of techniques that enable the exploration of new possibilities for service innovation research.
Coordinating visualizations of polysemous action: Values added for grounding proportion
We contribute to research on visualization as an epistemic learning tool by inquiring into the didactical potential of having students visualize one phenomenon in accord with two different partial meanings of the same concept. Twenty-two Grade 4-6 students participated in a design study that investigated the emergence of proportional-equivalence notions from mediated perceptuomotor schemas. Working as individuals or pairs in tutorial clinical interviews, students solved non-symbolic interaction problems that utilized remote-sensing technology. Next, they used symbolic artifacts interpolated into the problem space as semiotic means to objectify in mathematical register a variety of both additive and multiplicative solution strategies. Finally, they reflected on tensions between these competing visualizations of the space. Micro-ethnographic analyses of episodes from three paradigmatic case studies suggest that students reconciled semiotic conflicts by generating heuristic logico-mathematical inferences that integrated competing meanings into cohesive conceptual networks. These inferences hinged on revisualizing additive elements multiplicatively. Implications are drawn for rethinking didactical design for proportions. © 2013 FIZ Karlsruhe
The Unfulfilled Potential of Data-Driven Decision Making in Agile Software Development
With the general trend towards data-driven decision making (DDDM),
organizations are looking for ways to use DDDM to improve their decisions.
However, few studies have looked into practitioners' views of DDDM, in
particular for agile organizations. In this paper, we investigated the
experiences of using DDDM, and how data can improve decision making. An emailed
questionnaire was sent out to 124 industry practitioners in agile software
development companies, of which 84 answered. The results show that few
practitioners reported widespread use of DDDM in their current decision-making
practices. The practitioners were more positive about its future use for
higher-level and more general decision making, fairly positive about its use
for requirements elicitation and prioritization decisions, and less positive
about its future use at the team level. The practitioners do see considerable
potential for DDDM in an agile context; however, that potential is currently
unfulfilled.