Human-Centered Technologies for Inclusive Collection and Analysis of Public-Generated Data
The meteoric rise in the popularity of public engagement platforms such as social media, customer review websites, and public input solicitation efforts strives to establish an inclusive environment for the public to share their thoughts, ideas, opinions, and experiences. Many decisions made at a personal, local, or national scale are fueled by data generated by the public. As such, inclusive collection, analysis, sensemaking, and utilization of public-generated data are crucial to successful decision-making processes. However, people often struggle to engage, participate, and share their opinions due to inaccessibility, the rigidity of traditional public engagement methods, and the lack of options for providing opinions while avoiding potential confrontations. Concurrently, data analysts and decision-makers grapple with the challenges of analyzing, making sense of, and making informed decisions based on public-generated data, including its high dimensionality, the ambiguity inherent in human language, and a lack of tools and techniques catered to their needs. Novel technological interventions are therefore necessary to enable the public to share their input without barriers and to allow decision-makers to capture, forage, peruse, and distill public-generated data into concrete and actionable insights.
The goal of this dissertation is to demonstrate how human-centered approaches that involve stakeholders in the design, development, and evaluation of tools and techniques can lead to inclusive, effective, and efficient approaches to public-generated data collection and analysis to support informed decision-making. To that end, in this dissertation, I first addressed the challenges of empowering the public to share their opinions by exploring two major opinion-sharing avenues --- social media and public consultation. To learn more about people's social media experiences and challenges, I built two technology probes and conducted a qualitative exploratory study with 16 participants. I followed up this study by exploring the challenges of inclusive participation during public consultations such as town halls. Based on a formative study with 66 participants and 20 organizers, I designed and developed CommunityClick to enable reticent attendees to share their opinions silently and anonymously during town halls. Equipped with the knowledge and experiences from these works, I designed, developed, and evaluated technologies and methods to facilitate and accelerate informed data-driven decision-making based on increased public-generated data. Based on interviews with 14 analysts and decision-makers in the civic domain, I built a visual analytics system, CommunityPulse, that can facilitate public input analysis by surfacing hidden insights, people's reflections, and priorities. Leveraging the lessons learned during this work, I created a visual text analytics system that supports serendipitous discovery and balanced analysis of textual data to help make informed decisions.
In this work, I contribute an understanding of how people collect and analyze public-generated data to fuel their decisions when they have increased exposure to alternative avenues for opinion-sharing. Through a series of human-centered studies, I highlight the challenges that inhibit inclusivity in opinion sharing and the shortcomings of existing methods that prevent decision-makers from accounting for comprehensive public input that includes marginalized or unpopular opinions. To address these challenges, I designed, developed, and evaluated a collection of interactive systems including CommunityClick, CommunityPulse, and Serendyze. Through a rigorous set of evaluation strategies, including creativity sessions, controlled lab studies, in-the-wild deployment, and field experiments, I involved stakeholders to assess the effectiveness and utility of the built systems. Through the empirical evidence from these studies, I demonstrate how alternative designs for social media could enhance people's social media experiences and enable them to make new connections with others to share opinions. In addition, I show how CommunityClick can be utilized to enable reticent attendees during public consultation to share their opinions while avoiding unwanted confrontation, and to allow organizers to capture and account for silent feedback. I highlight how CommunityPulse allowed analysts and decision-makers to examine public input from multiple angles for accelerated analysis and more informed decision-making. Furthermore, I demonstrate how supporting serendipitous discovery and balanced analysis using Serendyze can lead to more informed data-driven decision-making.
I conclude the dissertation with a discussion of future avenues to expand this research, including the facilitation of multi-user collaborative analysis, the integration of multi-modal signals in the analysis of public-generated data, and potential adoption strategies for decision-support systems designed for inclusive collection and analysis of public-generated data.
DAX: Data-Driven Audience Experiences in Esports
Esports (competitive video games) have grown into a global phenomenon with over 450 million viewers and a 1.5 billion USD market. Esports broadcasts follow a similar structure to traditional sports. However, due to their virtual nature, a large amount of detailed data is available about in-game actions that is not currently accessible in traditional sports. This provides an opportunity to incorporate novel insights about complex aspects of gameplay into the audience experience, enabling more in-depth coverage for experienced viewers and increased accessibility for newcomers. Previous research has explored only a limited range of ways data could be incorporated into esports viewing (e.g. post-match data visualizations), and only a few studies have investigated how the presentation of statistics impacts spectators' experiences and viewing behaviors. We present Weavr, a companion app that allows audiences to consume data-driven insights during and around esports broadcasts. We report on deployments at two major tournaments that provide ecologically valid findings about how the app's features were experienced by audiences and their impact on viewing behavior. We discuss implications for the design of second-screen apps for live esports events, and for traditional sports as similar data becomes available for them via improved tracking technologies.
Score Reporting Research and Applications
Score reporting research is no longer limited to the psychometric properties of scores and subscores. Today, it encompasses design and evaluation for particular audiences, appropriate use of assessment outcomes, the utility and cognitive affordances of graphical representations, interactive report systems, and more. By studying how audiences understand the intended messages conveyed by score reports, researchers and industry professionals can develop more effective mechanisms for interpreting and using assessment data. Score Reporting Research and Applications brings together experts who design and evaluate score reports in both K-12 and higher education contexts and who conduct foundational research in related areas. The first section covers foundational validity issues in the use and interpretation of test scores; design principles drawn from related areas including cognitive science, human-computer interaction, and data visualization; and research on presenting specific types of assessment information to various audiences. The second section presents real-world applications of score report design and evaluation and of the presentation of assessment information. Across ten chapters, this volume offers a comprehensive overview of new techniques and possibilities in score reporting.
Thrive: Success Strategies for the Modern-Day Faculty Member
The THRIVE collection is intended to help faculty thrive in their roles as educators, scholars, researchers, and clinicians. Each section contains a variety of thought-provoking topics that are designed to be easily digested, to guide personal reflection, and to be put into action. Please use the THRIVE collection to help:
- Individuals study topics on their own, whenever and wherever they want
- Peer-mentoring or other learning communities study topics in small groups
- Leaders and planners strategically insert faculty development into existing meetings
- Faculty identify campus experts for additional learning, grand rounds, etc.
If you have questions or want additional information on a topic, simply contact the article author or email [email protected]
Structured Sensemaking of Videographic Information within Dataphoric Space
Attempts to create a structured sensemaking model have proven difficult. Much of the research today has evolved into a cacophony of conceptual models. Many of these sensemaking models have been proposed but not tested. Using structural equation modeling, a unified model of sensemaking was developed and tested. This structured sensemaking model contains five sensemaking constructs: chaos, anchoring, articulation, retrospection, and identity. The model was tested using data collected from 224 educationally focused YouTube videos. The confirmatory factor model developed for this research has a measured Comparative Fit Index of 0.979, a measured Standardized Root Mean Square Residual of 0.078, and a measured Akaike's Information Criterion of 182.892. The associated structural model has a measured Comparative Fit Index of 0.991, a measured Standardized Root Mean Square Residual of 0.047, and a measured Akaike's Information Criterion of 131.680. This theory of structured sensemaking supports (a) the unification of five sensemaking constructs, (b) a structured sensemaking framework, (c) the integration of information theory, and (d) a reusable sensemaking method. This structured sensemaking framework is the first of its kind.
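The fit indices reported above (CFI and AIC) have standard closed-form definitions. As a minimal sketch of how such indices are computed, the following uses hypothetical chi-square, degrees-of-freedom, and log-likelihood values, not numbers from this study:

```python
def aic(log_likelihood: float, n_params: int) -> float:
    """Akaike's Information Criterion: AIC = 2k - 2*ln(L)."""
    return 2 * n_params - 2 * log_likelihood

def cfi(chi2_model: float, df_model: float,
        chi2_baseline: float, df_baseline: float) -> float:
    """Comparative Fit Index, clamped to the [0, 1] range.

    Compares the misfit of the fitted model against a baseline
    (independence) model; values near 1 indicate good fit.
    """
    d_model = max(chi2_model - df_model, 0.0)
    d_baseline = max(chi2_baseline - df_baseline, 0.0)
    denom = max(d_model, d_baseline)
    return 1.0 if denom == 0 else 1.0 - d_model / denom

# Hypothetical inputs: lower AIC and higher CFI indicate a better
# trade-off between model fit and parsimony when comparing models.
print(aic(log_likelihood=-50.0, n_params=20))  # 140.0
print(cfi(50.0, 40.0, 500.0, 45.0))            # ~0.978
```

Comparing two candidate models on the same data then reduces to preferring the one with the lower AIC (and, secondarily, a CFI closer to 1).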