How Do Analysts Understand and Verify AI-Assisted Data Analyses?
Data analysis is challenging as it requires synthesizing domain knowledge,
statistical expertise, and programming skills. Assistants powered by large
language models (LLMs), such as ChatGPT, can assist analysts by translating
natural language instructions into code. However, AI-assistant responses and
analysis code can be misaligned with the analyst's intent or be seemingly
correct but lead to incorrect conclusions. Therefore, validating AI assistance
is crucial and challenging. Here, we explore how analysts across a range of
backgrounds and expertise understand and verify the correctness of AI-generated
analyses. We develop a design probe that allows analysts to pursue diverse
verification workflows using natural language explanations, code,
visualizations, inspecting data tables, and performing common data operations.
Through a qualitative user study (n=22) using this probe, we uncover common
patterns of verification workflows influenced by analysts' programming,
analysis, and AI backgrounds. Additionally, we highlight open challenges and
opportunities for improving future AI analysis assistant experiences.
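As a minimal illustration of one verification workflow of the kind this abstract describes (a hypothetical example, not taken from the study itself), an analyst might cross-check an AI-generated aggregation against a direct manual computation on the raw rows:

```python
import pandas as pd

# Hypothetical dataset: the analyst asked an AI assistant for the mean
# score per group, and the assistant produced the groupby call below.
df = pd.DataFrame({
    "group": ["a", "a", "b", "b"],
    "score": [1.0, 3.0, 2.0, 4.0],
})

# AI-generated analysis step
ai_result = df.groupby("group")["score"].mean()

# Manual spot-check: recompute one group's mean directly from the raw rows
mask = df["group"] == "a"
manual_a = df.loc[mask, "score"].sum() / mask.sum()

# The two computation paths must agree
assert abs(ai_result["a"] - manual_a) < 1e-9
print(ai_result.to_dict())  # {'a': 2.0, 'b': 3.0}
```

Spot-checking a single group against a hand computation, as here, is one of the lightweight code- and data-inspection strategies an analyst could combine with reading the assistant's natural language explanation.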
Scoping analytical usability evaluation methods: A case study
Analytical usability evaluation methods (UEMs) can complement empirical evaluation of systems: for example, they can often be used earlier in design and can provide accounts of why users might experience difficulties, as well as what those difficulties are. However, their properties and value are only partially understood. One way to improve our understanding is by detailed comparisons using a single interface or system as a target for evaluation, but we need to look deeper than simple problem counts: we need to consider what kinds of accounts each UEM offers, and why. Here, we report on a detailed comparison of eight analytical UEMs. These eight methods were applied to a robotic arm interface, and the findings were systematically compared against video data of the arm in use. The usability issues that were identified could be grouped into five categories: system design, user misconceptions, conceptual fit between user and system, physical issues, and contextual ones. Other possible categories such as user experience did not emerge in this particular study. With the exception of Heuristic Evaluation, which supported a range of insights, each analytical method was found to focus attention on just one or two categories of issues. Two of the three "home-grown" methods (Evaluating Multimodal Usability and Concept-based Analysis of Surface and Structural Misfits) were found to occupy particular niches in the space, whereas the third (Programmable User Modeling) did not. This approach has identified commonalities and contrasts between methods and provided accounts of why a particular method yielded the insights it did. Rather than considering measures such as problem count or thoroughness, this approach has yielded insights into the scope of each method.
Working Notes from the 1992 AAAI Workshop on Automating Software Design. Theme: Domain Specific Software Design
The goal of this workshop is to identify different architectural approaches to building domain-specific software design systems and to explore issues unique to domain-specific (vs. general-purpose) software design. Some general issues that cut across particular software design domains include: (1) knowledge representation, acquisition, and maintenance; (2) specialized software design techniques; and (3) user interaction and user interfaces.
Progress and challenges in modelling country-level HIV/AIDS epidemics: the UNAIDS Estimation and Projection Package 2007
The UNAIDS Estimation and Projection Package (EPP) was developed to aid in country-level estimation and short-term projection of HIV/AIDS epidemics. This paper describes advances reflected in the most recent update of this tool (EPP 2007), and identifies key issues that remain to be addressed in future versions. The major change to EPP 2007 is the addition of uncertainty estimation for generalised epidemics using the technique of Bayesian melding, but many additional changes have been made to improve the user interface and efficiency of the package. This paper describes the interface for uncertainty analysis, changes to the user interface for calibration procedures and other user interface changes to improve EPP’s utility in different settings. While formal uncertainty assessment remains an unresolved challenge in low-level and concentrated epidemics, the Bayesian melding approach has been applied to provide analysts in these settings with a visual depiction of the range of models that may be consistent with their data. In fitting the model to countries with longer-running epidemics in sub-Saharan Africa, a number of limitations have been identified in the current model with respect to accommodating behaviour change and accurately replicating certain observed epidemic patterns. This paper discusses these issues along with their implications for future changes to EPP and to the underlying UNAIDS Reference Group model.
A review of GIS-based information sharing systems
GIS-based information sharing systems have been implemented in many of England and Wales' Crime and Disorder Reduction Partnerships (CDRPs). The information sharing role of these systems is seen as being vital to help in the review of crime, disorder and misuse of drugs; to sustain strategic objectives; to monitor interventions and initiatives; and to support action plans for service delivery. This evaluation of these systems aimed to identify the lessons learned from existing systems, identify how these systems can be best used to support the business functions of CDRPs, identify common weaknesses across the systems, and produce guidelines on how these systems should be further developed. At present there are in excess of 20 major systems distributed across England and Wales. This evaluation considered a representative sample of ten systems. To date, little documented evidence has been collected by the systems that demonstrates the direct impact they are having in reducing crime and disorder, and the misuse of drugs. All point to how they are contributing to more effective partnership working, but all systems must be encouraged to record how they are contributing to improving community safety. Demonstrating this impact will help them to assure their future role in their CDRPs. By reviewing the systems as a whole, several key ingredients were identified that contribute to the effectiveness of these systems. These included the need for an effective partnership business model within which the system operates, and the generation of good quality multi-agency intelligence products from the system. In helping to determine the future development of GIS-based information sharing systems, four key community safety partnership business service functions have been identified that these systems can most effectively support.
These functions support the performance review requirements of CDRPs, operate a problem-solving scanning and analysis role, and offer an interface with the public. Following these business service functions as a template will provide for a more effective application of these systems nationally.
Digital analytics: an approach for data quality control
Internship report presented as the partial requirement for obtaining a Master's degree in Data Science and Advanced Analytics, specialization in Data Science. With the emergence of the Digital Era, a new way of analyzing customers' behavior also emerged. It's not only about analyzing data from traditional data warehouses but also about measuring users' digital footprint on websites, mobile applications, and other digital data sources. Nowadays, companies collect data on their digital channels to improve website design and user experience, optimize e-commerce, track and measure the success of actions and programs, identify problems, and improve the digital channels' performance. But the question that arises is how valid, accurate, and complete the data is. Do digital analysts understand each data point they have at their disposal? This internship report gives a detailed view of the critical points of digital analytics data quality and the adjacent problems, and presents a solution to support and help digital analysts overcome some of the challenges in this area.
Guide Me in Analysis: A Framework for Guidance Designers
Guidance is an emerging topic in the field of visual analytics. Guidance can support users in pursuing their analytical goals more efficiently and help in making the analysis successful. However, it is not clear how guidance approaches should be designed and what specific factors should be considered for effective support. In this paper, we approach this problem from the perspective of guidance designers. We present a framework comprising requirements and a set of specific phases designers should go through when designing guidance for visual analytics. We relate this process to a set of quality criteria we aim to support with our framework, which are necessary for obtaining a suitable and effective guidance solution. To demonstrate the practical usability of our methodology, we apply our framework to the design of guidance in three analysis scenarios and a design walk-through session. Moreover, we list the emerging challenges and report how the framework can be used to design guidance solutions that mitigate these issues.