7,471 research outputs found

    Using protocol analysis to explore the creative requirements engineering process

    Protocol analysis is an empirical method applied by researchers in cognitive psychology and behavioural analysis. It can be used to collect, document and analyse the thought processes of an individual problem solver. In general, research subjects are asked to think aloud while performing a given task. Their verbal reports are transcribed and represent a sequence of their thoughts and cognitive activities. These verbal reports are analysed to identify relevant segments of cognitive behaviour by the research subjects. The analysis results may be cross-examined or validated through retrospective interviews with the research subjects. This paper offers a critical analysis of this research method, its approaches to data collection and analysis, and its strengths and limitations, and discusses its use in information systems research. The aim is to explore the use of protocol analysis in studying the creative requirements engineering process.

    Understanding requirements engineering process: a challenge for practice and education

    Reviews of the state of professional practice in Requirements Engineering (RE) stress that the RE process is both complex and hard to describe, and suggest there is a significant difference between competent and "approved" practice. "Approved" practice is reflected by (and in all likelihood has its genesis in) RE education, so that the knowledge and skills taught to students do not match the knowledge and skills required and applied by competent practitioners. A new understanding of the RE process has emerged from our recent study. RE is revealed as inherently creative, involving cycles of building and major reconstruction of the models developed, significantly different from the systematic and smoothly incremental process generally described in the literature. The process is better characterised as highly creative, opportunistic and insight-driven. This mismatch between approved and actual practice presents a challenge to RE education: RE requires insight and creativity as well as technical knowledge. Traditional learning models applied to RE focus, however, on notation and prescribed processes acquired through repetition. We argue that traditional learning models fail to support the learning required for RE, and propose both a new model based on cognitive flexibility and a framework for RE education to support this model.

    Analytical usability evaluation for digital libraries: A case study


    Comparison of Interactive Visualization Techniques for Origin-Destination Data Exploration

    Origin-Destination (OD) data is a crucial part of price estimation in the aviation industry; an OD flight is any number of flights a passenger takes in a single journey. OD data is complex, being both a flow and a multidimensional type of data. In this work, the focus is on designing interactive visualization techniques to support user exploration of OD data. The thesis aims to find which of two menu designs suits OD data visualization better: a breadth-first or a depth-first menu design. The two menus follow Shneiderman's Task by Data Taxonomy, a broader version of the Information Seeking Mantra. The first menu design is a parallel, breadth-first layout. It shows the variables in an open layout and is closer to the original data matrix. The second menu design is a hierarchical, depth-first layout. It is derived from the semantics of the data and is more compact in terms of screen space. The two menu designs are compared in an online survey study conducted with potential end users. The results of the survey are inconclusive and are therefore complemented with an expert review. Both the survey study and the expert review show that the Sankey graph is a good visualization type for this work, but the interaction of the two menu designs requires further improvement. Both menu designs received positive and negative feedback in the expert review. For future work, a solution that combines the strengths of the two designs could be considered. ACM Computing Classification System (CCS): Human-centered computing → Visualization → Empirical studies in visualization; Human-centered computing → Interaction design → Interaction design process and methods → Interface design prototyping

    A Work System Front End for Object-Oriented Analysis and Design

    This paper proposes that basic ideas from work system theory (WST) and the work system method (WSM) might serve as a front end to object-oriented analysis and design (OOAD), thereby providing a path from business-oriented descriptions to formal, technical specifications. After describing the background motivation and summarizing work system concepts, the paper uses a hiring system example to show how two tools from WSM can be used as a front end for OOAD, in effect a step before creating use case diagrams and other types of Unified Modeling Language (UML) artifacts. Potential benefits of this approach stem from starting with a business-oriented question, "How can we improve this work system's performance?", rather than an IT-oriented question, "How can we create a technical artifact that will be used?"

    Characterizing High School Students' Systems Thinking in Engineering Design Through the Function-Behavior-Structure (FBS) Framework

    The aim of this research study was to examine high school students' systems thinking when engaged in an engineering design challenge. This study included 12 high school students who were paired into teams of two to work through an engineering design challenge. These dyads were given one hour in their classrooms, with access to a computer and engineering sketching paper, to complete the design. Immediately following the design challenge, the students participated in a post hoc reflective group interview. The methodology of this study was informed by and derived from cognitive science's verbal protocol analysis. Multiple forms of data were gathered and triangulated for analysis. These forms included audio and video recordings of the design challenge and the interview, computer tracking, and student-generated sketches. The data were coded using Gero's FBS framework. These coded data were analyzed using descriptive statistics. The transitions were further analyzed using measures of centrality. Additionally, qualitative analysis techniques were used to understand and interpret systems and engineering design themes and findings. Through the qualitative and quantitative analyses, it was shown that the students demonstrated thinking in terms of systems. The results imply that systems thinking can be part of a high school engineering curriculum. The students considered and explored multiple interconnected variables, both technical and nontechnical in nature. The students showed further systems thinking by optimizing their design through balancing trade-offs of nonlinear interconnected variables. Sketching played an integral part in the students' design process, as it was used to generate, develop, and communicate their designs. Although many of the students recognized their own lack of drawing ability, they understood the role sketching plays in engineering design. Therefore, graphical visualization through sketching is a skill that educators may want to include in their curricula. The qualitative analysis also shed light on analogical reasoning. The students drew from their personal experience, in lieu of professional expertise, to better understand and expand their designs. Hence, the implication for educators is to aid students in using their knowledge, experience, and preexisting schemata to work through an engineering design.

    Doctor of Philosophy

    This dissertation establishes a new visualization design process model devised to guide visualization designers in building more effective and useful visualization systems and tools. The novelty of this framework includes its flexibility for iteration, its actionability in guiding visualization designers with concrete steps, its concise yet methodical definitions, and its connections to other visualization design models commonly used in the field of data visualization. In summary, the design activity framework breaks down the visualization design process into a series of four design activities: understand, ideate, make, and deploy. For each activity, the framework prescribes a descriptive motivation, a list of design methods, and expected visualization artifacts. To elucidate the framework, two case studies for visualization design illustrate these concepts, methods, and artifacts in real-world projects in the field of cybersecurity. For example, these projects employ user-centered design methods, such as personas and data sketches, which emphasize our teams' motivations and visualization artifacts with respect to the design activity framework. These case studies also serve as examples for novice visualization designers, and we hypothesized that the framework could serve as a pedagogical tool for teaching and guiding novices through their own design process to create a visualization tool. To externally evaluate the efficacy of this framework, we created worksheets for each design activity, outlining a series of concrete, tangible steps for novices. To validate the design worksheets, we conducted 13 student observations over the course of two months, received 32 online survey responses, and performed a qualitative analysis of 11 in-depth interviews. Students found the worksheets both useful and effective for framing the visualization design process. Next, by applying the design activity framework to technique-driven and evaluation-based research projects, we brainstormed possible extensions to the design model. Lastly, we examined implications of the design activity framework and present future work in this space. The visualization community is challenged to consider how to more effectively describe, capture, and communicate the complex, iterative nature of data visualization design throughout the research, design, development, and deployment of visualization systems and tools.