An investigation into the validation of formalised cognitive dimensions
The cognitive dimensions framework is a conceptual framework
aimed at characterising features of interactive systems that are strongly influential upon their effective use. As such the framework facilitates the critical assessment and design of a wide variety of information artifacts. Although the framework has proved to be of considerable interest to researchers and practitioners, there has been little research examining how easily the dimensions used by it can be consistently applied. The work reported in this paper addresses this
problem by examining an approach to the systematic application of dimensions and assessing its success empirically. The findings demonstrate a relatively successful approach to validating the systematic application of some concepts found in the cognitive dimensions framework.
Intangible trust requirements - how to fill the requirements trust "gap"?
Previous research has focused on the capture and subsequent instantiation of "soft" trust requirements that relate to HCI usability concerns, or on "hard" tangible security requirements that primarily relate to security assurance and security protocols. Little direct attention has been paid to managing intangible trust-related requirements
per se. This 'gap' is perhaps most evident in the public B2C (Business to Consumer) E-Systems we all use on a daily basis. Some speculative suggestions are made as to how to fill the 'gap'.
Visual card sorting is suggested as a suitable evaluative tool; whilst deontic logic trust norms
and UML extended notation are the suggested (methodologically invariant) means by which software development teams can perhaps more fully capture, and hence visualize, intangible trust requirements.
Developing a valid method to study adaptive behaviours with regard to IEQ in primary schools
Adaptive behaviour impacts the classroom's environment and the student's comfort. Therefore, a deep understanding of students' adaptive behaviour is required. This study aims to develop a valid and reliable method to understand how children in their late middle childhood (9–11) practise adaptive behaviours in response to the classroom's Indoor Environmental Quality (IEQ). A self-reported questionnaire accompanied by an observation form was designed based on children's 'here and now' sensations and their cognitive and linguistic competence. Validity and reliability of the questionnaire were tested by running pilot and field studies in eight primary schools from July 2017 to May 2018. Through transverse sampling, 805 children were observed, and 1390 questionnaires were collected in 31 classrooms. Questions and responses of the designed questionnaire were validated by monitoring the answer process, non-participant observations, cross-checking questions and statistical tests. The validation process improved the wording of the questions and response categories and resulted in a questionnaire with a high and valid response rate. The reliability of the questionnaire was tested by measuring the variability and standard deviations of responses under similar conditions. To conclude, the study introduces a questionnaire and an observation form that should be used together to provide a valid and reliable method for studying the adaptive behaviour of primary school children.
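The reliability check described above (comparing the spread of responses collected under similar conditions) can be sketched as follows. This is an illustrative sketch only, not the authors' actual analysis; the condition labels, rating scale and data are invented for illustration.

```python
# Illustrative sketch of a response-variability reliability check:
# under similar classroom conditions, low spread in children's ratings
# suggests a questionnaire item is reliable.
import statistics

def response_variability(responses_by_condition):
    """Return the standard deviation of ratings for each condition."""
    return {condition: statistics.stdev(ratings)
            for condition, ratings in responses_by_condition.items()}

# Invented example: thermal-sensation ratings (1-5 scale) from two
# classrooms observed at similar indoor temperatures.
responses = {
    "classroom_A_23C": [3, 3, 4, 3, 2, 3],
    "classroom_B_23C": [3, 4, 3, 3, 3, 2],
}

spread = response_variability(responses)
for condition, sd in spread.items():
    print(f"{condition}: sd = {sd:.2f}")
```

Comparable, small standard deviations across the two conditions would indicate that children interpret the item consistently; a large spread in one condition would flag the wording for revision.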
Honeywell Enhancing Airplane State Awareness (EASA) Project: Final Report on Refinement and Evaluation of Candidate Solutions for Airplane System State Awareness
The loss of pilot airplane state awareness (ASA) has been implicated as a factor in several aviation accidents identified by the Commercial Aviation Safety Team (CAST). These accidents were investigated to identify precursors to the loss of ASA and to develop technologies to address it. Based on a gap analysis, two technologies were prototyped and assessed in a formative pilot-in-the-loop evaluation in NASA Langley's full-motion Research Flight Deck. The technologies address: 1) data source anomaly detection in real time, and 2) intelligent monitoring aids that provide nominal and predictive awareness of situations to be monitored, plus a mission timeline to visualize events of interest. The evaluation results indicated favorable impressions of both technologies for mitigating the loss of ASA in terms of operational utility, workload, acceptability, complexity, and usability. The team concludes that there is a feasible retrofit solution for improving ASA that would minimize certification risk, integration costs, and training impact.
A Nine Month Report on Progress Towards a Framework for Evaluating Advanced Search Interfaces considering Information Retrieval and Human Computer Interaction
This is a nine month progress report detailing my research into supporting users in their search for information, where the questions, results or even their ...
Neuroimaging study designs, computational analyses and data provenance using the LONI pipeline.
Modern computational neuroscience employs diverse software tools and multidisciplinary expertise to analyze heterogeneous brain data. The classical problems of gathering meaningful data, fitting specific models, and discovering appropriate analysis and visualization tools give way to a new class of computational challenges--management of large and incongruous data, integration and interoperability of computational resources, and data provenance. We designed, implemented and validated a new paradigm for addressing these challenges in the neuroimaging field. Our solution is based on the LONI Pipeline environment [3], [4], a graphical workflow environment for constructing and executing complex data processing protocols. We developed study-design, database and visual language programming functionalities within the LONI Pipeline that enable the construction of complete, elaborate and robust graphical workflows for analyzing neuroimaging and other data. These workflows facilitate open sharing and communication of data and metadata, concrete processing protocols, result validation, and study replication among different investigators and research groups. The LONI Pipeline features include distributed grid-enabled infrastructure, virtualized execution environment, efficient integration, data provenance, validation and distribution of new computational tools, automated data format conversion, and an intuitive graphical user interface. We demonstrate the new LONI Pipeline features using large scale neuroimaging studies based on data from the International Consortium for Brain Mapping [5] and the Alzheimer's Disease Neuroimaging Initiative [6]. User guides, forums, instructions and downloads of the LONI Pipeline environment are available at http://pipeline.loni.ucla.edu
Predicting operator workload during system design
A workload prediction methodology was developed in response to the need to measure workloads associated with operation of advanced aircraft. The application of the methodology will involve: (1) conducting mission/task analyses of critical mission segments and assigning estimates of workload for the sensory, cognitive, and psychomotor workload components of each task identified; (2) developing computer-based workload prediction models using the task analysis data; and (3) exercising the computer models to produce predictions of crew workload under varying automation and/or crew configurations. Critical issues include reliability and validity of workload predictors and selection of appropriate criterion measures
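The three-step methodology above (task analysis with per-component workload estimates, a computer-based prediction model, and exercising the model under varying configurations) can be sketched in miniature as follows. This is a hypothetical sketch: the task names, rating scale and additive model are invented for illustration, not taken from the report.

```python
# Hypothetical sketch of the workload prediction methodology:
# step 1 supplies per-task estimates for the sensory, cognitive and
# psychomotor components; step 2 is a simple additive model over the
# tasks active in a mission segment; step 3 exercises the model under
# different crew/automation configurations.
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    sensory: float      # analyst's estimate, e.g. on a 0-7 scale
    cognitive: float
    psychomotor: float

def segment_workload(tasks):
    """Predict component workloads for one mission segment by summing
    the per-task estimates (a deliberately simple additive model)."""
    return {
        "sensory": sum(t.sensory for t in tasks),
        "cognitive": sum(t.cognitive for t in tasks),
        "psychomotor": sum(t.psychomotor for t in tasks),
    }

# Step 3: compare two invented crew configurations for the same segment.
manual = [Task("monitor instruments", 4.0, 3.0, 1.0),
          Task("fly approach manually", 5.0, 4.0, 5.0)]
automated = [Task("monitor instruments", 4.0, 3.0, 1.0),
             Task("supervise autopilot", 3.0, 2.0, 0.5)]

print("manual:   ", segment_workload(manual))
print("automated:", segment_workload(automated))
```

In a real model the additive assumption would itself be a candidate for the validity checks the abstract raises: whether summed component estimates actually predict measured crew workload is an empirical question.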
Designing Interactive Graphics for Validating and Interpreting Storm Track Model Outputs
We report on some initial work in which we designed interactive graphics to help climate scientists identify and extract good examples of simulated storm tracks from a large dataset, in order to disseminate information to various audiences. A side-effect of this work was that the exploratory potential offered by the interactive graphics helped our climate scientist coauthors validate and interpret their data in a way that was not previously possible for them. We are extending this work to support a wider range of validation and interpretative tasks, with a focus on answering questions of relevance to the insurance industry. We describe our collaborative approach, which draws on ideas from 'patchwork prototyping' [2, 5], in which a rapid iterative process of design, implementation and testing is used to help provide the functionality to support a set of 'user stories'.