    Evaluation of mobile health education applications for health professionals and patients

    Paper presented at the 8th International Conference on e-Health (EH 2016), 1-3 July 2016, Funchal, Madeira, Portugal. Mobile applications for health education are commonly used to support patients and health professionals. A critical evaluation framework is needed to ensure the usability and reliability of mobile health education applications and to save time and effort for the various user groups; the aim of this paper is therefore to describe a framework for evaluating mobile applications for health education. The intended outcome of this framework is to meet the needs and requirements of the different user categories and to improve the development of mobile health education applications with software engineering approaches, by creating new and more effective techniques to evaluate such software. This paper first highlights the importance of mobile health education apps, then explains the need for an evaluation framework for these apps. The paper describes the evaluation framework, along with specific evaluation metrics: an efficient hybrid of selected heuristic evaluation (HE) and usability evaluation (UE) factors that enables the determination of the usefulness and usability of health education mobile apps. Finally, initial results for the framework, obtained using the Medscape mobile app, are presented. The proposed framework, an Evaluation Framework for Mobile Health Education Apps, is a hybrid of five metrics selected from a larger set in heuristic and usability evaluation, filtered based on interviews with patients and health professionals. These five metrics correspond to specific facets of usability identified through a requirements analysis of typical users of mobile health apps. They were decomposed into 21 specific questionnaire questions, which are available on request from the first author.
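    The abstract names five hybrid HE/UE metrics decomposed into 21 questionnaire questions but does not enumerate them. As a minimal sketch only, assuming hypothetical metric names and a 1-5 Likert scale, this is one way per-metric scores could be aggregated from a completed questionnaire:

```python
from statistics import mean

# Hypothetical decomposition of the framework's five metrics into 21
# questions; the actual metric names and question mapping are available
# only from the paper's first author.
METRIC_QUESTIONS = {
    "learnability":        ["q1", "q2", "q3", "q4"],
    "efficiency":          ["q5", "q6", "q7", "q8"],
    "error_prevention":    ["q9", "q10", "q11", "q12"],
    "satisfaction":        ["q13", "q14", "q15", "q16"],
    "content_reliability": ["q17", "q18", "q19", "q20", "q21"],
}

def metric_scores(responses: dict[str, int]) -> dict[str, float]:
    """Average a user's 1-5 Likert answers into one score per metric."""
    return {
        metric: mean(responses[q] for q in questions)
        for metric, questions in METRIC_QUESTIONS.items()
    }

# Example: one respondent answering 4 ("agree") to all 21 questions.
answers = {f"q{i}": 4 for i in range(1, 22)}
print(metric_scores(answers))
```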

    Embedding accessibility and usability: considerations for e-learning research and development projects

    This paper makes the case that if e-learning research and development projects are to be successfully adopted in real-world teaching and learning contexts, then they must effectively address accessibility and usability issues, and that these need to be integrated throughout the project. Accessibility and usability issues therefore need to be made explicit in project documentation, along with the allocation of appropriate resources and time. We argue that accessibility and usability are intrinsically inter-linked. An integrated accessibility and usability evaluation methodology that we have developed is presented and discussed. The paper draws on a series of mini-case studies from e-learning projects undertaken over the past 10 years at the Open University.

    Designing and evaluating the usability of a machine learning API for rapid prototyping music technology

    To better support the needs of creative software developers and music technologists, and to empower them as machine learning users and innovators, the usability of, and developer experience with, machine learning tools must be considered and better understood. We review background research on the design and evaluation of application programming interfaces (APIs), with a focus on the domain of machine learning for music technology software development. We present the design rationale for the RAPID-MIX API, an easy-to-use API for rapid prototyping with interactive machine learning, and a usability evaluation study with software developers of music technology. A cognitive dimensions (CDs) questionnaire was designed and delivered to a group of 12 participants who used the RAPID-MIX API in their software projects, including people who developed systems for personal use and professionals developing software products for music and creative technology companies. The results from the questionnaire indicate that participants found the RAPID-MIX API easy to learn and use, fun, and well suited to rapid prototyping with interactive machine learning. Based on these findings, we present an analysis and characterization of the RAPID-MIX API based on the CDs framework, and discuss its design trade-offs and usability issues. We use these insights and our design experience to provide design recommendations for ML APIs for rapid prototyping of music technology. We conclude with a summary of the main insights, a discussion of the merits and challenges of applying the CDs framework to the evaluation of machine learning APIs, and directions for future work that our research identifies as valuable.
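    The cognitive dimensions framework (Green and Petre) characterises a notation or API along dimensions such as viscosity, hidden dependencies, and closeness of mapping. As a hedged sketch, assuming a 1-5 rating per dimension (the study's actual questionnaire items and scale are not given in the abstract), per-dimension results across 12 participants could be summarised like this:

```python
import random
from statistics import mean, stdev

# A few genuine dimensions from the CDs framework; the 1-5 rating scale
# and the simulated responses below are assumptions for illustration.
DIMENSIONS = ["viscosity", "hidden dependencies", "consistency",
              "error-proneness", "closeness of mapping"]

def summarize(ratings: list[dict[str, int]]) -> dict[str, tuple[float, float]]:
    """Mean and standard deviation for each dimension across participants."""
    return {
        d: (mean(r[d] for r in ratings), stdev(r[d] for r in ratings))
        for d in DIMENSIONS
    }

# Simulate 12 participants, matching the study's sample size.
random.seed(0)
participants = [{d: random.randint(1, 5) for d in DIMENSIONS} for _ in range(12)]
for dim, (m, s) in summarize(participants).items():
    print(f"{dim}: mean={m:.2f}, sd={s:.2f}")
```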

    A hybrid evaluation approach and guidance for mHealth education applications

    © Springer International Publishing AG 2018. Mobile health education applications (MHEAs) are used to support different user groups. However, although these applications are increasing in number, there is no effective evaluation framework to measure their usability and thereby save effort and time for their many user groups. This paper outlines a framework for evaluating MHEAs, together with particular evaluation metrics: an efficient hybrid of selected heuristic evaluation (HE) and usability evaluation (UE) factors to enable the determination of the usefulness and usability of MHEAs. We also propose a guidance tool to help stakeholders choose the most suitable MHEA. The framework is intended to meet the requirements of different users and to enhance the development of MHEAs through software engineering approaches, by creating new and more effective evaluation techniques. Finally, we present qualitative and quantitative results for the framework when used with MHEAs.

    Heuristic Evaluation for Serious Immersive Games and M-instruction

    © Springer International Publishing Switzerland 2016. Two fast-growing areas of technology-enhanced learning are serious games and mobile instruction (M-instruction, or M-learning). Serious games are meant to be more than just entertainment: they have a serious use, to educate or to promote other types of activity. Immersive games frequently involve many players interacting in a shared, rich and complex (perhaps web-based) mixed-reality world, where their circumstances will be many and varied. Their reality may be augmented and often self-composed, as in a user-defined avatar in a virtual world. M-instruction is learning on the move; much of modern computer use happens on smart devices, tablets, and laptops, and people use these devices everywhere, so it is natural to want to learn on them wherever one happens to be. This presents a problem if we wish to evaluate the effectiveness of the pedagogic media being used: we have no way of knowing learners' situations, circumstances, educational backgrounds and motivations, or how they may have customised the final software. Reaching the end user may also be problematic, as these are learning environments that people dip into at opportune moments. If access to the end user is hard because of location and user self-personalisation, then one solution is to examine the software before it goes out. Heuristic evaluation allows us to get user interface (UI) and user experience (UX) experts to reflect on the software before it is deployed. Here, the effective use of heuristic evaluation with pedagogical software [1] is extended with existing heuristic evaluation methods to make the technique applicable to serious immersive games and M-instruction. We also consider how existing heuristic methods may be adapted. The result represents a new way of making this methodology applicable to this developing area of learning technology.
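    In a heuristic evaluation of this kind, several UI/UX experts inspect the software against a heuristic set and rate each violation; Nielsen's widely used severity scale runs from 0 (not a problem) to 4 (usability catastrophe). A minimal sketch of collating expert findings before deployment, using Nielsen's general heuristic names rather than a games-specific set (the example findings are invented):

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Finding:
    heuristic: str  # the heuristic the issue violates
    severity: int   # Nielsen's scale: 0 (not a problem) .. 4 (catastrophe)
    note: str

def collate(findings: list[Finding]) -> dict[str, list[int]]:
    """Group severity ratings by heuristic so the worst areas stand out."""
    by_heuristic: dict[str, list[int]] = defaultdict(list)
    for f in findings:
        by_heuristic[f.heuristic].append(f.severity)
    return dict(by_heuristic)

# Invented findings from two hypothetical expert reviewers.
report = collate([
    Finding("visibility of system status", 3, "no feedback while the world loads"),
    Finding("user control and freedom", 4, "no way to exit a mini-game"),
    Finding("visibility of system status", 2, "avatar sync lag goes unreported"),
])
for heuristic, severities in sorted(report.items(), key=lambda kv: -max(kv[1])):
    print(f"{heuristic}: max severity {max(severities)} across {len(severities)} issue(s)")
```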

    A guidance and evaluation approach for mHealth education applications

    © Springer International Publishing AG 2017. A growing number of mobile applications for health education are being used to support different stakeholders, from health professionals to software developers to patients and more general users, but a critical evaluation framework to ensure the usability and reliability of these mobile health education applications (MHEAs) is lacking. Such a framework would save time and effort for the different user groups. This paper describes a framework for evaluating mobile applications for health education, including a guidance tool to help different stakeholders select the application most suitable for them. The framework is intended to meet the needs and requirements of the different user categories, as well as to improve the development of MHEAs through software engineering approaches. A description of the evaluation framework is provided, with its efficient hybrid of selected heuristic evaluation (HE) and usability evaluation (UE) factors. Lastly, an account of the quantitative and qualitative results for the framework applied to Medscape and other mobile apps is given. The proposed framework, an Evaluation Framework for Mobile Health Education Apps, consists of a hybrid of five metrics selected from a larger set during heuristic and usability evaluation, the choice being based on interviews with patients, software developers and health professionals.
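    The abstract does not specify how the guidance tool ranks applications. One plausible sketch, assuming each stakeholder group weights the framework's metrics differently (all weights, metric names, and scores below are hypothetical):

```python
# Hypothetical per-stakeholder weights over the framework's metrics, and
# per-app metric scores such as a questionnaire might produce.
WEIGHTS = {
    "patient":      {"learnability": 0.4, "satisfaction": 0.3, "content_reliability": 0.3},
    "professional": {"efficiency": 0.5, "content_reliability": 0.5},
}

APP_SCORES = {
    "Medscape": {"learnability": 4.2, "efficiency": 4.0,
                 "satisfaction": 3.8, "content_reliability": 4.5},
    "AppB":     {"learnability": 3.9, "efficiency": 4.4,
                 "satisfaction": 4.1, "content_reliability": 3.6},
}

def rank_for(stakeholder: str) -> list[tuple[str, float]]:
    """Rank candidate apps by the stakeholder's weighted metric scores."""
    weights = WEIGHTS[stakeholder]
    scored = [
        (app, sum(w * scores[metric] for metric, w in weights.items()))
        for app, scores in APP_SCORES.items()
    ]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

print(rank_for("patient"))  # ranks Medscape vs AppB under patient weights
```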

    Evaluating complex digital resources

    Squires (1999) discussed the gap between the HCI (human-computer interaction) and educational computing communities in their very different approaches to evaluating educational software. This paper revisits that issue in the context of evaluating digital resources, focusing on two approaches to evaluation: an HCI perspective and an educational perspective. Squires and Preece's HCI evaluation model is predictive (it helps teachers decide whether or not to use educational software), whilst our own concern is with evaluating the use of learning technologies. It is suggested that the different approaches of the two communities relate in part to the different focus each takes: in HCI the focus is typically on development, and hence on usability, whilst in education the concern is with how learners and teachers use the resource.

    Mobile Application Usability: Heuristic Evaluation and Evaluation of Heuristics

    Ger Joyce, Mariana Lilley, Trevor Barker, and Amanda Jefferies, 'Mobile Application Usability: Heuristic Evaluation and Evaluation of Heuristics', paper presented at the AHFE 2016 International Conference on Human Factors, Software, and Systems Engineering, Walt Disney World, Florida, USA, 27-31 July 2016. Many traditional usability evaluation methods do not consider mobile-specific issues, which can result in mobile applications riddled with usability issues. We empirically evaluate three sets of usability heuristics for use with mobile applications, including a set defined by the authors. While the set of heuristics defined by the authors surfaces more usability issues in a mobile application than the other sets, improvements to the set can still be made.
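    The abstract does not list the three heuristic sets or the issues each surfaced. As an invented illustration of the comparison it describes, counting the distinct usability issues each set catches and those only one set finds:

```python
# Invented issue IDs for illustration; "set A" and "set B" stand in for
# the two existing heuristic sets compared against the authors' set.
issues_by_set = {
    "set A":        {"i1", "i2", "i3", "i5"},
    "set B":        {"i2", "i3", "i4", "i6"},
    "authors' set": {"i1", "i2", "i3", "i4", "i6", "i7"},
}

all_issues = set().union(*issues_by_set.values())
for name, found in issues_by_set.items():
    others = set().union(*(s for n, s in issues_by_set.items() if n != name))
    unique = found - others
    print(f"{name}: {len(found)}/{len(all_issues)} issues, "
          f"{len(unique)} found by this set alone")
```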