17 research outputs found

    The Development of Medical Record Items: a User-centered, Bottom-up Approach

    Objectives: Clinical documents (CDs) have evolved from traditional paper documents containing narrative text into electronic record sheets composed of itemized records, where each record is expressed as an item with a specific value. We defined medical record (MR) items as information entities with a specific value. These entities were then used to compile form-based clinical documents as part of an electronic health record system (EHR-s). Methods: We took a reusable, bottom-up developmental approach to the MR items, which provided three things: efficient incorporation of the local needs and requirements of medical professionals from various departments in the hospital, comprehensive inclusion of the essential concepts of the basic elements required in clinical documents, and a structured means for meaningful data entry and retrieval. This paper delineates our experience in developing and managing medical record items at a large tertiary university hospital in Korea. Results: We collected 63,232 MR items from paper records scanned into 962 CDs. The MR item database was constructed from 13,287 MR items after removing redundant items. During the first year of service, users requested changes to the attributes of 235 (1.8%) of the MR items and requested 9,572 additional new MR items. In the second year, the attributes of 70 (0.5%) of the existing MR items were changed and 3,704 new items were added. The number of registered MR items increased by 72.0% in the first year and 27.9% in the second year. Conclusions: The MR item concept provides an easier and more structured means of data entry within an EHR-s. Using these MR items, various kinds of clinical documents can be easily constructed, and medical information can be reused and retrieved as data.
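    A minimal sketch of how an itemized record model of this kind might be represented in code. The class and field names (MRItem, ClinicalDocument, and so on) are illustrative assumptions, not the hospital's actual EHR-s schema.

    ```python
    from dataclasses import dataclass, field
    from typing import List, Optional

    # Illustrative sketch only: names and attributes are assumptions,
    # not the actual schema described in the abstract.

    @dataclass
    class MRItem:
        """An information entity that holds a single, typed value."""
        item_id: str
        label: str                      # e.g. "Systolic blood pressure"
        value_type: str                 # e.g. "number", "text", "code"
        unit: Optional[str] = None      # e.g. "mmHg"
        value: Optional[object] = None  # the specific value entered by the user

    @dataclass
    class ClinicalDocument:
        """A form-based clinical document composed of reusable MR items."""
        title: str
        items: List[MRItem] = field(default_factory=list)

        def retrieve(self, label: str):
            # Structured retrieval: look items up by label rather than
            # parsing narrative text.
            return [i.value for i in self.items if i.label == label]

    # Example: composing an admission note from a shared MR item
    bp = MRItem("MR-0001", "Systolic blood pressure", "number", unit="mmHg", value=128)
    note = ClinicalDocument("Admission note", items=[bp])
    print(note.retrieve("Systolic blood pressure"))  # [128]
    ```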

    Web-based multimodal graphs for visually impaired people

    This paper describes the development and evaluation of Web-based multimodal graphs designed for visually impaired and blind people. The information in the graphs is conveyed to visually impaired people through haptic and audio channels. The motivation of this work is to address problems faced by visually impaired people in accessing graphical information on the Internet, particularly the common types of graphs for data visualization. In our work, line graphs, bar charts and pie charts are accessible through a force feedback device, the Logitech WingMan Force Feedback Mouse. Pre-recorded sound files are used to represent graph contents to users. In order to test the usability of the developed Web graphs, an evaluation was conducted with bar charts as the experimental platform. The results showed that the participants could successfully use the haptic and audio features to extract information from the Web graphs.
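    A minimal sketch of the general multimodal idea, pairing an audio cue with a haptic cue for each bar of a chart. The play_sound and apply_force functions are hypothetical placeholders, standing in for the pre-recorded sound playback and WingMan force-feedback calls used in the actual system.

    ```python
    # Illustrative sketch only: play_sound() and apply_force() are
    # hypothetical placeholders for the audio and force-feedback APIs.

    BARS = {"Q1": 12, "Q2": 30, "Q3": 21}   # example bar-chart data

    def play_sound(filename: str) -> None:
        print(f"[audio] playing {filename}")

    def apply_force(strength: float) -> None:
        print(f"[haptic] resisting cursor with force {strength:.2f}")

    def explore_bar(label: str) -> None:
        """Triggered when the pointer enters a bar's region."""
        value = BARS[label]
        # Audio channel: a pre-recorded file announces the bar and its value.
        play_sound(f"{label}_{value}.wav")
        # Haptic channel: force proportional to the bar's height helps the
        # user trace the bar with the force-feedback mouse.
        apply_force(value / max(BARS.values()))

    for bar in BARS:
        explore_bar(bar)
    ```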

    Identifying back pain subgroups: developing and applying approaches using individual patient data collected within clinical trials


    Web servers under overload: How scheduling can help

    This article provides a detailed implementation study of the behavior of web servers serving static requests under load that fluctuates over time (transient overload). Various external factors are considered, including WAN delays and losses and different client behavior models. We find that performance can be dramatically improved via a kernel-level modification to the web server that changes the scheduling policy from the standard FAIR (processor-sharing) scheduling to SRPT (shortest-remaining-processing-time) scheduling. We find that SRPT scheduling induces no penalties: in particular, throughput is not sacrificed, and requests for long files experience only negligibly higher response times under SRPT than they did under the original FAIR scheduling.
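    A minimal sketch of the SRPT idea: at every step, the server devotes its bandwidth to the request with the fewest bytes remaining, so short requests finish quickly even when a long one arrived first. This is a simplified single-link simulation, not the kernel-level modification described in the article.

    ```python
    import heapq

    # Simplified simulation of SRPT scheduling for static file requests.
    # At each time step the full link bandwidth goes to the request with
    # the fewest bytes remaining.

    def srpt_schedule(requests):
        """requests: list of (arrival_time, size_in_bytes). Returns (request_id, completion_time)."""
        pending = sorted(requests)          # ordered by arrival time
        heap = []                           # (remaining_bytes, request_id)
        t, i, done = 0, 0, []
        bandwidth = 1                       # bytes served per time unit
        while heap or i < len(pending):
            # Admit newly arrived requests.
            while i < len(pending) and pending[i][0] <= t:
                heapq.heappush(heap, (pending[i][1], i))
                i += 1
            if not heap:
                t = pending[i][0]           # idle until the next arrival
                continue
            remaining, rid = heapq.heappop(heap)
            remaining -= bandwidth
            t += 1
            if remaining <= 0:
                done.append((rid, t))
            else:
                heapq.heappush(heap, (remaining, rid))
        return done

    # The short requests (ids 1 and 2) finish before the long one (id 0).
    print(srpt_schedule([(0, 10), (1, 2), (2, 3)]))  # [(1, 3), (2, 6), (0, 15)]
    ```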

    The recognition of web pages' hyperlinks by people with intellectual disabilities: an evaluation study

    BACKGROUND: One of the most frequently mentioned problems of web accessibility, as recognized in several studies, is the difficulty of perceiving what is or is not clickable on a web page. In particular, a key problem is the recognition of hyperlinks by a specific group of people, namely those with intellectual disabilities. MATERIALS AND METHODS: This experiment investigated a methodology based on direct observation, video recording, interviews and data obtained from an eye-tracking device. Ten participants took part in this study. They were divided into two groups and asked to perform two tasks, 'Sing a song' and 'Listen to a story', on two websites developed with specific design details for the study: the first website presented an image navigation menu (INM), whereas the other showed a text navigation menu (TNM). RESULTS: There was a general improvement in the participants' performance when using the INM. CONCLUSION: The analysis shows that these specific participants not only gained a better understanding of the task demanded of them, but also showed improved perception of the content of the navigation menu that included hyperlinks with images.
