17 research outputs found

    Assessment of the quality and variability of health information on chronic pain websites using the DISCERN instrument

    Background: The Internet is used increasingly by providers as a tool for disseminating pain-related health information and by patients as a resource about health conditions and treatment options. However, health information on the Internet remains unregulated and varies in quality, accuracy and readability. The objective of this study was to determine the quality of pain websites and to explain the variability in quality and readability between them.

    Methods: Five key terms (pain, chronic pain, back pain, arthritis, and fibromyalgia) were entered into the Google, Yahoo and MSN search engines. Websites were assessed using the DISCERN instrument as a quality index, and grade-level readability was rated using the Flesch-Kincaid readability algorithm. Univariate (alpha = 0.20) and multivariable (alpha = 0.05) regression analyses were used to explain the variability in DISCERN scores and grade-level readability, with potential for commercial gain, health-related seals of approval, language(s) and multimedia features as independent variables.

    Results: A total of 300 websites were assessed; 21 were excluded in accordance with the exclusion criteria and 110 were duplicates, leaving 161 unique sites. About 6.8% (11/161) of the websites offered patients commercial products for their pain condition, 36.0% (58/161) carried a health-related seal of approval, 75.8% (122/161) presented information in English only, and 40.4% (65/161) offered an interactive multimedia experience. Across the unique websites, the overall average DISCERN score was 55.9 (13.6) out of a maximum of 80, and the average readability was grade level 10.9 (3.9). The multivariable regressions demonstrated that health-related seals of approval (P = 0.015) and potential for commercial gain (P = 0.189) were contributing factors to higher DISCERN scores, while seals of approval (P = 0.168) and interactive multimedia (P = 0.244) contributed to lower grade-level readability, as indicated by estimates of the beta coefficients.

    Conclusion: The overall quality of pain websites is moderate, with some shortcomings. Websites that scored high on the DISCERN questionnaire carried health-related seals of approval and offered commercial solutions for pain-related conditions, while those with low readability grade levels offered interactive multimedia options and were endorsed by health seals.
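
    For context, the Flesch-Kincaid grade level used in the Methods above is a fixed linear formula over average sentence length and average syllables per word. The following Python sketch illustrates the published formula; the regex sentence splitter and vowel-group syllable counter are crude illustrative assumptions, not the tokenisation of any particular readability tool.

        import re

        def count_syllables(word):
            # Crude heuristic: one syllable per run of consecutive vowels.
            # Real readability tools use dictionaries or finer rules.
            return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

        def flesch_kincaid_grade(text):
            # Naive sentence and word tokenisation via regexes.
            sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
            words = re.findall(r"[A-Za-z']+", text)
            syllables = sum(count_syllables(w) for w in words)
            # Published Flesch-Kincaid grade-level coefficients.
            return (0.39 * len(words) / len(sentences)
                    + 11.8 * syllables / len(words)
                    - 15.59)

        print(round(flesch_kincaid_grade(
            "Chronic pain is common. Treatment options vary widely."), 1))

    Only the tokenisation differs between implementations; the coefficients 0.39, 11.8 and 15.59 are fixed, so the grade level of 10.9 reported above corresponds to text pitched at roughly an eleventh-grade reader.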

    Evaluating Written Patient Information for Eczema in German: Comparing the Reliability of Two Instruments, DISCERN and EQIP

    Patients actively seek information about how to cope with their health problems, but the quality of the information available varies. A number of instruments have been developed to assess the quality of patient information, though primarily in English. Little is known about the reliability of these instruments when applied to patient information in German. The objective of our study was to investigate and compare the reliability of two validated instruments, DISCERN and EQIP, in order to determine which is better suited for a further study on the quality of information available to German patients with eczema. Two independent raters evaluated a random sample of 20 informational brochures in German. All the brochures addressed eczema as a disorder and/or its therapy options and care. Intra-rater and inter-rater reliability were assessed by calculating intra-class correlation coefficients, agreement was tested with weighted kappas, and the correlation of the raters’ scores for each instrument was measured with Pearson’s correlation coefficient. DISCERN demonstrated substantial intra- and inter-rater reliability, and showed slightly better agreement than EQIP. The raters’ scores were strongly correlated for both instruments. The findings of this study support the reliability of both DISCERN and EQIP. However, based on the results of the inter-rater reliability, agreement and correlation analyses, we consider DISCERN to be the more precise tool for our project on patient information concerning the treatment and care of eczema.
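
    To make the agreement statistics above concrete, here is a minimal Python sketch over hypothetical two-rater data: a linearly weighted kappa via scikit-learn and Pearson's r via SciPy. The intra-class correlation step is omitted for brevity (libraries such as pingouin provide it), and the simulated 1-5 ratings are illustrative only.

        import numpy as np
        from scipy.stats import pearsonr
        from sklearn.metrics import cohen_kappa_score

        # Hypothetical 1-5 ratings from two raters for 20 brochures.
        rng = np.random.default_rng(0)
        rater_a = rng.integers(1, 6, size=20)
        rater_b = np.clip(rater_a + rng.integers(-1, 2, size=20), 1, 5)

        # Weighted kappa penalises disagreements by their distance,
        # which suits ordinal quality ratings.
        kappa = cohen_kappa_score(rater_a, rater_b, weights="linear")

        # Pearson correlation of the two raters' scores.
        r, p = pearsonr(rater_a, rater_b)

        print(f"weighted kappa = {kappa:.2f}, Pearson r = {r:.2f} (p = {p:.3f})")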

    An evaluation of the content and quality of tinnitus information on websites preferred by General Practitioners

    Background: Tinnitus is a prevalent and complex medical complaint, often co-morbid with stress, anxiety, insomnia, depression, and cognitive or communication difficulties. Its chronicity places a major burden on primary and secondary healthcare services. In our recent national survey of General Practitioners (GPs) from across England, many reported that their awareness of tinnitus was limited and that, as a result, they were dissatisfied with the service they currently provide. GPs identified 10 online sources of information they currently use in clinical practice, but welcomed further concise and accurate information on tinnitus assessment and management. The purpose of this study was to assess the content, reliability, and quality of the information related to primary care tinnitus assessment and management on these 10 websites.

    Methods: Tinnitus-related content on each website was assessed using a summative content analysis approach. Reliability and quality of the information were assessed using the validated DISCERN questionnaire.

    Results: Significant inter-rater reliability was confirmed by Kendall’s coefficient of concordance (Wt), which ranged from 0.48 to 0.92 across websites. The website Map of Medicine achieved the highest overall DISCERN score. However, for information on treatment choice, the British Tinnitus Association was rated best. Content analysis revealed that all websites lacked a number of details relating to either tinnitus assessment or management options.

    Conclusions: No single website provides comprehensive information for GPs on tinnitus assessment and management, so GPs may need to consult more than one if they want to maximise their coverage of the topic. From those preferred by GPs, we recommend several specific websites as the current ‘best’ sources. Our findings should guide healthcare website providers to improve the quality and inclusiveness of the information they publish on tinnitus; in the case of one website, our preliminary findings are already doing so. Such developments will in turn help facilitate best practice in primary care.
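
    Kendall's coefficient of concordance quantifies how consistently multiple raters rank the same set of items, from 0 (no agreement) to 1 (perfect agreement). A minimal Python sketch of the standard formula follows; it omits the tie correction that the Wt notation above suggests, and the two-rater scores for 16 DISCERN-style items are hypothetical.

        import numpy as np
        from scipy.stats import rankdata

        def kendalls_w(scores):
            # scores: (m raters) x (n items) matrix of ratings.
            # Convert each rater's ratings to ranks; ties get average ranks.
            ranks = np.apply_along_axis(rankdata, 1, scores)
            m, n = ranks.shape
            rank_sums = ranks.sum(axis=0)
            # Sum of squared deviations of rank sums from their expected mean.
            s = ((rank_sums - m * (n + 1) / 2) ** 2).sum()
            # Standard Kendall's W, without the correction for ties.
            return 12 * s / (m ** 2 * (n ** 3 - n))

        rater_scores = np.array([
            [3, 4, 5, 2, 4, 3, 5, 4, 2, 3, 4, 5, 3, 2, 4, 5],
            [3, 4, 4, 2, 5, 3, 5, 4, 3, 3, 4, 5, 3, 2, 4, 4],
        ])
        print(f"W = {kendalls_w(rater_scores):.2f}")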