19 research outputs found

    Process evaluation of an intervention to improve HIV treatment outcomes among children and adolescents.

    SETTING: Children and adolescents with HIV encounter challenges in initiating and adhering to antiretroviral therapy (ART). A community-based support intervention of structured home visits, aimed at improving initiation, adherence, and treatment outcomes, was delivered by community health workers (CHWs) to children and adolescents newly diagnosed with HIV.
    OBJECTIVES: To 1) describe intervention delivery, 2) explore CHW, caregiver, and adolescent perceptions of the intervention, 3) identify barriers and facilitators to implementation, and 4) ascertain treatment outcomes at 12 months post-HIV diagnosis.
    DESIGN: We drew upon: 1) semi-structured interviews (n = 22) with 5 adolescents, 11 caregivers, and 6 CHWs; 2) 28 CHW field manuals; and 3) quantitative data for study participants (demographic information and HIV clinical outcomes).
    RESULTS: Forty-one children received at least part of the intervention. Of the 32 whose viral load was tested, 26 (81.3%) were virally suppressed. Interviewees felt that the intervention supported ART adherence and strengthened mental health. Facilitators of intervention delivery were convenience and rapport between CHWs and families; stigma, difficulty locating participants, and inadequate resources for CHWs were barriers.
    CONCLUSION: The intervention was helpful in supporting HIV treatment adherence among children and adolescents. The facilitators and barriers identified may be useful in designing future interventions.

    Track E Implementation Science, Health Systems and Economics

    Peer reviewed. Full text: https://deepblue.lib.umich.edu/bitstream/2027.42/138412/1/jia218443.pd

    Student Participation Index: Student Assessment in Online Courses


    HCI education and CHI 97


    Is critical thinking happening? Testing content analysis schemes applied to MOOC discussion forums

    Learners' progress within computer-supported collaborative learning environments is typically measured via analysis and interpretation of quantitative web interaction measures. However, the usefulness of these "proxies for learning" is questioned, as they do not necessarily reflect critical thinking, an essential component of collaborative learning. Research indicates that pedagogical content analysis schemes have value in measuring critical discourse in small-scale, formal online learning environments, but research using these methods on high-volume, informal Massive Open Online Course (MOOC) forums is less common. The challenge in this setting is to develop valid and reliable indicators that operate successfully at scale. In this study, we test two established coding schemes for the pedagogical content analysis of online discussions in a large-scale review of MOOC comment data. Pedagogical Scores are derived from manual ratings of comments and correlated with automatically derived linguistic and interaction indicators. Results show that the content analysis methods are reliable and very strongly correlated with each other, suggesting that their specific format is not significant in this setting. In addition, the methods are strongly associated with the relevant linguistic indicators of higher levels of learning and have weaker correlations with other linguistic and interaction metrics. This suggests promise for further research using machine learning techniques, with the goal of providing realistic feedback to instructors, learners, and learning designers.
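
    As a rough illustration of the analysis this abstract describes, the following is a minimal Python sketch of the two core steps: checking the reliability of manual pedagogical coding across raters, then correlating the resulting scores with an automatically derived linguistic indicator. All data, rater names, and the choice of word count as the indicator are illustrative assumptions, not values or methods taken from the paper.

    # Hypothetical sketch (not from the paper): inter-rater reliability of
    # manual pedagogical coding, then correlation of the consensus score
    # with an automatic linguistic indicator. All values are illustrative.
    import numpy as np
    from scipy.stats import spearmanr
    from sklearn.metrics import cohen_kappa_score

    # Two raters score the same ten comments on a 1-5 pedagogical scheme.
    rater_a = np.array([1, 2, 2, 4, 5, 3, 4, 2, 5, 3])
    rater_b = np.array([1, 2, 3, 4, 5, 3, 4, 2, 4, 3])

    # Weighted kappa measures agreement while tolerating near-misses on
    # an ordinal scale.
    kappa = cohen_kappa_score(rater_a, rater_b, weights="quadratic")
    print(f"Inter-rater agreement (weighted kappa): {kappa:.2f}")

    # A simple automatic indicator for the same comments, e.g. word count.
    word_counts = np.array([12, 35, 40, 120, 150, 60, 95, 30, 140, 70])

    # Correlate the consensus pedagogical score with the indicator.
    consensus = (rater_a + rater_b) / 2
    rho, p = spearmanr(consensus, word_counts)
    print(f"Spearman rho = {rho:.2f} (p = {p:.3f})")

    Spearman's rank correlation is used here because coding-scheme scores are ordinal; the abstract does not specify which correlation or reliability measures the study used, so both choices are assumptions.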