
    Combining Pragmatism and Critical Realism in ICT4D Research: An e-Resilience Case Example

    Part 1: Pushing the Boundaries - New Research Methods, Theory and Philosophy in ICT4D. ICT4D research is strongly oriented to practice but hardly ever explicitly uses the research paradigm of pragmatism. We argue that, though highly relevant to ICT4D, pragmatism suffers some shortcomings in terms of its philosophy of the world, explanatory power, truth-testing, and values. We suggest that “pragmatist-critical realism” – a novel research paradigm combining pragmatism and critical realism – can address these shortcomings and provide a valuable foundation for ICT4D research, particularly action-oriented research. We outline a four-step operational methodology for pragmatist-critical realism, based on a research project that created an “e-resilience” action plan applying ICTs to strengthen the resilience of farming communities in Uganda. We hope other action- and design-oriented ICT4D researchers will be encouraged to assess whether pragmatist-critical realism could form a useful basis for their future research.

    Rethinking ICT4D Impact Assessments: reflections from the Siyakhula Living Lab in South Africa

    The approach to outcome and impact assessments of ICTD has often relied solely on identifying project effects in relation to project baseline data; however, such an approach limits the learning that could occur throughout a project’s lifecycle. Impact assessments should be conducted in a comprehensive manner, taking into account the evaluation data captured from the initiation of the project through to its implementation, and beyond. This study reflects on the implementation of an impact assessment framework based on a comprehensive approach to evaluation. The framework was implemented in the Siyakhula Living Lab to assess its outcomes and impacts on the community. A pragmatic approach was applied through a reflective process to assess the utility of the framework within this context, and semi-structured interviews with project stakeholders were conducted to gain further insight into the comprehensive approach to conducting impact assessments. It was found that a comprehensive approach to assessing impacts provided a meaningful way to understand the effects of the ICTD initiative and gave an overview of project areas that required improvement. However, the proposed assessment framework required a customisation component so that it could be adapted to better suit the project context. Future impact assessments can draw on the lessons gained from following a more comprehensive approach to evaluation, and thus improve learning over time.

    Multiple and mixed methods in formative evaluation: Is more better? Reflections from a South African study

    Background: Formative programme evaluations assess intervention implementation processes and are widely seen as a way of unlocking the ‘black box’ of any programme in order to explore and understand why a programme functions as it does. However, few critical assessments of the methods used in such evaluations are available, and especially few reflect on how well the evaluation achieved its objectives. This paper describes a formative evaluation of a community-based lay health worker programme for TB and HIV/AIDS clients across three low-income communities in South Africa. It assesses each of the methods used in relation to the evaluation objectives and offers suggestions on ways of optimising the use of multiple, mixed methods within formative evaluations of complex health system interventions.
    Methods: The evaluation’s qualitative methods comprised interviews, focus groups, observations and diary keeping. Quantitative methods included a time-and-motion study of the lay health workers’ scope of practice and a client survey. The authors conceptualised and conducted the evaluation and, through iterative discussions, assessed the methods used and their results.
    Results: Overall, the evaluation highlighted programme issues and insights beyond the reach of traditional single-method evaluations. The strengths of the multiple, mixed methods in this evaluation included a detailed description and nuanced understanding of the programme and its implementation, and triangulation of the perspectives and experiences of clients, lay health workers, and programme managers. However, the use of multiple methods needs to be carefully planned and implemented, as this approach can overstretch the logistic and analytic resources of an evaluation.
    Conclusions: For complex interventions, formative evaluation designs that include multiple qualitative and quantitative methods hold distinct advantages over single-method evaluations. However, their value lies not in the number of methods used, but in how each method matches the evaluation questions and the scientific integrity with which the methods are selected and implemented.