Journal of MultiDisciplinary Evaluation (JMDE)
    470 research outputs found

    In Memoriam - Chris Coryn


    The Evaluation of Equity-Focused Community Coalitions: A Review of the Empirical Literature

    Conducting evaluation and research with community coalitions involved in health equity initiatives is inherently complex. In this paper we provide a review and synthesis of the empirical literature on the evaluation of equity-focused community coalitions. We explore issues, challenges, and barriers experienced by evaluators, as well as techniques and approaches that were considered beneficial. Our review identified 11 peer-reviewed articles, from which we identified seven overlapping themes: (1) framing equity in the evaluation process, (2) use of multiple theoretical frameworks, (3) use of systems-focused approaches, (4) strategic use of intersectoral partnerships and collaborations, (5) intentional communication and building trusting relationships, (6) challenges dedicating purposeful time to the work, and (7) issues of cultural and contextual clarity and responsiveness. Our findings point to a significant focus on context, history, learning, communication, relationships, and power. The cultural complexity and historical scope of each context, the diversity of stakeholders, and the enormity of the systemic issues involved shape and challenge the evaluation and research process in fundamental ways, requiring creative and kinetic thinking: a shift from methodological certainty to an acknowledged uncertainty, where mixing, blending, and the innovative use of approaches and theories become a way of moving beyond the colonizing past.

    Steps Toward Evaluation as Decluttering: Learnings from Hawaiian Epistemology

    This paper discusses one of the more contemporary challenges in development and in global health: lots of good ideas from well-meaning insiders and outsiders that end up cluttering both the physical and mental spaces of what can loosely be termed “attempts” at development. Given the place-based nature of Indigenous thought, we turn to Hawaiian epistemology for insights on how one can negotiate interactions to declutter place without confusing identity. We believe that evaluation as a field can help bring greater recognition of the need for models of development and learning that respect the importance of decluttering. Implications for a decolonized approach to evaluation are discussed.

    Decolonizing Evaluation of Indigenous Guidance and Counseling Approaches: A Review of Selected Evaluated Programs

    The concept of Indigenization of research has been increasingly explored in recent studies, with emphasis placed on the ontological, epistemological, and axiological perspectives of Indigenous peoples to find effective solutions to their challenges. This also applies to the evaluation of guidance and counseling approaches in Africa and other nations, where Indigenous therapies are developed based on different philosophical foundations, such as Ubuntu (Africa). Relational ontologies and epistemologies appear to be common across various Indigenous nations in Africa, Australia, Canada, and North America. This article analyzes studies from these regions on evaluations of Indigenous guidance and counseling therapies. The majority of the evaluations use conventional paradigmatic assumptions in their approach, rather than relational models that are participatory and respectful of participants’ worldviews, including the living, non-living, metaphysical, and spiritual aspects of Indigenous people. However, the Indigenous therapeutic programs analyzed in this study incorporate culturally appropriate activities and curricula that align with relational axioms. This article proposes the use of relational models of evaluation to assess Indigenous counseling programs, where researchers can draw conclusions that align with the cultural contexts of the Indigenous people being researched.

    Between Funding Requirements and Community Priorities: Centro Hispano of Dane County’s Transformative Approach to Program Evaluation

    Evaluation approaches that aim to support large-scale social change need to address the neoliberal logic ingrained in the way evaluation has been institutionalized in the US since the early 1900s. Harmful dynamics resulting from evaluation’s institutional history include (1) a focus on accountability and effectiveness, (2) the perpetuation of deficit-based narratives about communities of color, and (3) a top-down approach to program development, in which funders define program goals and assessment criteria and outside academics are hired to provide research services. In consequence, evaluation contributes to the extraction and devaluation of community expertise rather than fostering learning, collaboration, critical reflection, and healing. This article highlights ways of addressing these harmful dynamics through a case study that exemplifies an innovative evaluation approach focused on community strengths and values, healing ethno-racial trauma, and critical consciousness building. We call for funders to rethink their requirements for evaluation and emphasize the need to support evaluation infrastructure, time for critical reflection, and the development of community- and asset-based, culturally responsive evaluation approaches and tools.

    The Integration of the Program Evaluation Standards into an Evaluation Toolkit for a Transformative Model of Care for Mental Health Service Delivery

    Background: Stepped Care 2.0 (SC2.0) is a transformative model of mental health service delivery. This model was created by Stepped Care Solutions (SCS), a not-for-profit consultancy that collaborates with governments, public service organizations, and other institutions that wish to redesign their mental health and addictions systems of care. The SC2.0 model is based on 10 foundational principles and 9 core components that can be flexibly adapted to an organization’s or community’s needs. The model supports groups to reorganize and deliver mental health care in an evidence-informed, person-centric way. SCS partnered with evaluators from the Centre for Health Evaluation and Outcome Sciences (CHÉOS) to create a toolkit that provides evaluation guidance. The toolkit includes a theory of change, guidance on selecting evaluation questions and designs, and an evaluation matrix including suggested process and outcome metrics, all of which can be tailored to each unique implementation of the SC2.0 model. The objective of this resource is to support organizations and communities to conduct high-quality evaluations for the purpose of continuous improvement (a core component of the model of care) and to assess the model’s impact. Purpose: The purpose of this paper is to discuss the integration of the program evaluation standards (PES) into an evaluation toolkit for SC2.0. Setting: In this paper, we describe the toolkit development, focusing on how the PES were embedded in the process and tools. We explore how the integration of the PES into the toolkit supports evaluators in enhancing the quality of their evaluation planning, execution, and meta-evaluation. Intervention: Not applicable. Research Design: Not applicable. Data Collection and Analysis: Not applicable. Findings: The PES were embedded throughout the toolkit’s development process and tools; their integration supports evaluators in enhancing the quality of their evaluation planning, execution, and meta-evaluation. Keywords: program evaluation standards; evaluation; mental health.

    Using Dissemination Research Approaches to Understand the Awareness, Adoption, and Use of The Program Evaluation Standards

    Background: The adoption and use of effective, legally defensible, and ethically sound practices rely on the successful dissemination of evidence-based practices and professional standards. The field of program evaluation has standards, competencies, and principles, yet little is known about how these are utilized by education-focused program evaluators. Purpose: The purpose of this study is to examine the dissemination and use of the program evaluation standards established by the Joint Committee on Standards for Educational Evaluation, relative to the dissemination and use of the American Evaluation Association’s (AEA’s) guiding principles and AEA’s evaluator competencies. Setting: The SIGnetwork, a network of evaluators of State Personnel Development Grants (SPDGs) funded by the U.S. Department of Education, Office for Special Education Programs (OSEP). Intervention: Not applicable. Research Design: Descriptive research. Data Collection and Analysis: Data collection involved administering an online survey to members designated as evaluators in the SIGnetwork directory. Descriptive statistics were used to summarize the data collected via the online survey. Findings: Using the formative audience research approach to understanding dissemination, the results of the study support previous findings that awareness of the standards was inconsistent among a sample of AEA members. Respondents self-reported low to moderate levels of familiarity with The Program Evaluation Standards and the other two guidance documents: Guiding Principles for Evaluators and AEA Evaluator Competencies. Using the audience segmentation research approach to understanding dissemination, the results of this study indicate that participants who were AEA members were more likely than those who were not members of AEA to report being familiar with the standards and to have earned an advanced degree related to their role as an evaluator. Keywords: Joint Committee on Standards for Educational Evaluation, American Evaluation Association, program evaluation standards.

    Competencies for Evaluation as a Civic Science

    This paper relocates the practice of evaluation from its traditional intellectual home in the applied social sciences to an interdisciplinary intellectual community that draws on concepts and practices from civic studies, political science, and studies of coproduction and citizen engagement in public administration and management. It offers an overview of the competencies for evaluation practice once relocated in this way.

    Reorienting Evaluator Competencies: Learnings from Evaluation Practice During the COVID-19 Pandemic

    Background: The COVID-19 pandemic has emphasized the need for evaluators to reorient their skills based on ongoing learnings from evaluations. As a result, evaluators must possess a variety of competencies to meet the challenges created by such unprecedented circumstances. Purpose: Using existing literature about evaluators’ competencies and the experiences of conducting evaluations during the pandemic, this paper proposes a set of competencies that enable an evaluator to collaborate meaningfully with grassroots organizations and co-design evaluations with communities, empowering evaluators to think beyond the boundaries of a project intervention towards achieving a larger goal. Setting: The COVID-19 pandemic and evaluation practice in India. Intervention: Not applicable. Research Design: Case study with an interpretive approach. Data Collection and Analysis: Not applicable. Findings: The paper suggests that acquiring ‘strategic thinking’ and ‘emotional intelligence and resilience’ strengthens evaluators’ competencies to be flexible and innovative in evolving evaluation methods.

    The Relationship Between Employee Motivation and Evaluation Capacity in a Community-Based Education Organization

    Background: Evaluation capacity building (ECB) has gained popularity among organizations due to the increased importance of accountability and organizational effectiveness. While the ECB literature has occasionally addressed the notion of motivation, it has usually been in terms of motivation to do or use evaluation (Clinton, 2014; Taylor-Ritzler et al., 2013); this study sought to ascertain whether general overall employee motivation in an organization is itself related to evaluation capacity. By better understanding this relationship, those who are involved in administering, implementing, evaluating, or researching ECB can be better equipped to understand one of the ‘mediating conditions’ or ‘antecedent conditions’ (Cousins et al., 2014) affecting an organization’s ability to do and use evaluation, and, in turn, can more efficiently and effectively craft their ECB work. Purpose: The purpose of this study was to explore the relationship between: (a) employee motivation and individual evaluation capacity; (b) employee motivation and evaluative thinking, and (c) evaluation capacity and evaluative thinking. Setting: The study focused on the Cooperative Extension System, a non-formal community-based education organization linked to public land-grant universities throughout the United States. Specifically, this study drew participants from two state Extension systems, Virginia and Maryland. Intervention: Not applicable. Research Design: This quantitative study used a descriptive correlational design (Creswell, 2003) to uncover the relationship between the variables: motivation and evaluation capacity, motivation and evaluative thinking, and evaluation capacity and evaluative thinking. 
    Data Collection and Analysis: To investigate the relationship between the factors of interest (motivation, evaluation capacity, and evaluative thinking), three instruments were used: the Multidimensional Work Motivation Scale (MWMS), the Evaluation Capacity Assessment Instrument (ECAI), and the Evaluative Thinking Inventory (ETI). STATA MP 13.1 quantitative software was used to analyze the collected data. Findings: Employees with lower overall motivation in doing their work have lower evaluation capacity, while employees whose motivation is driven by internal factors rather than external means have higher evaluation capacity.

    417 full texts; 470 metadata records. Updated in the last 30 days.
    Journal of MultiDisciplinary Evaluation (JMDE) is based in the United States.