Building Evaluation Muscle: Practical Steps for Health Nonprofits at Any Stage
This paper bridges the academic literature and everyday practice to show how nonprofit organizations, regardless of where they are on the spectrum of evaluation capacity, and regardless of whether they conduct evaluation internally or use external consultants, can strengthen their ability to engage in and sustain an ongoing evaluation practice. These suggestions are not exhaustive, but they are meant to be practical, accessible, and realistically doable for most nonprofits.
Putting theory-oriented evaluation into practice
Evaluations of gaming simulations and business games as teaching devices are typically end-state driven. This emphasis fails to detect how the simulation being evaluated does or does not bring about its desired consequences. This paper advances a logic model approach, which takes a holistic perspective that aims to include all elements associated with the situation created by a game. The logic model approach is illustrated as applied to Simgame, a board game created for secondary-school business education in six European Union countries.
Rigour (-mortis) in evaluation
Evaluation-in-practice can be regarded as a confluence of interactions between three broad idealised sets of stakeholders – the evaluand, evaluators, and commissioners of evaluations. Elsewhere I have suggested two contrasting manifestations in which these interactions might be expressed; one as an ‘evaluation-industrial complex’ (similar in form to the ‘military-industrial complex’ originally used by Dwight Eisenhower in 1961), and another as a more benign ‘evaluation-adaptive complex’ (Reynolds, 2015).
Building on the idea of an iron triangle that empowers the military-industrial complex, I represented the relationships of evaluation-in-practice as a triadic interplay involving six activities that influence the evaluation process. Here I focus on only one of the six activities, commissioning, and I summarise what it might look like in an evaluation-adaptive complex.
Framing purposeful evaluation through critical systems thinking
Two traditions of practice – evaluation and systems – share three significant concerns regarding development intervention: (i) making sense of complex interrelationships and the continual change brought about by such relationships; (ii) engaging with multiple (both exogenous and endogenous), often conflicting, perspectives on situations; and (iii) challenging vicious cycles of practice and understanding by cultivating a more radical learning culture. These challenges might be described successively in terms of cultivating a shift from (i) summative to formative evaluation, (ii) positional bargaining to interest-based negotiation, and (iii) purposive to purposeful action. Some ideas from traditions of social learning and critical systems thinking are presented to support a re-framing of intervention and evaluation from one serving the 'project state' towards one serving more radical transformative practice.
Global Innovations in Measurement and Evaluation
We researched the latest developments in theory and practice in measurement and evaluation, and we found that new thinking, techniques, and technology are influencing and improving practice. This report highlights eight developments that we think have the greatest potential to improve evaluation and programme design, and the careful collection and use of data. In it, we seek to inform and inspire: to celebrate what is possible, and to encourage wider application of these ideas.
Using change management theory to implement the international classification of functioning, disability and health (ICF) in clinical practice
The success of the International Classification of Functioning, Disability and Health (ICF) depends on its implementation in practice. This article describes an evaluation of the introduction of the ICF framework into an occupational therapy service. Reflections from the working party responsible for its introduction were related to change management theory. The experiences throughout the implementation project could be mapped to an eight-stage process of creating major change (Kotter 1996). The working party concluded that the explicit use of, and closer adherence to, change management theory could enhance the uptake of the ICF in clinical practice. Further exploratory research is required to support these reflections.
Evaluation Capacity Building: Funder Initiatives to Strengthen Grantee Evaluation Capacity and Practice
Innovation Network has a wealth of experience as an evaluation capacity builder. One way we often provide evaluation capacity building (ECB) services is to a group of grantees; over the years we have provided funder-sponsored ECB services to a number of grantee cohorts. The following three case studies share our experiences with funder initiatives to strengthen grantee evaluation capacity and practice. Each case study includes a description of the overall grantmaking initiative, followed by a discussion of the ECB services requested by grantees and/or provided by Innovation Network. The paper concludes with a reflection on lessons learned and recommendations for funders considering ECB for their grantees.
Learning lessons from evaluating eGovernment: Reflective case experiences that support transformational government
Central Government's strategy of e-inclusion is being manifested in the form of eGovernment. Given that it is the public purse that funds such investments, increasing attention is being paid to their evaluation, so that value for money and organisational learning can be realised. In this paper the authors report the findings from three interpretive, in-depth organisational case studies that explore eGovernment evaluation within a UK public sector setting. The paper elicits insights into organisational and managerial aspects with the purpose of improving knowledge and understanding of eGovernment evaluation. The findings extrapolated from the case study analysis are presented as lessons that gravitate around social factors, evaluation, adoption, ownership, prioritisation, sponsorship, and responsibility. These lessons are drawn from the empirical enquiry to improve eGovernment evaluation practice. The paper concludes that eGovernment evaluation is an underdeveloped area, with most work being developmental in nature, and as a result calls for decision makers to engage with the eGovernment agenda and commission eGovernment evaluation exercises to improve evaluation practice, such that transformational government can realise its full potential. The paper ends by highlighting political, economic, technical, and social issues as the drivers of the evaluation cycle.
Utility-Based Evaluation of Adaptive Systems
The variety of user-adaptive hypermedia systems available calls for methods of comparison, and layered evaluation techniques appear to be useful for this purpose. In this paper we present a utility-based evaluation approach built on these techniques, and we address the issues that arise when putting utility-based evaluation into practice. We also explain the need for interpretative user models and for common sets of evaluation criteria across different domains.
