Evaluation is dead. Viva evaluation!
In the good old days of a stable, predictable world (no, us neither!), evaluation was the thing you did towards the end of a project or a business plan to measure how it went against your starting assumptions, or those you were given by a funder. Sometimes that was useful for the next project, so long as the next one was almost an identikit, and so long as the context hadn’t changed. Often it was trapped in a cycle of: make a three-year plan; evaluate it at the end; make another three-year plan. Very often, evaluation was captured by the need to say how well things had been delivered so that you could apply for more money next year (the perception being that nobody wants to fund an organisation where evaluation tells the story of failure, right? No matter how good the failure). There’s a PDF graveyard somewhere with all these evaluations in it.
What, then, does the death of evaluation have to say about what replaces it? The answer turns out to be evaluation, but different…
Developmental evaluation assumes that we live in a complex and uncertain world - one where it is often difficult to disentangle cause and effect, and where people disagree on what should be done. In other words, the world is not mechanical. In this kind of world, planning everything up front and then implementing it does not work. We end up with unintended consequences, unsustainable changes and under-performing teams doing things that no longer make sense. Instead, we need to be humble about what we understand, plan for change, experiment, learn and adapt. We call this a design and learning loop and expect to go round it many times over the course of an assignment.
Michael Quinn Patton, who wrote the book on developmental evaluation, suggests five complex situations where developmental evaluation can be helpful:
Ongoing development in adapting an initiative
Adapting general principles to a new context
Developing a rapid response in the face of a sudden major change or crisis
Pre-formative development of a potentially scalable initiative
Major systems change
Over the last few years, we’ve carried out developmental evaluations in all of these situations. We call it learning and design for systems change. Potayto, potahto? It typically includes:
Developing a shared understanding of what ‘good’ might look like and what we need to learn. We created the Transformational Index to help teams describe the change they want to see in the world, infused with the values that motivate them and the way they do what they do. By naming this up front, we have a framework to guide the developmental evaluation.
Building trusting relationships with partners and playing the role of critical friend. Developmental evaluators tend to build closer relationships with teams, eschewing the pretence that they are impartial observers who have no values or mental models of their own. We want the project to work out and make a difference! We’ll have coffee with you and get to know you as a person because we know that it matters. We’re more likely to tell each other what’s really going on, and we’re more likely to hear each other when we offer feedback.
Creating mechanisms to get rapid feedback and test assumptions. The evaluation has to be based on something beyond conversations over coffee. We’ll figure out ways to gather data, listen to people and measure changes so that we can understand whether an intervention is working. We try to answer questions that are baked into our learning framework to understand whether the assumptions the project is based on hold up. We have a strong preference for quicker feedback loops so that we can learn while it’s still possible to make changes. Good enough and useful tends to win over perfect and too late.
Supporting programme design. We don’t stop at the learning part of the learning-and-design loop. We love to put on our design thinking hats and help teams consider the options for what to do next - both by facilitating workshops and by generating suggestions.
For an example of a developmental evaluation over two and a half years, you’re welcome to take a look at our report on the SAFE Communities project.
Long live evaluation!