This is the second part of my Developmental Evaluation (DE) training workshop report. You can read part 1 here.
The Developmental Evaluation approach to scaling
Don’t just do a report-back and ask questions; by then it’s too late
– Kate McKegg
One characteristic of DE’s role in supporting development is that it seeks to adapt effective principles to new contexts – there’s no intention to create a fixed/standardised model: “in complex interventions, traditional concepts about scaling don’t hold”.
DE is a tricky paradigm shift for everybody. You need to be prepared to have tough and courageous conversations. It’s not for the faint-hearted.
“The very techniques that enable evaluation excellence in more static situations – standardisation of inputs, consistency of treatment, uniformity of outcomes and clarity of causal linkages – are unhelpful, even harmful, to situations where there is a lot of uncertainty and moving goalposts” – Jamie Gamble, A Developmental Evaluation Primer
DE aims to be a middle-way synthesis between top-down and bottom-up change.
What tools does DE make use of?
Of course, a wide range of tools can be used in DE, depending on the needs of the situation. One tool may well be familiar to Q members who are already using Liberating Structures (see more about how they have been using them).
“I’m starting to use more Liberating Structures. ‘What, So What, Now What?’ is a favourite – it’s the DE cycle,” said Kate.
Right at the outset of the workshop Kate used a fabulous Māori-derived exercise for deepening our introductions (taken from Paraire Huata’s Te Ngaru Learning Systems). We looked at the attitudes towards risk-taking held by our grandparents and parents, and those we see, or would like to see, in our children and grandchildren. This exercise enables you to go deep very quickly, and is helpful before high-stakes work.
In the workshop we used Rich Pictures from Peter Checkland’s Soft Systems Methodology to map out the relationships, structures, processes and so on involved in UK homelessness. They’re a great tool for very quickly unpacking a situation and situating where people are in the system. “People are usually in their roles, but when they draw a picture it embodies a whole lot more, more than they’d tell you. It taps into another part of the brain,” Kate said.
DE tends to rely heavily on visuals, diagrams and stories to make sense of the unfolding innovation. Other common tools are Journey Mapping, Appreciative Inquiry, Social Network Analysis, Simulations, pattern spotting and digital storytelling – all helpful when there is less data to draw on.
There’s also a shift away from the Logic Model/Theory of Change as a static instrument: instead, it is repeatedly revised and even rebuilt from scratch.
“People want to hang on to the Theory of Change; it becomes the thing. It’s not the boxes!” says Kate.
Lengthy reports are a lot less common in DE, as they’re not always appropriate: “The big fat report is not as common in DE as it’s not very useful.” Alternatives include a slidedoc or infographic, or even a big data spreadsheet.
DE can unleash a community’s power and determination to succeed.
I asked whether DE ever uses Organisational Constellations, which is a remarkable method for revealing hidden dynamics in systems and situations by using the problem-holder’s own intuition to move elements representing the system around a space. These elements are often represented by people.
Kate said not that she knew of. Interestingly, I know of one Q founder member who recently trained in this approach – and I have talked with her about sharing it with the community.
Collective sense-making: a critical skill for complexity and DE
Though I love jargon when it’s actually helpful and descriptive, I’ve always struggled with the term sense-making. But something began to click for me as Kate explained how important it is for the developmental evaluator to do the sense-making of the data with the project team and wider stakeholders – that is, to involve the team in the debrief of emerging data, not only at a later stage.
“Don’t just do a report-back and ask questions; by then it’s too late,” Kate explained.
We must interpret the situation in and through interaction with each other: “There should be multiple perspectives in the design and the making sense of data.”
“’It was quite interesting but it wasn’t very helpful’, people often say. ‘All the sense-making had already been done…’”
Most DE is not DE
The great story of what DE can do – and is doing – is slightly sullied by the fact that it’s often DE in name only: “Rarely do I actually see something that is actually DE. There are lots of traditional evaluations with bits of DE,” said Kate.
“Most DE that I see is DE-lite, or DE-like,” she concluded. She also noted that it’s still not all that often that developmental evaluators are brought in early at the design stage.
Some illuminating points emerged from other participants too, unsurprisingly. A Tavistock Institute researcher offered an insightful warning about seemingly positive systems thinking: “Systems methods seek definitive answers but don’t in fact see their own biases and values. They shoehorn in a level of certainty, and bypass soft skills. We are in a paradigm shift to complexity evaluation, but not yet towards awareness of subjectivity and bias.”
Sounds like a call for a next generation of DE tools? Let’s get this generation of DE in place first!
I hope you’re as inspired as I am by this emerging approach to evaluation – and I’m not even an evaluator.