So here’s another superficial hypocrisy. A neat model to harness all this infinite complexity I like talking about. That said, if there were ever a model that I thought had value, it is this one. I like the models I’ve shared, hence why I’ve shared them. It’s a lot easier to do so safe in the knowledge that they aren’t ‘mine’ – they didn’t come into existence in some vacuum inside my head. This one is no different and leans heavily on work done by Dan McClure and Thoughtworks. I’ve translated it into more resonant language, tweaked it, assessed it in a different light and added some applications and detail, but the inspiration, the root of it all, definitely comes from them.
A part of me would like to call it the AKALA model – in honour of one of my heroes – or AKILI, cos Swahili words of aspiration have long been all the rage and, well, why not disappoint? So unnamed it shall remain. Let’s see if a lack of a bodged acronym will lead to its inevitable downfall.
If you, like me, have a brain that can’t hold huge amounts of information, this model might just help you package it in a way that becomes more useful. Effectively that’s all any model does, but let’s hope it does so here too. This one is the basis of the last two M&E strategies I’ve developed, and I think the opportunity for its use reaches much further. I know – it’s not usual to have this level of opinion nestling with positivity. Full of surprises, me.
All of the various interpretations of M&E, MEL, MEAL, PMEAL, DMEAL, CRM, CEA, Accountability, Programme Quality, Evidence, Learning, Impact, Management Information, Performance etc. etc. etc. – they should all subscribe to this. And no, that’s not one of those opinion things. It’s fact. There are (guess how many) 3 objectives within that medley of aspirationally analytical specialisms listed above:
- Accountability – of the programme to various stakeholders (including donors, beneficiaries, non-beneficiary community members and partners) to ensure that we are transparent, that they are informed and, more importantly, involved
- Adaptation – of programme (including M&E) activities to ensure that they achieve and maintain relevance, are responsive, and – ultimately – impactful
- Knowledge – of programme context, performance, progress and change to provide learning for existing and future programmes, be that in your own NGO or another
What fuels them? Well, insight. Yep, a novelish term to add to the long list of others that long ago lost their specificity and utility. Insight is the central tenet that drives the other three. You could easily call it Learning should you so wish, but either way it involves the proactive pursuit of insightfulness. Perhaps we could call it both – gathering and garnering learning/insight that fuels accountability by proactively seeking participation and listening to feedback. That fuels adaptation by informing programmatic decisions. That fuels knowledge by collecting information in a systematic way that can thus be analysed and shared.
That’s M&E&A&L&P&D. That’s all it is. If you can create ways in which you generate insight/learning that fuels those three elements, then yours is the specialism and everything that’s in it. The focus, for me, is not on the outputs, but the fuel, the fire in the middle. How do you feed this furnace? It’s pretty convenient, to be honest. The same principles can be used to categorise incoming information – the insight itself:
- Accountability: beneficiary / community / stakeholder feedback – what are our key stakeholders (those we should be accountable to) telling us about our programme?
- Adaptation: programme teams’ experiences / colleagues’ feedback / specialists’ insight / programme data – what are our teams and data telling us? What has been their experience? Their opinion of influences and causality on the programme?
- Knowledge: external sources / reports / evaluations / alerts / coordination groups – what other sources of information can we look at and into that can help improve our programme?
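If you happen to manage incoming insight in a spreadsheet export or a script, the categorisation above can be sketched as a simple lookup. This is a minimal, illustrative sketch – the source labels and function name are my own invention, not anything from the model itself:

```python
# Route an incoming piece of insight to the pillar it fuels, based on its
# source. The source labels below are hypothetical, chosen to mirror the
# three categories described in the text.
PILLAR_BY_SOURCE = {
    # Accountability: what our key stakeholders are telling us
    "beneficiary_feedback": "Accountability",
    "community_feedback": "Accountability",
    "stakeholder_feedback": "Accountability",
    # Adaptation: what our teams and our data are telling us
    "programme_team_experience": "Adaptation",
    "colleague_feedback": "Adaptation",
    "specialist_insight": "Adaptation",
    "programme_data": "Adaptation",
    # Knowledge: what other sources can tell us
    "external_report": "Knowledge",
    "evaluation": "Knowledge",
    "alert": "Knowledge",
    "coordination_group": "Knowledge",
}

def classify_insight(source: str) -> str:
    """Return the pillar a piece of insight fuels, or flag it for triage."""
    return PILLAR_BY_SOURCE.get(source, "Untriaged")

print(classify_insight("programme_data"))  # Adaptation
print(classify_insight("evaluation"))      # Knowledge
```

The point is less the code than the discipline: every piece of incoming information should have an answer to "which pillar does this fuel?" – and anything that lands in the "Untriaged" bucket is worth a second look.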
As a framework, it’s genuinely not a terrible way to start. Ensuring that at quarterly review meetings, for example, you have those types of data, those insights, available. That they rely on hard data as well as opinion and experience. That you have quant and qual together, ideally each filling the gaps the other leaves. And if you want to get super fancy, cross-reference those against Context, Performance, Progress and Change to create an analytical framework of sorts:
[Figure: Information / Insight – the three insight types cross-referenced against Context, Performance, Progress and Change]
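For anyone who likes to see the shape of the thing, the cross-referencing amounts to a 3×4 grid: the three pillars against the four lenses. A minimal sketch, assuming you were to keep it in code (the example entry is entirely made up for illustration):

```python
from itertools import product

PILLARS = ["Accountability", "Adaptation", "Knowledge"]
LENSES = ["Context", "Performance", "Progress", "Change"]

# An empty analytical grid: one cell per (pillar, lens) pair, to be
# filled with the insights brought to, say, a quarterly review meeting.
grid = {(pillar, lens): [] for pillar, lens in product(PILLARS, LENSES)}

# A hypothetical entry, purely illustrative:
grid[("Adaptation", "Performance")].append(
    "Programme data: output targets slipping against plan this quarter"
)

print(len(grid))  # 12 cells: 3 pillars x 4 lenses
```

Whether it lives in code, a spreadsheet, or a flipchart, the value is the same: twelve cells that tell you at a glance where your insight is thin.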
And if you’re at that level of complexity, why not add another layer and frame the outcomes, actions and learnings around such things as well.
So now you have a framework for what information to bring to this table. Plus a framework for how to shape the actions. For once, I’ll even give you a specific example of how I’ve used this, as this is a special case. You can see how various lenses are applied and populated in alignment. Because that’s the kind of strategy I like. An accordion. On one level, simple, resonant, guiding. On another, layered, deep, detailed. It means that you can describe what you’re doing in a sentence or a book. For me, I aim to build M&E&A&L&etc. teams, functions and systems that develop the insight necessary to fuel accountability, adaptation and knowledge. It feels right to me.
| | Accountability | Adaptation | Knowledge |
|---|---|---|---|
| **Behaviours** | Quality -> Listening | Responsiveness -> Acting | Insight -> Critiquing |
| **Outcomes** | Enhanced community engagement and accountability practices that enable more informed programming | Enhanced evidence-based adaptation and review that fuel programme reflection and responsiveness | Deepened and formalised insights into focus quality areas, sectors and sub-sectors that lead to better quality programming |
| **What we do** | Seek to generate feedback loops<br>Feedback to communities / closing the loop<br>Develop more qualitative insights<br>Work with CRM to bolster feedback as well as complaints | Use data to inform decision making<br>Inform quarterly review meetings<br>Provide prompt data and insights to act upon | Develop more analytical insights in reports<br>Develop longitudinal comparisons<br>Make knowledge accessible through style/format and proactive sharing |
| **How we do it** | Curiosity, listening, voice<br>Providing opportunities for opinions to be voiced<br>Ensuring that we respond with clear communication | Pace, responsiveness, flexibility<br>Delivering rapid and usable data and insights | Reflective, thoughtful, collegiate<br>Providing angles, well-caveated and evidenced insights<br>Thought-provoking and action-oriented (the ‘so what?’) |
| **Skills / People** | Qualitative data collection and analysis | Identifying what is interesting / useful | Deeper analysis |
| **Processes / Tools** | Quote (& consent) collection | Quarterly M&E reporting<br>Priority planning / work plan (as a team) | Reflection / analysis tools (prompts of angles / checks) |