ODI recently published a working paper by Harry Jones, called ‘Taking Responsibility for Complexity’ (Working Paper 330, June 2011). The text is not an easy read, but its interesting content makes up for that.
I created a Concept Map from the text, outlining and connecting its main ideas. It doesn’t replace reading the text, though, and I would certainly recommend giving the paper a thorough read. The map may, however, convince you to do so and help you digest it.
Concept Map of ODI Working Paper "Taking Responsibility for Complexity"
The author makes an apt analysis of some problems that follow from addressing complex situations as if they were simple, dominated by assumptions, log-frames and policy cycles. He hits the mark when he notes that much reporting is quite separate from the daily work, and that plans, reports and tables tend to be used for reporting purposes only.
Complex situations often trigger more in-depth analysis, more elaborate reporting requirements and a tighter watch on indicators. Local staff retreat into a compliance strategy, characterised by risk aversion, instrumental use of tools and a focus on ‘low-hanging fruit’.
Planning,
monitoring and evaluation (PME) is necessarily a tick-box exercise (to fit in
with unrealistic assumptions embedded in the tools) drawing efforts away from
the ‘real work,’ to justify projects ex post and explain how everything went
according to the plan initially set out (whether or not this was in fact the
case). (p.13)
Studies
have shown that, in this context, M&E is often carried out to ‘prove not
improve’: for example, monitoring activities frequently revolve around
reporting on expected indicators as predefined in a log frame, rather than
providing real space to look at the unfolding effects and side-effects of an
intervention (Bakewell and Garbutt, 2004).
Complex situations are often encountered in development practice. In Cambodia, for example, we link high repetition and dropout rates to the quality of teaching: increasing teaching quality through the introduction of student-centred methodologies will reduce dropout rates. The many other factors that affect dropout rates are not forgotten, but are treated as assumptions outside the scope of the programme.
The
more difficult the problem, the greater the perceived need for careful
planning, intricate assessment and consultation and negotiation with partners
and interest groups before anything is done.
Implementation is firmly fixed in advance, with programmes and projects
tied to specific activities and outputs that result from extensive, even
multiyear, negotiations. Efforts during implementation are then restricted to
following a rigid preset schedule and plan of activities. (p.12)
But how can we deal effectively with complexity? Assumptions are often outside the scope of small development organisations such as VVOB, even if we take complexity into account. Complexity might also be used as an excuse to dodge responsibility for not reaching goals. One possible strategy is to free up resources and time for a wider range of activities, such as advocacy, that may affect the assumptions. For example, in Cambodia we have spent considerable time helping the Ministry of Education update the teacher training curriculum. Although not directly within the programme’s objectives, this does affect the assumption that the teacher training curriculum should support student-centred education. Another strategy is to move away from simple, preset indicators that give the illusion of measuring the programme’s outcomes and impact, and to move towards ‘principle-based’ and ‘mission-based’ monitoring and evaluation (p.27).
The text doesn’t put forward a simple recipe for dealing with complexity, but rather a set of principles, concepts and case studies that may be useful in certain contexts and are loosely based on concepts from complexity theory. It does not advocate sweeping away traditional tools and instruments, but recognises that a combination of tools is the most likely way forward.
What
is clear, however, is that complexity can no longer be swept under the carpet.
While there is not yet one comprehensive framework, there is a growing
collection of models, tools, and approaches to effectively develop
interventions in the face of these multifaceted problems. (p.x)
Shaping policy will always be a matter of degrees, and a negotiation between bottom-up and top-down structures, between planned and emergent responses and between technical and participatory guidance. (p.21)
Decentralisation is a central ingredient of strategies for dealing with complex situations.
One
aspect is that decentralising tasks within government will often require
building capacities at lower levels of organisation – in local government
bodies and elsewhere. There may be a ‘chicken and egg problem,’ whereby there
is reluctance to decentralise tasks to lower-level units until they have proved
their capacity to carry them out, even though it is impossible to do this
until decentralisation has actually occurred. One solution is to begin by
decentralising simpler tasks for which lower-level capacity is clearly evident
or for which the costs of failure would not be severe (Marshall, 2008). (p.25)
However, one major criticism of pilots is
that too often they are not allowed to ‘fail,’ and hence they provide less
opportunity for learning. The importance of ensuring that you can learn from an
intervention is emphasised in Snowden’s concept of ‘safe-fail experiments’
(2010): these are small interventions designed to test ideas for dealing with a
problem where it is acceptable for them to fail critically.
Constantly questioning one’s strategy is crucial for achieving long-term impact efficiently. A paper like this provides inspiration to step back from a strategy we may too often take for granted and to adopt a critical stance, realising that the chosen approach may not be the only (or the best) route to success.