Taking a complexity lens to understanding evaluation failure

In a new blog post on the CECAN website, Dione Hills suggests that a knowledge of complexity and complex adaptive systems can help in understanding why some evaluations run into serious difficulties.

The blog post takes as its starting point a recently published book on ‘Evaluation Failure’, in which experienced evaluators describe 22 evaluations that went badly wrong and the learning they took away from these. The choice of evaluation design, in itself, was rarely seen to be the primary cause of difficulties. In most cases, there were dynamics at play in the policies or programmes under evaluation which got in the way of, and sometimes totally prevented, an effective evaluation being undertaken.

The evaluators often blamed themselves – and their lack of experience – for these evaluations being unsuccessful, describing their failure to spot and address ‘red flags’ at an early stage or, in some cases, to ‘call time’ on an evaluation that was clearly going nowhere. But an alternative view is to see these dynamics as having provided insight into – and data about – the policies and programmes themselves, particularly if interpreted through a ‘complexity lens’. Drawing on insights from complexity science, several of the dynamics described in the book can be understood as key characteristics of complex dynamic systems: such systems are inherently unpredictable, very vulnerable to changes in their wider context, and often riven with tension and disagreement between the different stakeholder groups involved, with individuals or groups in key ‘gatekeeping’ roles sometimes refusing the evaluator access to data or rejecting well-evidenced findings.

Understanding all of this in complexity terms does not, of course, guarantee that the difficulties can be overcome. However, taking this view of evaluation failure does highlight the importance of evaluators having ‘soft skills’ – being able to ‘read’ organisational dynamics, manage conflict and communicate clearly – as well as ‘hard’ technical skills in evaluation design and research methods. Opportunities to learn these skills are sadly lacking in the evaluation field, and it is hoped that the Tavistock Institute can address this gap soon by developing new courses for evaluators drawing on Tavistock-based understanding of group and organisational behaviour.

This topic will also be explored in Dione’s keynote speech at the Norwegian Evaluation Conference next month (19–20 September).

For more information on this and related topics, please contact Dione Hills at d.hills@tavinstitute.org
