Are we prepared for timely evaluations? Reflections on the Centre for Evaluation's symposium on "Timely Evaluation for Programme Improvement"

By: Nisso Nurova

LSHTM MSc Public Health student

Evaluations of interventions and programmes are often retrospective and so serve mainly to lay the groundwork for future interventions. This poses a challenge, because programme improvement calls for faster, more adaptive mechanisms, and with that 'timely evaluations'. The symposium tackled this topical issue with presentations from speakers of diverse backgrounds, including Val Curtis and Elizabeth Allen from LSHTM, Martin Dale from PSI, Claire Hutchings from Oxfam, and Jean Boulton and James Copestake from the University of Bath.

During the day, the audience broke into groups to discuss questions about timely evaluations. My group's thought-provoking question was: 'Have we got a research system that is fit for our desire for timely evaluations?' While discussing the systems we use, we quickly realised that we need to do much better. We need greater adaptability, flexibility and sensitivity in our interventions to reflect the fast pace at which contexts and needs change. But how can we achieve this when there are so many players in the game, most of whom have their own incentives and intentions? We also noted that although challenges lie ahead, process and evaluation silos are a recognised problem, and complex intervention methodologies are moving in the right direction.

One presentation that spoke to many of the ideas from our group discussions was delivered by Jean Boulton, who discussed complexity science approaches to evaluation. Jean showed how complexity science emphasises that all things are interdependent, shaped by history, context-specific and episodic; interventions and evaluations therefore have to attend to local and contextual detail. Using Syria as an example, Jean described how we overestimated the country's stability because we did not appreciate the multiplicity of its state. Had we considered dynamics such as changes in the economy, society and environment together, we might have better anticipated the conflict. But there are limits to knowledge, and some tipping points cannot be foretold. With this lesson in mind, a relevant question was raised from the audience: is it complex or just complicated?

The audience and speakers agreed that we have to be sensitive to indicators of health that may lie outside the box in which we first designed our interventions, and that we must therefore take in multiple perspectives, including subjective views. As we discussed moving beyond randomised controlled trials to embrace the nature of complexity (and subjectivity), the room buzzed with ideas. But are we prepared to accept 'weak' evidence when making decisions? And more importantly, will donors make this leap with us? We left wondering what role donors should play in framing evaluations that are more flexible to the changing nature and needs of interventions and programmes. We know that we need more timely evaluations, but with the increasing complexity of interventions and the pressure for speed, is science enough?

This blog reports on the Centre for Evaluation's symposium, "Timely Evaluation for Programme Improvement", held at the Wellcome Trust on 23rd November 2017.
