Overview

Achieving impact through research for development (R4D) programmes requires engagement with diverse stakeholders across the research, development and policy divides. Understanding how such programmes support the emergence of outcomes, therefore, requires a focus on the relational aspects of engagement and collaboration. Increasingly, evaluations of large research collaborations employ social network analysis (SNA) for this purpose. In this paper, we draw on three applications of SNA within similar large research collaborations, carried out through our evaluation work with three Interdisciplinary Hubs of the Global Challenges Research Fund, to explore its potential as an evaluation method.

Our comparative analysis shows that SNA can uncover the structural dimensions of interactions and enable learning about how networks evolve over time. We reflect on challenges common to the three cases, including navigating the bias that results from incomplete network data, handling multiple possible interpretations of results, and the difficulties of making causal inferences and the ethical dilemmas these raise. We conclude with lessons on the use of SNA within monitoring, evaluation and learning (MEL) systems that aim to support both learning and accountability.
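To make the kind of structural measures we refer to concrete, the sketch below is a minimal, hypothetical illustration rather than the procedure used in the three Hub evaluations: it compares whole-network density and actor-level betweenness centrality across two invented survey waves of a collaboration network, using Python's networkx library.

```python
# Minimal, hypothetical sketch of the kind of structural measures SNA provides;
# the edge lists are invented and do not reflect the Hub evaluations.
import networkx as nx

# Who reported collaborating with whom at each survey wave (hypothetical data).
wave_1 = [("A", "B"), ("A", "C"), ("B", "C"), ("D", "E")]
wave_2 = [("A", "B"), ("A", "C"), ("B", "C"), ("C", "D"), ("D", "E"), ("E", "F")]

def describe(edges, label):
    """Print simple whole-network and actor-level measures for one wave."""
    g = nx.Graph(edges)
    print(f"{label}: {g.number_of_nodes()} actors, {g.number_of_edges()} ties, "
          f"density {nx.density(g):.2f}")
    # Betweenness centrality flags potential brokers between otherwise separate groups.
    brokers = sorted(nx.betweenness_centrality(g).items(),
                     key=lambda kv: kv[1], reverse=True)[:3]
    print(f"  highest betweenness: {brokers}")

describe(wave_1, "Wave 1")
describe(wave_2, "Wave 2")
```

Comparing successive waves in this way is what we mean by learning about how a network evolves; what such measures mean for a programme depends on its theory of collaboration, as discussed below.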

Key findings/recommendations

SNA can be useful as a monitoring tool when it is used to reflect on underlying assumptions about collaboration and the network structures they are expected to produce. From our analysis we draw three lessons for the future use of SNA in the evaluation of large research collaborations:

  • The more explicit the assumptions about collaboration are at the outset, the more useful the empirical results are for programme learning. A contextualized theory of collaboration, developed at the outset, should guide the SNA study. This is in line with Davies’ (2009) call for a theory-based and deductive approach to SNA in evaluation.
  • Combining SNA with other methods can enhance interpretation and reveal how structural assumptions shape that interpretation. This can strengthen inference about relational causal mechanisms, making SNA a necessary, but not sufficient, method for evaluating large research collaborations.
  • Navigating the challenges of interpretation and the ethical dilemmas requires careful consideration, as well as an enabling institutional and political environment, if SNA is to support learning. Embedding the interpretation of SNA findings within participatory learning moments would strengthen their use in learning-oriented evaluation design.