This blog was written by Phalasha Nagpal, Consultant; and Dr Divya Nambiar, Principal Consultant and Hub Lead, both with Skills, Livelihoods and Education Systems at Oxford Policy Management.

The education and skills sectors command a significant share of funding from international organisations, national governments, philanthropies, and foundations.

Do such investments result in better outcomes?

Not always. Traditionally, funds are disbursed against programme inputs rather than outcomes achieved. For example, in an education programme, investments target milestones such as increasing student enrolment, improving infrastructure, and training teachers. Do these investments lead to better learning outcomes for young people? If so, to what extent? Such questions remain unanswered. Moreover, the focus on inputs weakens actors' incentives and accountability to prioritise outcomes.

The need to make social finance more results-focused is gaining traction. The Covid-19 pandemic has reinforced this need, disrupting education, skills development, and labour market access on the one hand, and imposing greater fiscal pressures on governments, donors, and funders on the other. As a result, we are confronted with a crucial challenge: to address the needs of vulnerable populations within tighter financial constraints.

Results-based financing (RBF) through multi-stakeholder partnerships has emerged as an innovative financial instrument to overcome this challenge.

What is an RBF?

An RBF refers to any programme or intervention that provides rewards after the credible verification of an achieved result.

The key operating terms in this definition are credible verification and achieved result. The role of the evaluator who independently verifies the results is crucial as these results inform payments.

OPM is the evaluation partner on three key RBF programmes in the education and skills sectors: the Skill Impact Bond in India, the Sierra Leone Education Innovation Challenge (SLEIC), and the Education Programme for Results (EPforR) in Tanzania. Each is unique in its design, type of RBF mechanism, payment structure, and success criteria, reflecting the context in which it is implemented. Consequently, the evaluation methodology varies vastly across the programmes. Some limit the focus to verifying results for payments, while others combine this with a strong learning agenda.

Drawing on our experience as the monitoring, learning and evaluation partner on these RBFs, we offer an overview of our approach to credibly evaluating such programmes. All three programmes focus on improving people's capabilities (e.g., education, skills, systemic reform) in ways geared to drive better life outcomes. In addition, these programmes seek to redefine how success is understood and measured, with the aim of nudging a shift to more outcomes-based approaches.

What are some of the common features and differences across these programmes?

The three programmes are modelled on different types of RBFs based on specific project objectives and stakeholder priorities.

  • The Skill Impact Bond is based on the Development Impact Bond (DIB) model. Targeting 50,000 young people across India, it is currently the largest DIB in the skills sector globally.
  • EPforR leverages the payment-for-results RBF model. Payments are made based on the achievement of disbursement-linked indicators mutually agreed between the government and donors.
  • SLEIC uses the outcome-based funding mechanism to improve literacy and numeracy outcomes for over 134,000 pupils across 325 primary schools in Sierra Leone.

Each programme addresses distinct challenges faced by the education and skills sectors in three different countries. The metrics of measurement also vary across the three programmes, both in the type of indicators (process, activity, output, outcome) and in the level at which these are measured (individual, organisation, institution).

  • The Skill Impact Bond aims to overcome skills and employment gaps for India’s youth. With this focus, the programme links payments to three indicators: the percentage of youth certified, placed in a job, and retained in that job for three months, measured at the trainee and training provider levels.
  • EPforR seeks to bridge quality, equity, and access gaps in the public education system in Tanzania. Payments are made based on pre-agreed improvements at the institutional level, i.e. within the country’s education system, through the verification of specific activities and reforms.
  • SLEIC focuses on addressing literacy and numeracy related gaps in primary school education in Sierra Leone through the assessment of learning gains.
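To make concrete how payment-linked indicators can translate into disbursements, the sketch below computes hypothetical payments from verified outcome counts. It is loosely modelled on the certification/placement/retention structure described above, but every name, count, and price is invented for illustration and does not reflect the actual terms of any of the three programmes.

```python
from dataclasses import dataclass


@dataclass
class OutcomeIndicator:
    """One payment-linked indicator with independently verified results."""
    name: str
    achieved: int            # count of results verified by the evaluator
    cohort: int              # total cohort size
    price_per_result: float  # hypothetical payment per verified result

    def payment(self) -> float:
        # Payment is triggered only by verified results, not by inputs.
        return self.achieved * self.price_per_result


# All figures below are invented for illustration only.
indicators = [
    OutcomeIndicator("certified", achieved=4200, cohort=5000, price_per_result=10.0),
    OutcomeIndicator("placed", achieved=3100, cohort=5000, price_per_result=25.0),
    OutcomeIndicator("retained_3_months", achieved=2300, cohort=5000, price_per_result=40.0),
]

total = sum(i.payment() for i in indicators)
for i in indicators:
    print(f"{i.name}: {i.achieved}/{i.cohort} verified -> payment {i.payment():.2f}")
print(f"total disbursement: {total:.2f}")
```

The point of the sketch is the incentive structure: the funder's outlay depends entirely on counts an independent evaluator has verified, which is what distinguishes RBF from input-based disbursement.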

As the predefined results of each programme differ, so do the methods for verifying them. OPM uses a range of evaluation and learning methodologies and frameworks specifically tailored to assessing RBFs, ranging from randomised controlled trials (RCTs) and ethnographic approaches to sample surveys and secondary documentary checks. In addition to methodological rigour, we consider what works best in a given local context to design each evaluation.

OPM’s approach to evaluating RBFs:

[Image: pie chart showing the three elements of the OPM Evaluator + role: we consider context as a starting point; we collaborate, consult, and cooperate; we learn, innovate, and implement.]

In all three programmes, OPM verifies the results which trigger the payments. Yet our role goes well beyond the verification exercise. We call this the Evaluator + role. Three principles underscore OPM’s unique approach to RBF evaluations.

We consider context as a starting point. We recognise that local politics, norms, institutions, and policies influence programme implementation. We leverage OPM’s local presence and expertise in each of these countries to ensure that evaluations are endogenously informed and relevant to local contexts. For example, in the Skill Impact Bond evaluation, certification of candidates after skills training is a key payment-linked indicator. Typically, one would treat receipt of a physical certificate as evidence of being certified. However, because physical certificates in India are issued with a lag of 3-6 months, we instead use evidence of candidates successfully completing a skills assessment after training to compute certification outcomes.
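This context-sensitive indicator logic can be sketched in a few lines. The record fields below are hypothetical and do not reflect the programme's actual data model; the sketch only shows the substitution of verified assessment completion for certificate receipt.

```python
from datetime import date
from typing import Optional


def is_certified(assessment_passed: bool,
                 assessment_date: Optional[date],
                 certificate_issued: bool) -> bool:
    """Count a trainee as certified once they have verifiably passed the
    post-training skills assessment, even if the physical certificate
    (which may lag by 3-6 months) has not yet been issued."""
    return certificate_issued or (assessment_passed and assessment_date is not None)


# Assessment passed, certificate still pending: counted as certified.
print(is_certified(True, date(2023, 6, 1), False))  # prints True
```

A certificate-based rule would have undercounted results for months; the assessment-based rule keeps the indicator timely without weakening verification, since the assessment record is itself independently checkable.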

We collaborate, consult, and cooperate. We do this consistently across the programme cycle to answer the following questions.

  • What do we measure? To ensure complete clarity on the agreed-upon metrics we must measure. RBFs signify a shift away from inputs towards measuring outputs/outcomes, many of which have not traditionally been assessed.
  • What are the stakeholder priorities? To gauge the priorities and sensitivities of multiple stakeholders (including national and state governments). We factor these into how we communicate the results with credibility.
  • How do we design the evaluation? To tailor evaluation methods, techniques, and processes to programme objectives and stakeholder priorities, with statistical rigour and reliability as the central tenet.
  • How should we communicate results? To communicate results in ways that are clear and understandable to all stakeholders and help inform timely payments.
  • How should we facilitate learning? To gather stakeholders’ perspectives on how we can best offer additional insights for strengthening programme implementation, fostering innovation, and closing the feedback loop.

We learn, innovate, and implement. As evaluators, our role goes beyond verifying results. We strive to facilitate learning that helps drive innovation in every programme cycle. At the same time, we recognise that change management is complex and disruptive. Driving change requires deep, sustained stakeholder engagement that goes beyond communicating results and learnings. In this sense, we are the evaluator, learning partner, and change agent, working with stakeholders to drive systemic change. Across all RBF programmes, we help unpack the ‘whys’ and ‘hows’ of programme performance in driving better outcomes, with a strong focus on equity and inclusion.

Explore what goes on behind the scenes in designing, implementing, and evaluating RBFs with key programme stakeholders and the OPM team at the session at the UKFIET Conference in Oxford on Wednesday 13 September, 11:00am-12:30pm: Results-based financing in the education and skills development sectors: Perspectives from Academia, Policy and Practice Symposium.