How can you find out if a program is likely to work before you waste money trying it?

If you’ve been working in international development for a while (or even a few weeks) you’re probably used to receiving a string of urgent emails from your boss on Monday morning asking you to write a concept note for the latest donor call by 5 pm Friday. They want you to propose some type of “evidence-based” solution for solving X. Never mind that your organisation has no experience in X, does not have a mission statement related to X, and you have no idea what types of programs on X actually work.

This situation can leave you in a difficult predicament – if you write the proposal and it gets funded, the organisation could end up wasting thousands (or millions) of dollars on a badly designed project that doesn’t work. If you don’t write the proposal… well, that’s not really an option because the NGO Director wants it on their desk by Friday.

So how can you find out what works (and what doesn’t) quickly? One of the best ways is to look at the results of previous programs run on similar issues. That way you can focus your efforts on interventions that have already been shown to work. Even if you’re playing the innovation card and making up a completely new pilot project (while at the same time coming down with a serious case of Innovation Tourette’s), you still need to know what has and hasn’t worked previously so you can suggest appropriate “innovations”.

Here are some of my favourite websites for quickly finding out the results of previous programs:

International Initiative for Impact Evaluation (3ie)

At the highest point on the pyramid of evidence, even higher than the revered Randomized Controlled Trial (RCT), is the systematic review. A systematic review takes all the available evaluations on a particular program or intervention (such as bed net distribution, or teacher training), and combines their results to determine whether that particular intervention is effective, ineffective, or there isn’t enough evidence to decide either way. Systematic reviews are like gold when you’re designing a program. Not only do they save you having to read all the individual program evaluations, they are also more likely to have accurate results because they cover many similar programs. The best ones even tell you what features the most successful programs had, so you can include them in your program design.

The International Initiative for Impact Evaluation, also called “3ie”, has a large collection of systematic reviews on international development programs, some of which they funded themselves and others from external sources. This includes everything from systematic reviews of mass media interventions for HIV/AIDS, to child labour prevention and family planning counselling. In addition to systematic reviews, 3ie also has a collection of more than 700 high quality impact evaluations on individual programs. All the impact evaluations include a control group, so you can see what the outcomes of the program were compared to a group of people who did not receive it. They’ve also published a helpful series of policy briefs that explain what the findings of all those systematic reviews and impact evaluations actually mean for you in practice.


GiveWell

GiveWell describe themselves as an “independent, nonprofit charity evaluator”. They spend thousands of hours reviewing NGO programs to identify the most effective ones, which they then recommend to private donors. Although they only recommend a tiny fraction of the programs they review, their collection of reviews and research is completely transparent, very impressive and extremely useful. Check out some of their reviews on the effectiveness of clean water programs, support for small and medium size enterprises, and condom distribution. Their reference lists are also a great source of links to other systematic reviews and summary reports.

USAID Development Experience Clearinghouse (DEC)

USAID have set up an ambitious central database to store details of all their programs for the last 50 years, including more than 155,000 documents. The database has a huge range of features, including advanced search and geographic search. It also allows you to tag, vote and comment on documents, and even download copies for your e-reader device. If you’re looking for the results of particular types of programs you can limit your search to just program evaluations, although there are also many other interesting documents including handbooks, technical reports, and even “oral histories” from USAID staff working on projects. There are so many features in this database that it can sometimes feel a bit overwhelming, so it’s worth watching their 6 minute introductory video before you get started.

DFID Research for Development (R4D)

Similar to USAID, the UK Department for International Development (DFID) has also set up a database for all their program documents. It has more than 30,000 records and allows you to search by country and by theme. While the database has a lot of useful information, it can sometimes be a bit difficult to find exactly what you’re looking for because they only have two file types – “projects” and “documents”. It would be great if they could break down the documents into different categories, such as evaluations, program reports, technical reports, etc., similar to the USAID site.

The Cochrane Collaboration 

The Cochrane library collects and publishes systematic reviews on all types of health interventions. Although some reviews are on very technical topics, such as the use of indoor residual spraying to prevent malaria, all of them include a plain language summary so it’s easy to understand what the main conclusion is, and how it might apply in practice. Many of the reviews are for programs implemented in high income countries, so you should always check what type of settings they were implemented in before relying on the results.

The Campbell Collaboration

The Campbell Collaboration is like the Cochrane Collaboration but for social issues such as education, crime and justice, and social welfare. They have a growing collection of systematic reviews on interventions such as building new schools, as well as details of future systematic reviews to be published, such as those on land property rights, and microcredit. As with the Cochrane library, the Campbell library includes interventions from low, middle and high income countries.

Abdul Latif Jameel Poverty Action Lab (J-PAL)

J-PAL started as a research center at the Massachusetts Institute of Technology, but is now a global network of researchers who use randomized evaluations to find out what works in international development. Their website includes a collection of evaluations which you can search for by theme, goal, region and country.


AidGrade

AidGrade provides donors with evidence on the relative effectiveness of different types of programs. Their website allows you to select a specific program outcome (e.g. test scores, savings, malaria incidence, etc) and compare the effectiveness of different types of programs for achieving that outcome. For example, according to the AidGrade analysis, conditional cash transfers increase school attendance rates by an average of 3.21 percentage points, while scholarships only increase school attendance rates by 1.26 percentage points.

AidGrade uses meta-analysis of randomised controlled trials and impact evaluations to draw their conclusions. They also provide a tool for building your own meta-analysis that allows you to select the types of studies included. The main limitation of the site is that it only shows quantitative data on the effectiveness of a program without providing a more detailed discussion on how the programs were implemented or the context they were implemented in. This can make it difficult to know whether the program would be suitable for your situation.

Practical Initiatives Network (PIN)

The Practical Initiatives Network was launched in early 2013 to “provide a platform for development organisations to share ideas and learn from each other’s successful (and less successful) initiatives”. Organisations can upload descriptions of their programs, including the reasons for success or failure, to the website. Each initiative is then “pinned” on an interactive map of the world. There are only a small number of programs currently in the system, but it’s sure to grow as more people find out about it.

Admitting Failure

As its name suggests, Admitting Failure is about what doesn’t work, rather than what does. It was created to encourage more open and honest dialogue among organisations, and to reduce fear, embarrassment and intolerance of failure. Organisations can upload their own failure stories to the site and readers are encouraged to give their reactions. It’s definitely worth a look before you design any new programs – after all “the only truly ‘bad’ failure is one that’s repeated”.



About Piroska Bisits Bullen

Piroska has worked on a range of international development programs involving local NGOs, international NGOs, UN agencies and government. She holds a Ph.D. in public health, has published articles in several journals, and was a speaker at TEDx Phnom Penh. Piroska is passionate about using scientific evidence and creativity to design programs that work.