OMG – Respecting Evidence!

There’s the way things are supposed to work, and then there’s the way stuff actually works.

At my age, you sort of get resigned to the general cussedness of the real world…. People mean well, but if an organization has a theory that didn’t exactly work out, it has a pretty strong incentive to put a positive spin on it.

Since that is a fairly typical reaction to products or programs that didn’t do what their creators had hoped, I was stunned, and excited, to read a Vox article about a nonprofit that just came out and said, “Well, I guess we were wrong.”

Last week, a major international development charity did something remarkable: It admitted that one of its programs didn’t seem to work.

No Lean Season is an innovative program that was created to help poor families in rural Bangladesh during the period between planting and harvesting (typically September to November). During that period, there are no jobs and no income, and families go hungry. By some estimates, at least 300 million of the rural poor may be affected by seasonal poverty.

No Lean Season aimed to solve that by giving small subsidies to workers so they could migrate to urban areas, where there are job opportunities, for the months before the harvest. In small trials, it worked great. A $20 subsidy was enough to convince people to take the leap. They found jobs in the city, sent money home, returned for the harvest season, and made the trip again in subsequent years, even without another subsidy.

So Evidence Action, the nonprofit that funded the pilot programs of No Lean Season, invested big in scaling it up. In 2016, it had run the program in 82 villages; in 2017, it offered it in 699. No Lean Season made GiveWell’s list of top charities.

Evidence Action wanted more data to assess the program’s effectiveness, so it participated in a rigorous randomized controlled trial (RCT) — the gold standard for effectiveness research for interventions like these — of the program’s benefits at scale.

Last week, the results from the study finally came in — and they were disappointing. In a blog post, Evidence Action wrote: “An RCT-at-scale found that the [No Lean Season] program did not have the desired impact on inducing migration, and consequently did not increase income or consumption.”

Why was this admission such a big deal? As the Vox article notes, it is exceptionally rare for a charity to agree to participate in a research project, to discover that its program as implemented doesn’t work, and then to actually publicize those results in a major announcement to donors.

It would have been easy, on multiple levels, for Evidence Action to do otherwise. It could have ignored or contested the results of the RCT; the research would still be published, but it would attract a lot less attention and publicity. Or it could have dismissed the failure as unrepresentative — there were unusual floods in Bangladesh in 2017, it could argue, which might have caused the program failures. Or it could have put a more positive spin on the results. After all, while the RCT was discouraging, it wasn’t devastating — there was, in fact, a small increase in migration.

Evidence Action did the opposite. “Consistent with our organizational values, we are putting ‘evidence first,’ and using the 2017 results to make significant program improvements and pivots,” the group wrote. “We are continuing to rigorously test to see if program improvements have generated the desired impacts, with results emerging in 2019. We have agreed with GiveWell that No Lean Season should not be a top charity in 2018. Until we assess these results, we will not be seeking additional funding for No Lean Season.”

Honesty. Respect for evidence. Respect for one’s donors.

This, of course, is the way things are supposed to work. This is why intellectually honest research is so important: to gather and consider evidence, and to use that evidence to shape further efforts; to learn from reality, and to apply what has been learned to inform what we do going forward.

Empirical research. Honest evaluation of the results. Learning from our mistakes.

What a concept…..
