
In this report we discuss exceptionally promising findings from a large randomized controlled trial (RCT) of a Canadian program—Learning Accounts—aimed at increasing the educational attainment of low-income high school students. The latest study reports[1] show gains in high school and college graduation rates over a 10-year follow-up period that are among the largest of any North American program evaluated to date in a high-quality RCT. Our full review and summary of the study is linked here on the Social Programs That Work website; the following are brief highlights.


Highlights:

  • Program: A program in New Brunswick, Canada, that provided financial aid for postsecondary education to low-income 10th grade students, conditioned on their meeting certain benchmarks (i.e., completion of 10th, 11th, and 12th grade). The maximum amount of aid—for a student meeting all benchmarks—was $8,400, and the average amount provided was $3,300.
  • Evaluation Methods: A large, well-conducted RCT with a sample of 1,145 low-income 10th graders.
  • Key Findings: Over the 10 years following random assignment, the program produced a 6.5 percentage point increase in the high school graduation rate and a 6.8 percentage point increase in the rate of postsecondary completion.
  • Limitations: A study limitation is that it was conducted in a single Canadian province. A replication RCT in another jurisdiction would be valuable to confirm the sizable effects found in this study and to establish that they generalize to other settings where the program might be implemented.

To provide more detail on the key study findings:

– 89.2 percent of the Learning Accounts group graduated from high school versus 82.7 percent of the control group; and

– 36.1 percent of the Learning Accounts group graduated from university or community college versus 29.3 percent of the control group.

Both effects were statistically significant (p<0.01). The effect on postsecondary graduation was driven entirely by an increase in graduation from community colleges as opposed to universities.
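For readers who want to sanity-check the arithmetic, the short sketch below recomputes the percentage-point differences and rough 95 percent confidence intervals from the group rates reported above. It assumes, purely for illustration, an even split of the 1,145 participants between the program and control groups and no covariate adjustment; the study's own estimates use the actual group sizes and adjusted models, so its reported significance levels are the authoritative figures.

```python
# Back-of-envelope check of the reported effects. Assumes (hypothetically) an even
# split of the 1,145 participants between program and control groups; the study's
# own estimates use exact group sizes and covariate adjustment, so these unadjusted
# figures are approximations only.
from math import sqrt

N_TOTAL = 1145
n_program = n_control = N_TOTAL // 2  # assumed ~even split (not stated above)

def diff_with_ci(p_program, p_control, n1, n2, z=1.96):
    """Percentage-point difference and approximate 95% CI (normal approximation)."""
    diff = p_program - p_control
    se = sqrt(p_program * (1 - p_program) / n1 + p_control * (1 - p_control) / n2)
    return diff, (diff - z * se, diff + z * se)

for label, p_prog, p_ctrl in [
    ("High school graduation", 0.892, 0.827),
    ("Postsecondary completion", 0.361, 0.293),
]:
    d, (lo, hi) = diff_with_ci(p_prog, p_ctrl, n_program, n_control)
    print(f"{label}: {d:+.1%} (approx. 95% CI {lo:+.1%} to {hi:+.1%})")
```

Under these simplifying assumptions, both intervals exclude zero, which is consistent with the differences being statistically distinguishable from zero; the exact p-values reported by the study come from its own models.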

Based on these RCT findings of sizable effects on important life outcomes, Arnold Ventures is seeking—through our Moving the Needle request for proposals (RFP)—to fund an expansion of Learning Accounts to U.S. sites, coupled with a replication RCT to determine whether the above effects can be reproduced in the United States. If you would be interested in partnering on such a replication effort, please review the RFP and reach out to us any time.

In addition to summarizing the Learning Accounts evidence and noting the funding opportunity, we offer a few words on why we take these findings so seriously and would encourage others in the policy and research community to do so.

We know from the history of rigorous evaluations in education and other fields that surprisingly few programs, when studied in a high-quality RCT, are found to produce the hoped-for improvements in people’s lives.[2][3][4] Furthermore, most programs claiming to be “evidence based”—including many of those listed in various web-based repositories of evidence-based programs—are backed by only preliminary or weak evidence that too often does not hold up in subsequent, more definitive evaluations.[5][6][7][8]

Viewed in this context, the RCT findings for Learning Accounts are exceptional—and merit exceptional attention. It is by no means a sure thing that a faithful replication of Learning Accounts in the United States would produce effects similar to those found in Canada, given the differences in the educational systems and culture. But in a world where the odds of success for any new education program are long, this is a good bet. Notably, the Learning Accounts RCT found sizable positive effects not only in the full sample of 30 schools, but also in the subsample of 15 English-speaking schools and the subsample of 15 French-speaking schools, suggesting that the positive effects generalize across different educational settings.[9]

More generally, as the experience of the U.S. Department of Education’s Investing in Innovation (i3) Fund shows, using prior RCT evidence of substantial and important effects as the key criterion for making large grant awards greatly increases the success rate in funding truly effective programs, as measured by subsequent replication RCTs (see our earlier Straight Talk report on the topic). Early findings from the RCTs that our organization is funding show a similar pattern, which we will discuss in future reports.

We believe it is of great policy importance to know whether the effects found in New Brunswick can be reproduced in a second RCT in a different site or sites (preferably, from the standpoint of U.S. policy, within the United States). If the effects do reproduce, it would be a major achievement, establishing Learning Accounts as one of very few educational programs whose evidence base, because it is both rigorous and replicated, provides confidence that faithful implementation of this program on a larger scale would lead to important educational gains for thousands, potentially millions, of low-income youth.


Response provided by the lead study author

We invited the lead study author, Reuben Ford, to provide written comments on our review. He said that he appreciated the opportunity but had no further comments to provide.


References:

[1] Ford, Reuben, Taylor Shek-wai Hui, and Isaac Kwakye, “Future to Discover: Seventh Year Post-secondary Impacts Report.” Social Research and Demonstration Corporation, December 2018. Hui, Taylor Shek-wai and Reuben Ford, “Education and Labour Market Impacts of the Future to Discover Project: Technical Report.” Toronto: Higher Education Quality Council of Ontario, 2018. Ford, Reuben, Marc Frenette, Claudia Nicholson, Isaac Kwakye, Taylor Shek-wai Hui, Judith Hutchison, Sabina Dobrer, Heather Smith Fowler, and Sophie Hébert, “Future to Discover: Post-secondary Impacts Report,” Social Research and Demonstration Corporation, October 2012.

[2] “How to solve U.S. social problems when most rigorous program evaluations find disappointing effects (part one in a series),” Straight Talk on Evidence, 2018.

[3] “When Congressionally-authorized federal programs are evaluated in randomized controlled trials, most fall short. Reform is needed,” Straight Talk on Evidence, 2018.

[4] “Beware the pitfalls of short-term program effects: They often fade,” Straight Talk on Evidence, 2019.

[5] “How ‘official’ evidence reviews can make ineffective programs appear effective (part one in a series),” Straight Talk on Evidence, 2017.

[6] “How What Works Clearinghouse reviews sometimes make ineffective education programs appear effective (part two in a series),” Straight Talk on Evidence, 2017.

[7] “An important—but fixable—flaw in the What Works Clearinghouse that can make ineffective programs appear effective (part three in a series),” Straight Talk on Evidence, 2018.

[8] “Evidence-Based Policy ‘Lite’ Won’t Solve U.S. Social Problems: The Case of HHS’s Teen Pregnancy Prevention Program,” Straight Talk on Evidence, 2019.

[9] The effects were impressive in both the English-speaking and French-speaking schools, but not identical. For high school graduation, the program produced a statistically significant 8.9 percentage point increase in English-speaking schools, but only a 4.2 percentage point increase (not statistically significant) in French-speaking schools. For postsecondary completion, the program produced only a 4.0 percentage point increase in English-speaking schools (not statistically significant), but a statistically significant 13.4 percentage point increase in French-speaking schools.