In this report we discuss unusually promising findings from a large, multisite randomized controlled trial (RCT) of the Year Up program for low-income young adults. This newly published study found earnings effects that are among the largest of any workforce training program evaluated to date in a high-quality RCT. Our full review and summary of the study is linked here on the Social Programs That Work website; the following are brief highlights.


Highlights:

  • Program: A full-time, year-long workforce training program for economically disadvantaged young adults that focuses on economic sectors with jobs in high demand—namely, information technology and financial services.
  • Evaluation Method: A well-conducted RCT with a sample of 2,544 low-income adults ages 18 to 24, who were neither in school full-time nor employed full-time, carried out at eight urban sites across the United States.
  • Key Findings: The program produced a 40 percent ($7,011) increase in participants’ annual earnings in the third year after random assignment, compared to the control group.
  • Limitations/Other Considerations:
    (i) Longer-term follow-up is needed to determine whether the large effects found in year three endure long enough to justify the substantial program cost ($28,200 per participant).
    (ii) Year Up carefully screens applicants and enrolls those identified as being motivated to succeed and interested in career advancement; thus, the effects may not apply to young adults who fall outside such criteria.

A thoughtful comment from the lead study author, and our brief rejoinder, follow the main report.


In addition to summarizing the study’s findings, we offer four brief observations on their implications for policy and research.

First, the study provides evidence not only that Year Up produces large effects on earnings, but that the effects generalize across diverse settings and subpopulations. Specifically, the earnings effects were sizable and statistically significant in each of the eight major U.S. cities where the study took place, and in all study subgroups examined (e.g., men, women, African American, Hispanic, and white participants, and those with low or high grades in school). The findings thus provide a convincing answer to naysayers who assert that positive RCT findings have little policy value because the world is complex and ever-changing, and what works in one setting tells us little about what will work in a different setting and sample. The remarkable robustness of the Year Up effects across study sites and subpopulations serves as a clear counterexample.

Second, Year Up is an example of a particularly promising type of workforce training called “sectoral” training. Sectoral training programs (i) focus on specific economic sectors where jobs are in high demand (in the case of Year Up, information technology and financial services); and (ii) work in close partnership with local employers in developing the training curriculum and providing work opportunities (e.g., internships) to program participants. In an earlier Straight Talk report, we discussed another sectoral program—Per Scholas—that has been found in two well-conducted RCTs to produce large earnings gains, and there are other positive examples, albeit with more preliminary evidence. Not all sectoral training programs have been found to produce significant effects when rigorously evaluated, but the promising overall pattern suggests that further investment in this approach—for example, to develop and test new sectoral programs to serve other types of workers—is warranted.

Third, Year Up’s cost—$28,200 per participant—is substantial. From a policy standpoint, this cost is somewhat mitigated by the fact that most of it ($16,700, or 59 percent) is borne by employers, who provide internships to Year Up participants as part of the program and pay Year Up for each intern. Thus, initiatives to expand Year Up delivery with taxpayer or philanthropic funds could reasonably expect substantial co-funding by local employers. Still, the program cost is fairly high even taking such co-funding into account, and it may be worth developing and rigorously testing less costly versions of the program to see whether they can achieve similar effects.
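As a rough illustration of this cost split, using only the figures reported above, the non-employer share that taxpayer or philanthropic funders would need to cover works out to:

\[
\$28{,}200 \;-\; \$16{,}700 \;=\; \$11{,}500 \text{ per participant}
\]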

Finally, we offer what we believe is an important caveat to the positive findings: The effects have only been measured over the three years following random assignment, so it is not yet known whether the earnings gains will endure long enough to justify the program’s substantial cost. The national RCT of Job Corps, a workforce training program for disadvantaged youth and young adults, found a modest (5 percent to 10 percent) earnings increase in the third and fourth years after random assignment that faded out quickly thereafter. Year Up’s early effects are much larger, but longer-term follow-up is needed to determine whether a similar pattern of fading effects occurs. Fortunately, a six-year study follow-up is planned, so stay tuned for the answer.


Response provided by David Fein, lead author of the Year Up RCT study report

Straight Talk reviewers have done a solid job in assessing our recent report on Year Up. We especially appreciate their highlighting that large positive impacts appeared in every one of the eight local implementations tested—providing strong evidence for this program’s replicability. We share reviewers’ excitement at signs that positive evidence on well-implemented sectoral training programs is becoming quite strong and agree that expanded testing in this arena is warranted.

We too are keen to see how impacts hold up over the longer term. As reviewers observe, Year Up is costly compared to most workforce programs and thus may require a longer period of sustained impacts to reach the cost-benefit breakeven threshold. In addition to planned analyses over a six-year follow-up period, a formal cost-benefit study will address this question directly.

When we read a first draft of Straight Talk’s review, we did not grasp that the rubric used actually requires a positive cost-benefit ratio for interventions to advance from “suggestive” to higher tiers of evidence. We are grateful for a phone conversation clarifying that this point factored heavily into the Year Up rating, and it was good to hear that the initiative will sharpen this aspect of its criteria going forward.

Whether or not Year Up proves cost-beneficial, reviewers are right to note that it still may be too expensive to scale, and to wonder whether lower-cost versions of the model can achieve similar impacts.

In fact, Year Up is in the midst of an ambitious array of initiatives to develop and test more scalable models. With initial funding from the Institute of Education Sciences, a team of researchers from Abt and the University of Pennsylvania is setting up an initial test of one such initiative: an adaptation called the Professional Training Corps, currently operating on more than 15 college campuses.

—David Fein, Principal Associate, Abt Associates

Note: The views expressed here are solely those of the report’s first author and are not necessarily shared by Abt Associates, the Administration for Children and Families (the evaluation’s main sponsor), or Year Up.


Rejoinder by the LJAF Evidence-Based Policy team

We appreciate the lead study author’s thoughtful response. Among other things, it is good to hear about Year Up’s development and testing of lower-cost adaptations of their program, and we look forward to the results of those efforts.

Based on our communications with the lead author, we wish to clarify why Year Up is categorized as not yet meeting the highest evidence tiers (“Top Tier” or “Near Top Tier”) in our evidence summary on the Social Programs That Work website. The reason relates to the caveat discussed in our Straight Talk report: Year Up’s effects have only been measured over the three years following random assignment, so it is not yet known whether the earnings gains will endure long enough to justify the program’s substantial cost. The earnings gains to date are remarkably large—$5,181 per person in the second year after random assignment (i.e., the year following program completion) and $7,011 in the third year—but the total earnings gains still fall well short of the program cost ($28,200 per participant).

If the earnings effects were to fade out shortly after the third year, following a pattern such as that found in the Job Corps RCT, Year Up’s costs would likely exceed its benefits and the study findings would therefore be disappointing from a policy standpoint. On the other hand, if Year Up’s earnings effects were to endure over time, following a pattern such as that found in a multisite RCT of Career Academies, Year Up would very likely meet the highest evidence standard (Top Tier), given the quality of the Year Up RCT, the magnitude and endurance of earnings gains in comparison to the program cost, and the replication of positive effects across multiple sites. We hope that the planned six-year study follow-up finds this to be the case.
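To make the comparison concrete: counting only the year-two and year-three effects reported above (the first-year effect, covering the program period itself, is not listed here), the cumulative measured earnings gain still falls well short of the program cost:

\[
\$5{,}181 \;+\; \$7{,}011 \;=\; \$12{,}192 \;<\; \$28{,}200
\]

Whether that gap closes depends on how the effects evolve after year three.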

We note that Social Programs That Work gave another workforce development program—Per Scholas—a higher evidence rating than Year Up (i.e., Near Top Tier) even though Per Scholas’ earnings effects were not quite as large as Year Up’s. The reason for Per Scholas’ higher rating is that it is a less expensive program, costing approximately $5,800 per participant, and the total earnings effects (approximately $4,200 and $4,800 in the second and third years after random assignment, respectively) exceed this program cost.
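By the same back-of-the-envelope arithmetic (using the approximate figures reported above), Per Scholas’ measured earnings gains already exceed its cost:

\[
\$4{,}200 \;+\; \$4{,}800 \;\approx\; \$9{,}000 \;>\; \$5{,}800
\]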

We thank the lead author for encouraging us to clarify the reasoning behind Year Up’s evidence rating. We have added this clarification to the Year Up evidence summary on the Social Programs That Work website, and plan to update the site in the near future to more fully articulate the rating criteria.