Case Study

Improving Academic Success and Retention: Year Up’s Professional Training Corps

By Jessica Britt, David Fein, Rebecca Maynard, and Garrett Warfield | July 2021

This case study describes a small randomized controlled trial (RCT) comparing alternative strategies for monitoring and supporting academic achievement in Year Up’s Professional Training Corps (PTC) program. Year Up is a nonprofit organization dedicated to preparing economically disadvantaged young adults for well-paying jobs with advancement potential. 

This study is one of several conducted by Year Up and its research partners at Abt Associates and the University of Pennsylvania to address questions related to the development and effects of the PTC program (Fein et al. 2020). It relied primarily on extant data to test strategies for improving participants' success during the PTC's initial six-month Learning and Development (L&D) phase, which required full-time college coursework. Success in L&D was necessary to progress to the program's next six-month phase: full-time internships.

The study team worked closely with practitioners to generate actionable evidence for guiding program improvement. The tested innovations were a local response to academic difficulties that staff perceived to be a major cause of participants dropping out during the L&D phase. The researcher-practitioner team (including local PTC staff) used feedback loops to refine the improvement strategies during and after testing.

As part of the study, Year Up and its research partners designed and tested strategies for more quickly identifying and supporting participants who were struggling with their college coursework. To ensure the credibility of the findings, the team randomly assigned participants either to a coach who would use the new monitoring and support strategies or to a coach who would follow existing practices, which placed little emphasis on academics.

Over two cycles of testing, the study found strong evidence that the tested strategies substantially improved course success and, thus, advancement to internships. Three factors made the resulting evidence actionable: (1) the study focused on low- to no-cost, field-initiated practice changes that could be implemented widely; (2) site staff could tailor the strategies to local needs and opportunities; and (3) the research team encouraged local staff to modify the strategies between enrollment cohorts.

This case is one in a series commissioned by the Actionable Evidence Initiative in 2020 and 2021. The series illustrates how researchers, evaluators, practitioners, funders, and policymakers across the country are exemplifying principles of the Actionable Evidence framework. It profiles a range of settings, actors, learning questions, methods, and products, unified by a commitment to practitioner-centered, timely, practical, equitable, and inclusive evidence building. Each case describes the origins, development, and results of a research or evaluation project, along with the authors’ reflections on their experiences. Our hope is that these cases will provide both inspiration and practical guidance for those interested in generating and using evidence that leads to better and more equitable outcomes for youth and communities.