What can TAACCCT teach us? Leveraging past lessons for the future of work
The Department of Labor Office of the Inspector General audit of TAACCCT skimmed the surface of what the program accomplished. But it'll take much more than that to inform future policy.
Blog Post
Oct. 9, 2018
What can we learn from an ambitious $2 billion federal investment in preparing workers for a changing economy? That’s what researchers are aiming to understand now that the Trade Adjustment Assistance Community College and Career Training (TAACCCT) program has wrapped up. TAACCCT, signed into law at the height of the Great Recession in 2010, provided capacity-building grants to community colleges to support adults aiming to enter a new field or keep up with changing skill needs in their current occupation. Increasing automation and shifts in trade are shaping the landscape of the economy and changing the skills workers need to succeed. When any abrupt change occurs in the labor market – due to a recession or major technological shift – it’s critical for policymakers to understand best practices for implementing a large-scale investment that will support workers’ adaptation and success. And that’s why it’s so important to take advantage of what we can learn from TAACCCT now: rigorous analysis of these program evaluations can unearth lessons to prepare the country to quickly absorb future shocks in the labor market.
Last week, third-party evaluators began releasing final evaluations for the fourth and final round of TAACCCT grants. Rigorous analysis of program evaluations and data can provide policymakers and other stakeholders with information about how such a massive public investment impacted participating institutions and individuals.
In July, the US Department of Labor Office of the Inspector General (OIG) released an audit of TAACCCT examining how well this historic investment in community colleges succeeded in meeting its goals. The OIG looked at rounds 1-3 of TAACCCT grants to find the total number of new programs developed using grant funding and overall program completion and employment rates. They also used a random sample of ten grants to explore if and how well these grantees met their proposed capacity and student outcomes targets. While the report raises important questions about the impact of TAACCCT that deserve attention, the way it went about answering those questions was fundamentally flawed.
For one, the diversity among TAACCCT-funded programs means a sample of ten cannot offer insight into the full breadth and depth of grant implementation in terms of occupations and sectors of focus, types of new programs, or strategies to enhance or improve existing programs. Our teams, at New America and Bragg and Associates, are well on their way to reading the full set of TAACCCT final evaluations and can reliably report that they represent a tremendous diversity of programs, industry sectors, and labor markets across the country. That diversity brings challenges, but it's also part of the beauty of such an ambitious program. A random sample of ten grantees (out of 185 in the first three rounds of TAACCCT grants alone) isn't nearly enough to give a valid and reliable picture of how TAACCCT-funded projects fared.
The OIG audit also addressed whether grantees had successfully met the target number of new programs over the universe of all grants in rounds 1-3, concluding that grantees substantially met that goal (1,992 new programs against a goal of 2,074). However, those findings don't tell us anything about the quality of programs developed, whether they were well-targeted to individuals who would benefit most, or whether they were laying the foundation to sustain programs after the end of the grant period. Additionally, much of the capacity building in TAACCCT grants consisted of redesigning or enhancing existing programs, rather than creating new ones. Counting the sheer number of new programs developed using TAACCCT grants is only a very blunt measure of the success of the program and offers few insights into whether TAACCCT grantees were implementing proposed strategies successfully or changing policy. After all, TAACCCT aimed to improve outcomes for students in new, redesigned, and improved high-quality programs, not simply to increase the number of programs available.
Finally, using a rough calculation, OIG found that not quite 40 percent of individuals who entered programs developed through or supported by TAACCCT completed their programs. Employment rates for those who did complete the programs were modest as well, with 44 percent of participants who were unemployed prior to TAACCCT-supported training getting a job after graduating. However, the OIG audit attempts to evaluate how well the TAACCCT program succeeded in meeting completion and employment targets using an aggregate figure across all TAACCCT-supported programs in rounds 1-3 of funding, regardless of the program interventions, industry, or region of the country. This approach masks a multitude of factors impacting each TAACCCT grant and the TAACCCT program overall, including vast improvements in the economy over the years the grants were operating, from 2011 to 2018.
Asking how well TAACCCT grantees met their intended goals around capacity and participants' labor market success is important. But what is missing from the OIG's report is a meaningful analytic approach that can actually answer that question and others. We need answers to complex questions like: Which types of TAACCCT-funded programs and implementation strategies have the most impact on student completion and labor market success? What policy lessons can stakeholders glean from TAACCCT implementation and outcomes to guide future investments in retraining programs? What can we learn from the student recruitment and support strategies used in TAACCCT, especially for adult learners and underrepresented students, whose numbers are growing in postsecondary education? Delving deeply into the vast number of external evaluation reports using a rigorous research method, such as meta-analysis, would provide a much more useful picture for institutions seeking to help students re-skill and for states seeking to scale policy solutions.
Now that the final round of TAACCCT evaluations is becoming available, we need further analysis of what we can learn about TAACCCT's overall impact, as well as a more granular understanding of specific strategies to improve outcomes for participants. Stay tuned for upcoming work from Bragg and Associates and New America analyzing TAACCCT grant implementation and outcomes across all four rounds of grants, including reporting on the successes, struggles, and lessons that can inform future policy. An in-depth analysis of TAACCCT will offer more than a snapshot of how a past program fared. It will also provide a base of evidence to guide future investments that prepare American workers for the challenges that an economic shift can bring down the road.