State Options for Using the American Rescue Plan to Evaluate Workforce Development Programs

Kristine Goodwin 3/8/2022


Overview

As policymakers seek to build a workforce to help lead their economic recovery, research findings may point the way to interventions found to increase employment and earnings in high-demand sectors.

The good news: several programs have been rigorously evaluated and shown to be effective. Consider Per Scholas, an employment and training program for low-income workers in the information technology sector: participants offered access to the program earned 20% to 30% more (approximately $4,000 to $6,000) than a control group two to six years after random assignment.

While programs with strong evidence of effectiveness exist, evaluations show that most publicly funded programs are not achieving such results. As described in the first brief in this series, states, territories and other jurisdictions can use American Rescue Plan Act (ARPA) funding to scale up workforce programs and policies with demonstrated impact on important outcomes such as earnings and employment, and to evaluate pilot projects and established programs to determine whether they are achieving their intended results.

This brief examines the law’s provisions relating to evaluations and highlights how states are incorporating evaluations into their workforce development policies and investments. By evaluating workforce programs, state policymakers can leverage these time-limited funds for long-term results.

ARPA and Evaluations: Opportunities and Requirements for States and Other Jurisdictions

Federal ARPA guidance encourages states, territories and other jurisdictions to invest funds in evidence-based programs, and allows states to use these time-limited funds for discrete costs such as program evaluations, data analysis and collection, and improvements to data and technology infrastructure.

What are evaluations and why do they matter? According to the U.S. Office of Management and Budget, evaluations use “systematic data collection and analysis of one or more programs, policies and organizations intended to assess their effectiveness and efficiency.” Rigorous evaluations can help policymakers make funding decisions based on evidence, such as scaling up effective programs, improving those with promising results, or redesigning others that aren’t working as expected. (See the first brief in this series [[insert link]] for a definition of rigorous evaluations.)

Federal guidance permits states and other jurisdictions to use funds to pay for impact evaluations (including randomized controlled trials and quasi-experimental designs). In addition, the U.S. Department of the Treasury’s January 2022 final rule, which goes into effect April 1, 2022, permits states to use funds to support building and using evidence to improve outcomes through rapid-cycle evaluations, process or implementation evaluations, outcome evaluations, and cost-benefit analyses. The rule encourages recipients to undertake rigorous program evaluations when feasible.

States and other jurisdictions are required to submit to Treasury and publicly post a recovery plan performance report. State recovery plans must describe efforts to promote equitable outcomes and detail whether State and Local Fiscal Recovery Funds are being used for evidence-based interventions and/or if projects are being evaluated through rigorous program evaluations.

4 State Policy Options for Evaluating Workforce Development Programs

While states have until the end of 2024 to obligate recovery funds, many are incorporating evaluations into their recovery plans. As described below, states are taking a variety of steps to fund evidence-based workforce programs and evaluate new and established programs that lack evidence of effectiveness. 

1. Determine evidence of effectiveness for existing and proposed workforce programs.

Federal guidance encourages state leaders to use clearinghouses, such as the U.S. Department of Labor’s Clearinghouse for Labor Evaluation and Research, to assess the effectiveness of their interventions. Other clearinghouses for employment and job training include Social Programs That Work and The Pew Charitable Trusts’ Results First Clearinghouse Database.

With information on program effectiveness gathered and assessed in one place, policymakers and agency staff can systematically compare programs based on the strength and quality of the evidence behind them. If the goal is to increase employee earnings and taxpayer savings, for example, the Social Programs That Work database gives high ratings to Nevada’s Reemployment and Eligibility Assessment (REA) Program. As described in the first brief in this series, Nevada’s approach provides new unemployment insurance claimants with an eligibility assessment and personalized reemployment services. Two randomized controlled studies showed that those offered Nevada REA earned 15% to 18% more in wages three years later than those offered usual services, and received 9% less in unemployment benefits, resulting in a net savings to the government.

Several states rely on research clearinghouses and similar evaluation tools to determine the evidence supporting a proposed or established program. For example, the Program Evaluation Unit of New Mexico’s Legislative Finance Committee (LFC) examined the state’s post-pandemic workforce development needs in a 2020 policy report. “In a post-pandemic environment, workforce development will be more important, but the impact of workforce development programs varies with the population served, with some programs having larger returns on investment,” the report found. Pointing to the Results First Clearinghouse and evaluation findings, the LFC noted that Nevada’s REA approach has proved effective at reducing the length of time individuals receive unemployment insurance, while generating savings to the state. Based on the LFC’s report, the New Mexico Legislature dedicated $5 million toward evidence-based reemployment case management, one of the programs the report identified as effective.

The Washington State Institute for Public Policy (WSIPP), created by the legislature in 1983, conducts nonpartisan research at the direction of the legislature or its Board of Directors. WSIPP works with legislators, legislative and state agency staff, and experts in the field to answer relevant policy questions and to research a program’s effect on achieving policy goals. For example, in 2019 researchers analyzed the benefits and costs of employment counseling programs for individuals in the adult criminal justice system seeking employment.

2. Embed evidence and performance monitoring in the budgeting and contracting process.

Several states have taken steps, some predating ARPA, to integrate evidence of program effectiveness and performance information into the budget process. This gives agencies an opportunity to share critical information with policymakers about a program’s effectiveness and to improve procedures for prioritizing funds. For example:

  • In 2021, Colorado lawmakers passed SB 284, which requires agencies and the Office of State Planning and Budgeting to use consistent evidence definitions in budget requests. The legislation appropriated funds to add legislative staff to review agency budget proposals and established procedures to incorporate evidence-based research into the state budget process. Colorado’s evidence continuum (see Figure 1) provides a framework for describing the evidence supporting a program currently, and how a program can move along the continuum with evaluation and implementation support.
  • The District of Columbia has aligned its budgeting process to standards outlined in the federal Foundations for Evidence-Based Policymaking Act of 2018 and embedded experimental evaluations in multiple government programs. Each year’s budget cycle begins with a review of all proposals for new or expanded programs and services. Agencies must provide the evidence base supporting their budget requests.

In 2021, New Mexico’s Legislative Finance Committee launched a “LegisStat” initiative to engage legislative leadership and state agencies, including the state’s Workforce Solutions Department, in ongoing, data-driven performance reviews. The LegisStat process includes regularly scheduled time to focus on a key set of priority performance issues and seeks to collaborate with agencies in ways that drive performance improvements for New Mexicans.

3. Fund and evaluate pilot projects to identify which interventions are effective, while scaling up interventions that are found to work.

As described above, states can use recovery funds to scale up evidence-based programs and to evaluate programs that address the pandemic’s negative economic consequences. Policymakers can also use funds to pilot and evaluate new and innovative programs that are designed to address state needs but have not yet been rigorously evaluated, so their impact on important outcomes is not yet known. Several states’ ARPA recovery plans outline how they will use rigorous research to evaluate programs and build evidence for workforce development and other programs. For example:

  • Utah will spend one-third of appropriated funds on projects related to stemming the pandemic’s negative economic impacts, including connecting displaced workers with training and jobs and upskilling and educational investment opportunities that support women and people of color. According to Utah’s 2021 recovery plan, executive and legislative branch stakeholders are streamlining state performance management systems to “link budgets with performance measures and drive the best investment and use of Utah’s resources.” The state’s chief economist and a committee of evaluation experts will regularly analyze the data and incorporate the findings into an ongoing efficiency improvement process.

Some state leaders prioritize funding for programs based on tiered grant distributions, evidence continuums and other frameworks driven by the availability of research-based evidence. Some states have allocated funds to hire and train personnel to generate evidence or coordinate evaluation activities, while others have established grant programs to build evidence. For example:

  • Since 2017, Colorado’s Office of State Planning and Budgeting has awarded about $500,000 annually in grants to state agencies. Staff work with agencies to help programs build their evidence and progress along an evidence continuum.
  • North Carolina’s Appropriations Act of 2021 allocated $500,000 to the Office of State Budget and Management to provide competitive, evidence-based grants to state agencies. Agencies may use the grants to partner with research institutions to conduct research projects and evaluate whether programs are achieving their intended results. State agencies are required to submit reports on the use of funds to the Joint Legislative Oversight Committee on General Government and the Fiscal Research Division.

4. Align workforce evaluations with broader recovery and performance goals and research agendas.

To address the pandemic’s negative economic impacts, many states have allocated or earmarked funds for workforce development initiatives. See NCSL’s ARPA tracking database and states’ 2021 recovery plans for more information on state allocations.

Additionally, states including Colorado, North Carolina, and Rhode Island have created pandemic recovery offices to align spending decisions with state priorities and coordinate efforts. North Carolina’s office will require recipients to report on the use of evidence-based interventions and/or if projects are being evaluated through rigorous program evaluations designed to build evidence.

Finally, Treasury guidance encourages states and other jurisdictions to consider how a learning agenda could support their evaluation efforts and drive their evidence-building strategy. Learning agendas can help agencies identify and address prioritized research questions that drive their evidence-building practices.

For example, Connecticut’s evidence-building strategy prioritizes investments in evidence-building, supports the allocation of resources for evaluation and data analysis, and calls for the development of a learning agenda for each of the recovery plan’s priority areas. The Office of Policy and Management’s learning agenda uses data to inform decision-making and programming for the state’s workforce education and training system. The research agenda seeks to answer questions about the impact of short-term training programs on earnings and the return on investment for public workforce training programs.

Conclusion

States are adopting an array of steps to embed program evaluations into their recovery efforts. Evaluations can help identify which interventions are most likely to help states achieve their workforce development goals. As described in this brief, states can use time-limited funds for program evaluations and other evidence-building activities that can help inform program and spending decisions beyond the American Rescue Plan.