The Great Evaluator: July/August 2013 | STATE LEGISLATURES MAGAZINE
Steve Aos helps lawmakers in Washington be shrewd stewards of taxpayers’ money. His methods may be coming to a statehouse near you.
By Jonathan Kaminsky
On a May morning shortly after the Washington Legislature has concluded its regular session, a trim, soft-spoken man in his early 60s welcomes a visitor to his office a few blocks from the Capitol.
One wall is taken up by dozens of oversized three-ring binders with titles like “Drugs & Labor Market” and “Mental Health Model Parameters.” The other is filled with tomes ranging in topic from criminology to econometrics. Reading material aside, the simple desk, austere swivel chair and second-story view of Olympia’s unglamorous downtown do not suggest the workplace of one of the most celebrated minds in evidence-based policymaking.
But Steve Aos, director of the Washington State Institute for Public Policy—an entity with the responsibility of answering specific policy questions posed by the Legislature—has led a transformation in the way state lawmakers think about funding a broad array of government programs.
In so doing, Aos (sounds like dose) has attracted the attention of the MacArthur Foundation and the Pew Center on the States, which have embarked on an ambitious, seven-figure effort to replicate his method in 14 other states, and counting.
Asked about the high level of interest his work—especially in criminal justice—has generated beyond Olympia, Aos smiles graciously. (He has lectured on his research methods 243 times, to audiences as far afield as the United Kingdom and Australia.) It is “nice and gratifying,” he says.
“It’s more exciting to me that it’s being used to the degree it is in Washington state,” he says. “You have an interest in getting better outcomes and using taxpayer dollars more efficiently, making public policy work better, in your own place.”
Aos’ primary role is to bridge the disparate worlds of academic researchers and political decision makers by distilling and interpreting the findings of the former for use by the latter.
It’s hard to find anyone willing to speak ill of how he does it. Senator Bruce Dammeier (R), vice chair of the Early Learning and K-12 Education Committee, is effusive in his praise of Aos. The institute’s research “has been really important” in guiding lawmakers’ thinking on a range of education issues, he says, including informing his position that teachers’ advanced degrees subsidized by the state must be specific in nature to stand the best chance of boosting student outcomes.
Senator Jim Hargrove, a notoriously prickly six-term Democrat and former chairman of the Senate Human Services and Corrections Committee, agrees. The institute, he says, “has been crucial in not only gathering the results of the programs that we’ve been doing, but also in looking at national research and bringing us models for programs that are showing results elsewhere, so that we’re not just guessing all the time about stuff.”
“I wish they had more capacity,” Dammeier says. “We need people like that to highlight the areas we should be focusing on and to know whether the money we’re investing gets an outcome. That way it’s not just rhetoric, it’s not just debate, but we’ve got a basis to make a sound decision.”
The institute’s work appears to have borne fruit. By its own estimate, criminal justice policies it recommended and the state adopted since 2000 will save taxpayers and crime victims a total of $2.77 billion by 2050. In addition to returning a benefit six times greater than the money spent, these policies have also led to 1,200 fewer people in state prisons, Aos says. Meanwhile, the state’s violent crime rate dropped by more than 20 percent between 2000 and 2011, the latest year for which figures are available.
Whether measuring the cost-effectiveness of education, health care, criminal justice or other taxpayer-funded programs, the WSIPP model is carried out in three steps.
1. WHAT WORKS?
To begin, staffers track down and pore over all available peer-reviewed research on the program to be analyzed. Take, for example, cognitive behavioral therapy, a treatment program for adult criminal offenders that the Legislature asked WSIPP to assess in 2009. After discarding studies with questionable methodologies—“the lousy research,” as Aos calls it—he and his staff use the remaining data to determine what works and what doesn’t. With cognitive behavioral therapy for adults, Aos says, the institute reviewed 39 studies. Taken together, those studies indicated that the treatment is effective: It reduces the rate at which offenders commit a new crime and return to prison by about 8 percent.
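This pooling step resembles a simple meta-analysis: each study that survives the quality screen contributes an effect estimate, and the estimates are combined into a single pooled effect, commonly weighted by study size. A minimal sketch, with illustrative study data rather than WSIPP's actual figures or weighting scheme:

```python
# Combine effect estimates from surviving studies into one pooled
# estimate, weighting each study by its sample size (a common
# meta-analytic choice; WSIPP's actual weighting may differ).
def pooled_effect(studies):
    """studies: list of (effect, sample_size) tuples."""
    total_n = sum(n for _, n in studies)
    return sum(effect * n for effect, n in studies) / total_n

# Hypothetical recidivism effects (negative = fewer reoffenses):
studies = [(-0.10, 200), (-0.06, 500), (-0.09, 300)]
print(round(pooled_effect(studies), 3))  # pooled effect across studies
```

A real synthesis would also standardize effect sizes and weight by each study's variance; the point here is only that many noisy findings are reduced to one defensible number.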
2. WHAT’S ECONOMICAL?
The next step is to put a dollar value on a program’s costs and its benefits, both to taxpayers and to affected individuals, such as crime victims. Because the state had already enacted the cognitive behavioral therapy program on a limited scale, the institute was able to peg the cost per participant at $500. And thanks to the work in Step 1, staffers knew the program was expected to reduce recidivism by 8 percent. To unpack that figure, they dug into questions such as how long it would take before future crimes were avoided, how many of those crimes would be serious rather than minor, and how many would result in the offender being apprehended and sent back to prison. Ultimately, says Aos, “you have to do that math right, so when you hit the ‘run’ button on the model it actually computes, or monetizes, what the benefits are of that 8 percent reduction in recidivism. So you can take the benefits and stack them up against that $500 cost and see whether the program is likely to be a good investment.”
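The math Aos describes can be sketched as a per-participant benefit-cost calculation: monetize the crimes an 8 percent recidivism reduction avoids, then stack that benefit against the $500 cost. The $500 and 8 percent figures come from the article; the per-crime numbers below are hypothetical placeholders, not the institute's estimates:

```python
# Benefit-cost sketch: monetize avoided crimes per program participant
# and compare the result against the per-participant cost.
def benefit_cost(cost, recidivism_drop, crimes_per_reoffender, cost_per_crime):
    """Return (expected benefit per participant, benefit-cost ratio)."""
    benefit = recidivism_drop * crimes_per_reoffender * cost_per_crime
    return benefit, benefit / cost

cost = 500                 # per-participant cost (from the article)
recidivism_drop = 0.08     # 8% reduction in reoffending (from the article)
crimes_per_reoffender = 5  # hypothetical crimes avoided per averted reoffender
cost_per_crime = 12_000    # hypothetical taxpayer + victim cost per crime

benefit, ratio = benefit_cost(cost, recidivism_drop,
                              crimes_per_reoffender, cost_per_crime)
print(benefit, ratio)
```

The institute's actual model spreads benefits over years and discounts them to present value; this one-shot sketch omits both.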
3. WHAT’S THE BEST MIX?
Finally, the institute suggests how much of a set of programs in a policy area the state should invest in. Often, this takes the form of a fiscal impact statement for proposed legislation. To reduce crime, for instance, the institute might suggest specific funding levels for programs including cognitive behavioral therapy for adults, similar programs for juvenile offenders, drug courts, policing levels, and prison beds, among others. These recommendations, like the cost-benefit findings of Step 2, are based on the findings yielded by the institute’s computer model, which Aos and his staff have honed over the past two decades.

“This step, which we call portfolio analysis, is where we say how much the state should put into juvenile programs versus adult programs versus prisons versus early childhood education, if the goal is to reduce crime,” Aos explains. “We’re looking for the portfolio that will give taxpayers both the best crime reduction and the best use of their taxpayer dollars.”
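Portfolio analysis can be thought of as a budget-constrained optimization: given per-unit costs and estimated benefits for each program, find the funding mix that maximizes total benefit. A brute-force sketch with hypothetical numbers (the institute's real model is far richer, with diminishing returns and uncertainty analysis that this linear toy omits):

```python
from itertools import product

# Exhaustively search funding mixes: fund 0..max_units "units" of each
# program, discard mixes over budget, keep the highest total benefit.
def best_portfolio(programs, budget, max_units=3):
    """programs: dict of name -> (unit_cost, unit_benefit)."""
    names = list(programs)
    best_mix, best_benefit = None, -1.0
    for mix in product(range(max_units + 1), repeat=len(names)):
        cost = sum(u * programs[n][0] for u, n in zip(mix, names))
        if cost > budget:
            continue
        benefit = sum(u * programs[n][1] for u, n in zip(mix, names))
        if benefit > best_benefit:
            best_mix, best_benefit = dict(zip(names, mix)), benefit
    return best_mix, best_benefit

# Hypothetical unit costs and crime-reduction benefits (arbitrary units):
programs = {"adult_cbt": (1, 6), "juvenile": (2, 10), "prison_beds": (4, 9)}
print(best_portfolio(programs, budget=6))
```

Even this toy version captures the key insight: the cheapest program per unit of benefit is not always funded exclusively, because the budget constraint and program mix interact.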
The Washington State Institute for Public Policy’s mission is to conduct useful, non-partisan research that helps policymakers, and in particular lawmakers, make informed decisions about long-term, important issues facing the state.
The Legislature created the institute in 1983 to bring the state’s higher-education expertise to bear on the state’s social and economic problems. The institute has since expanded to 16 staff, including 12 researchers who cover issues including criminal justice, welfare, health and utilities.
Policy questions posed and passed by the Legislature are the primary driver of the institute’s work, though with the approval of its 16-member board, it also conducts research for outside entities. The board is structured to be nonpartisan, and is co-chaired by lawmakers from each party. Half of its members are lawmakers evenly divided among the four caucuses. Four are public university administrators, two are gubernatorial appointees and two are the respective heads of the House and Senate nonpartisan staffs.
It is funded each biennium through a legislative appropriation to The Evergreen State College in Olympia.
In 1995, shortly after joining the institute as associate director, Aos immersed himself in his first legislative assignment. Eager to combat a years-long wave of violent crime sweeping the nation, lawmakers had funded a pilot program of intensive supervision of juvenile offenders. They wanted to know how it was working.
After assessing the available research on similar programs around the country, Aos gave lawmakers the bad news: It was ineffective. Then, at the Legislature’s direction, he found other programs, ones that used a cognitive behavioral therapy approach, to be more promising. Following his advice, the state shifted funding accordingly. The results were good, and his reputation among lawmakers was established.
The experience was both gratifying and eye-opening, Aos says. Coming from a background in energy policy, where return-on-investment analyses were standard procedure, he assumed that similar work was done in arriving at best practices in juvenile justice.
“I found out that there wasn’t. There was very little such work done in human services in general. Not just in Washington but anywhere around. You could search forever and not find anything,” he says. “So what I wanted to do was to ask that economic question that was asked in other areas of public policy and apply it to the human services area.”
Aos has developed a uniform approach to answering the increasingly diverse questions his institute receives from the Legislature. After analyzing all the research on a program to determine its effectiveness, the institute produces an estimate of how much it is likely to cost or benefit taxpayers.
Aos notes that those tied to programs the institute has labeled as poor investments at times voice their disapproval. “In which case I say, ‘How we came up with that judgment is all written in the technical appendix. If we left out a study, please let me know. This is how we coded them. If we made a mistake please let me know that too.’ It’s in a model, and it’s calculable,” Aos explains.
Finally, Aos and his staff recommend a portfolio of possible investments. For example, to yield the lowest crime rate at the best price, they’ll point to an ideal mix of, among other funding priorities, prison beds, cops on the street and treatment for ex-cons.
Even the most finely tuned, easily digestible data can be trumped by political calculation, however. When this happens, Aos says, he channels the professional baseball player he dreamed of becoming while growing up cheering for the Los Angeles Dodgers.
“I try to take a hitter’s mentality to what we do here,” he says. “We produce the best work that we can. Sometimes it’s going to be used directly and immediately in a budget allocation. Sometimes it’s not. And you can’t throw your helmet and stomp because you’ll lose your mind if you do that. You keep going forward. You always hope for that hit.”
Jonathan Kaminsky is an Olympia-based reporter who has covered the Washington Legislature for the Associated Press.
Gary VanLandingham, director of the Pew-MacArthur Results First project, is the man in charge of bringing Aos’ vision to capitols across the country. The institute’s model, he says, has the potential to fundamentally change the way state governments operate.
Over the past 30 years, VanLandingham notes, revenue forecasts relied upon by state lawmakers to write budgets have evolved from relatively crude, simplistic estimates based on static economic growth rates to sophisticated and powerful computer models that, while not perfect, give a far clearer view of future tax receipts. Aos’ model, VanLandingham says, can be seen as the prototype for a similar approach to the expenditure side of the ledger.
Two years in, he says, results are already apparent. New York Governor Andrew Cuomo’s office is using the model to look for ways to make the criminal justice system more efficient. In addition, VanLandingham says, New Mexico recently used the Aos approach to help evaluate a prison drug treatment program. After learning it was underperforming, lawmakers eventually decided to shift its $2.2 million in funding to a more promising effort. Other states, like Rhode Island, he adds, are coming on board as well.
“Now is the time for this work to proceed,” he says. “It’s tremendously exciting to be at the beginning of what can be a transformation in how government makes its toughest choices.”
As for Aos and his institute, there is no resting on laurels. Having applied its brand of cost-benefit analysis to juvenile justice, adult corrections, early learning, K-12 education, child welfare and health care, the institute recently received a new assignment. With the passage last year of a voter initiative legalizing the recreational use of marijuana, Aos’ staff has been charged with weighing the costs against the benefits of carrying out the groundbreaking measure. Their first report is due in 2015.
With this ever-expanding purview, are there any topics Aos secretly pines for an opportunity to delve into? “No,” he says. The institute is kept so busy that he has no time for such concerns. Then he pauses.
“We’ve never done anything in transportation. I’d like to do that sometime. It’s a topic of importance to the state.”