The Working Paper is the official newsletter of the National Legislative Program Evaluation Society. NLPES serves the professionals of state legislative agencies engaged in government program evaluation. The purposes of NLPES are to promote the art and science of legislative program evaluation; to enhance professionalism and training in legislative program evaluation; and to promote the exchange of ideas and information about legislative program evaluation.
Summer is my favorite time of year. I thoroughly enjoy the warmer weather, all of the lush vegetation and the longer days. The summer months can be incredibly busy, but I still find time to refresh and recharge. And with all of the activities going on, we all need that time!
The NLPES Executive Committee held a successful meeting on April 22, 2017, in Madison, Wis. During the meeting, we previewed the hotel meeting space, guest rooms, and the Wisconsin State Capitol building and grounds that will be the setting for this year’s annual professional development seminar (PDS) to take place Sept. 17-20, 2017. The Wisconsin Legislative Audit Bureau is putting together a great three-day conference program. Registration and details about the PDS are available on the NLPES website. The PDS is a tremendous amount of work to organize, and I appreciate the energy and effort of our Wisconsin hosts and all of the NLPES member offices who will participate in this premier training event.
In addition to the fall PDS, NLPES is co-sponsoring a number of sessions of interest to legislative staff at the annual NCSL Legislative Summit, to take place Aug. 6-9, 2017, in Boston.
I would like to thank the Professional Development Subcommittee for coordinating two successful webinars this past year; they are already working on topics for more. Thanks as well to the Communications Subcommittee for its efforts to keep the NLPES newsletter content relevant and responsive to the membership. Finally, I thank the Awards Subcommittee and all of our volunteer judges for their efforts with this year’s awards cycle. Congratulations to all of the award winners! The official awards presentations will take place at the NLPES awards luncheon at the PDS.
The PDS program will once again include the popular Poster Session to feature the reports receiving this year’s Certificate of Impact Awards.
In other business, the Executive Committee continues to review the NLPES Bylaws for needed updates. We hope to have proposed changes announced to the membership in advance of the Executive Committee’s fall meeting.
If you missed my recent postings to the NLPES email list, I would like to congratulate Linda Triplett (Mississippi), Karen Leblanc (Louisiana) and Kevin Ryan (South Carolina) on their election to the Executive Committee. We had four seats up for election and three nominees, which means we had a vacant seat on the Executive Committee. Subsequent to our April meeting, in accordance with the NLPES Bylaws, the Executive Committee appointed Kristen Rottinghaus (Kansas) to fill the vacancy. All four members’ terms will begin this fall when the Executive Committee meets during the PDS.
Serving on the Executive Committee has been a very rewarding part of my career. I’ve been privileged to work with so many talented and capable individuals. My term as NLPES chair will end this fall at the PDS in Madison. I will, however, continue to serve on the Executive Committee under the leadership of our incoming chair, Linda Triplett. Although we still have one more meeting midsummer before their terms end, I would like to formally recognize three outgoing members of the Executive Committee: Wayne Kidd (Utah), Marcia Lindsay (South Carolina) and Katrin Osterhaus (Kansas) for their years of service to NLPES and our membership.
Once again, it’s been a privilege to serve as NLPES chair this past year. I hope to see you in Boston at the NCSL Legislative Summit in August or in Madison at the NLPES PDS in September!
Greg Fugate is the 2016–2017 NLPES Executive Committee chair. He can be reached at email@example.com.
The Wisconsin Legislative Audit Bureau is excited to welcome our colleagues from around the country to the 2017 professional development seminar (PDS)! The PDS will be held Sept. 17-20 in Madison, Wis. Planning for the seminar is ongoing, but the theme is A Capital Idea. The theme honors the “capitol” idea that resulted in the completion of the beautiful Wisconsin State Capitol building 100 years ago; the Capitol will serve as the perfect backdrop for seminar participants to learn, build, and launch a capital idea for personal professional development and ongoing improvements to their audit organizations.
The seminar agenda includes a great mix of plenary speakers and concurrent sessions with ample opportunity for panel presentations and discussion about contemporary issues in legislative program evaluation. Additional details will be forthcoming as specific sessions and presenters are confirmed and the opportunities for your involvement are solidified. On-site registration opens on Sept. 17. The PDS will begin at 8:30 am on Sept. 18 and end at 3:30 pm on Sept. 20. We anticipate participants will enjoy an interesting variety of topics and the opportunity to earn continuing professional education (CPE) credits during this full three-day seminar.
The seminar will be held at the Madison Concourse Hotel. Visit the seminar website for hotel rates and registration information. Additional information is available on the host state’s website. If you have logistical questions, please contact Brenda Erickson, NCSL liaison to NLPES, at 303-856-1391.
Optional Event (Sunday, Sept. 17)
The city of Madison is nestled amidst a chain of five beautiful lakes. The Capitol and downtown Madison sit on an isthmus between the two largest lakes: Lake Mendota and Lake Monona. On Sunday evening, Sept. 17, seminar participants and guests will have the option of registering in advance for an evening dinner cruise on Lake Monona. We will walk from the hotel to the boat dock; our cruise will depart at 6:15 pm and return at 8:15 pm. Enjoy first-class views of Madison's skyline, including the Capitol and Monona Terrace, the convention center designed by Frank Lloyd Wright. In addition to a cash bar, the cruise menu includes assorted chips and dips, jumbo all-beef hot dogs, “Build Your Own” burgers, chicken breast sandwiches, German potato salad, pasta salad, fruit salad, assorted desserts, and free unlimited soda. The event is available for the first 72 attendees who register for the PDS and submit the individual fare of $38.50 per person.
Getting to and Around in Madison
Travel to Madison is particularly convenient: direct flights from 14 cities nationwide arrive daily at the Dane County Regional Airport. The seminar hotel offers free shuttle service to and from this airport. Alternatively, Madison is a short 90 minutes from Milwaukee’s General Mitchell Airport or two hours from Chicago’s O’Hare Airport. Both Milwaukee and Chicago are connected to Madison by bus service at $60 roundtrip (www.coachusa.com/vangalder or www.badgerbus.com).
There is no need to rent a car during your visit to Madison. The seminar hotel is conveniently located one block off the bustling Capitol Square and is adjacent to the many great restaurants and shops on Madison’s famous State Street.
Madison has been reported to have more restaurants per capita than any other U.S. city. In addition to classic American fare, Madison’s diverse dining scene has many ethnic restaurants and many food carts, which serve delicious cuisine from all around the world. Participants will have ample opportunities to sample food options from Madison on both Monday and Tuesday evenings during the PDS.
Anticipated Weather Conditions
Late summer/early fall is a beautiful time to visit Madison. The average high temperature in September is 71°F, and the average low is 50°F.
Extending Your Time in Madison
Perhaps you will consider extending your visit by arriving early for the PDS or staying on afterward. Here are some options for pre- and post-seminar events and venues in Madison.
Dane County Farmers Market. If you arrive before 1 pm on Saturday, Sept. 16, be sure to take in the eight-square-block outdoor market on the Capitol grounds. There you will find the season’s best bounty of vegetables, flowers, meats, cheeses, and specialty products from approximately 150 vendors. This market is reported to be the largest producer-only farmers’ market in the country. All items are produced by the members behind the tables, and no resale is allowed.
Madison Symphony Orchestra. On Sept. 15, 16 and 17, the Madison Symphony Orchestra will play works of Bach, Mendelssohn, and Berlioz in concert at the Overture Center for the Arts. The Overture Center, which is located just one block from the seminar hotel, opened in September 2004 as the result of a single $205 million gift from one donor. This is the largest single gift to the arts in American history to date.
Wisconsin Veterans Museum. Located on the Capitol Square and adjacent to the seminar hotel is the Wisconsin Veterans Museum. The Wisconsin Veterans Museum is an educational activity of the Wisconsin Department of Veterans Affairs to commemorate, acknowledge, and affirm the role of Wisconsin citizens in American military history, past and present. The museum is open Tuesdays through Sundays.
Wisconsin Historical Museum. The museum, which is located on the Capitol Square adjacent to State Street, is a part of the Wisconsin Historical Society. Museum staff offer free tours daily at 1:00; meet at the front desk. These short tours go beyond what is on display, offering facts and stories that you won’t find in the exhibits! Themes change monthly.
Explore State Street. State Street is a pedestrian zone located in downtown near the Capitol. The street extends from the Capitol westward to Lake Street, adjoining the campus of the University of Wisconsin–Madison at Library Mall. The many shops and restaurants along this corridor offer excellent opportunities for people watching. During the seminar, participants will explore State Street for dinner opportunities and, of course, souvenirs.
Henry Vilas Zoo. Located approximately 2 miles from the seminar hotel, the Henry Vilas Zoo is one of only a handful of admission-free, community-supported zoos in the country. Fully accredited by the Association of Zoos and Aquariums, the zoo features exhibits and attractions for the whole family. The Zoo is open daily 9:30 a.m.-5 p.m.
University of Wisconsin Arboretum. Occupying nearly 1,200 acres in central Madison, the Arboretum includes the oldest and most varied collection of restored ecological communities in the world. Among the tallgrass prairies, savannas, wetlands and several forest types, the Arboretum also houses flowering trees, shrubs and a world-famous lilac collection. It offers educational tours for groups and the general public, as well as science- and nature-based classes for all ages and abilities.
Take a Bike Ride. What better way to see the city than from the vantage point of a bicycle? Stop at any one of Madison's 39 locations to rent an urban B-cycle bike for an hour or a day. Trails in the Madison area are abundant and extend out into the Dane County countryside.
In addition to exploring Madison and all it has to offer, consider taking a short road trip to see some of these sights.
South: New Glarus. Located a short 45-minute drive south from Madison is the historic Swiss community of New Glarus. A fun way to see the rolling countryside that is truly Wisconsin.
West: Spring Green. Located a short hour drive west from Madison is the beautiful community of Spring Green. Spring Green is home to the American Players Theater (APT). Situated on 110 acres of hilly woods and meadows, APT has two theaters, the newly renovated 1,088-seat outdoor amphitheater and the 201-seat indoor Touchstone Theatre. From June through November, APT produces nine plays in rotating repertory. With annual attendance of over 100,000 and an annual budget in excess of $6 million, APT ranks as the country’s second largest outdoor theater devoted to the classics.
Spring Green is also home to Taliesin, acknowledged as the embodiment of American architect Frank Lloyd Wright’s commitment to creating exceptional environments that harmonize architecture, art, culture, and the land. The year 2017 marks 150 years since Wright’s birth, and all Taliesin Preservation public tours will be offered at half price in honor of his 150th. This includes everything from the one-hour tour of the Hillside Home School to the two-hour tour of the Taliesin residence to the four-hour tour of the entire Taliesin estate.
East: Milwaukee. Located just 90 minutes east of Madison, Milwaukee is Wisconsin’s largest city. From history museums, to shops, to restaurants, to beautiful lakeside views of Lake Michigan, Milwaukee is a fun stop for Wisconsin visitors.
North: Wisconsin Dells. Located one hour north of Madison, Wisconsin Dells is the waterpark capital of the world. Multiple venues feature both indoor and outdoor recreation opportunities for the whole family.
Last December, the Executive Committee asked for your help in identifying ways to improve NLPES’ newsletter, The Working Paper. Out of 437 NLPES members on the listserv, we received 126 responses to the online survey; thank you to everyone who took time to provide input. We are happy to report that for most members, the newsletter is successful at meeting their interests. Members also reported clear interest in three new columns, which are making their debut in this edition: Ask the Expert, Fun Field Work and New Technologies. Finally, although members are generally satisfied with the newsletter’s design and format, they did not indicate a clear preference regarding article length, so we will continue to include both longer, more detailed articles as well as shorter ones. We are still considering other changes to the newsletter based on your feedback and will continue to keep everyone updated.
“The vex'd elm-heads are pale with the view, Of a mastering heaven utterly blue;
Swoll'n is the wind that in argent billows, Rolls across the labouring willows;
The chestnut-fans are loosely flirting, And bared is the aspen's silky skirting;
The sapphire pools are smit with white, And silver-shot with gusty light;
While the breeze by rank and measure, Paves the clouds on the swept azure.”
– Gerard Manley Hopkins, A Windy Day in Summer
Good day, fellow seekers. Summer is, indeed, upon us, and we shall trek across the country in search of the steamy prose, otherwise known as program evaluations. Featured in this edition are reports on higher education, the film industry and economic development.
First out of the gate is a report from the Golden State’s State Auditor on the University of California, Office of the President’s budget and staffing processes. Founded in 1868 as a land-grant institution, the University of California now has 10 campuses, five medical centers, and the system’s headquarters.
The report found the Office of the President had amassed more than $175 million in secret reserves. In fact, the reserves were not disclosed to the system’s Board of Regents. Further, the president’s office created a hidden budget to spend the money.
The Office of the President levies an annual assessment on all campuses to fund its operations and system-wide initiatives. Even though one-third of the office’s secret reserve fund came from unspent funds provided by the annual assessment, the office increased the assessment twice in four years.
If you are anything like your humble scribe, you may want to know about the systemwide initiatives. Just don’t ask the University of California System president. According to the report, the office was unable to provide a “complete listing of the initiatives it administers or their costs.” The initiatives were not often evaluated for their effectiveness. Read the report.
Over in the Centennial State, our brothers and sisters in the great state of Colorado, Office of the State Auditor, released an important report on their state’s Office of Film, Television, and Media. Created in 2012 to “expand and revitalize the film industry in Colorado,” the Film Office paid $10.6 million in incentives from 2013 through 2016. Productions included seven feature films, 13 television shows, six commercials, and one video game.
Unfortunately for the Film Office, the report found a lack of “complete and accurate information to assess and report on the effectiveness of its operations.” The findings only get worse from there. According to the report, the Film Office paid about $1.9 million in incentives for projects in their sample even though none of the projects met all of the requirements. Of the $1.9 million, $129,000 was paid out for projects that flat out didn’t qualify and another $1.8 million for projects that lacked documentation to determine if they qualified.
It gets better. The Colorado Film Office paid out $1.9 million in incentives for projects that lacked a contract when the project began. Of that, $1.3 million was for projects that never had a contract or purchase order. According to the Coloradoan, the report has forced the Colorado Office of Economic Development and International Trade, which oversees the film office, to “require advance documentation from companies before paying incentives, ensure that projects are completed in a timely manner; and strengthen its collection of jobs and tax revenue data.” Read the report.
Finally, dear reader, we bring to you an interesting report from the Badger State and Wisconsin’s Legislative Audit Bureau on the Wisconsin Economic Development Corporation (WEDC). Released in May of this year, the report analyzes WEDC’s administration and oversight of its programs and results achieved.
The WEDC is Wisconsin’s lead economic development entity. The corporation is governed by a 14-member board and receives most of its funds from the state, although it is not a state agency. In fiscal year 2015-16, WEDC administered 34 economic development programs that provided grants, loans, tax credits, and other assistance. The Wisconsin statute creating WEDC requires the Legislative Audit Bureau to conduct biennial financial and program evaluation audits of the corporation.
The Legislative Audit Bureau found that WEDC didn’t require grant and loan recipients to submit information in order to determine how many jobs were created. That lack of verification meant that WEDC was not in compliance with Wisconsin statutes that require annual verification of jobs-related information. In the end, it’s results that matter. Unfortunately, the audit found that WEDC’s online data was inaccurate. For example, the data included 1,265 jobs that were associated with recipients that sold their Wisconsin operations, terminated their operations, or withdrew from their contract. At least 699 jobs were counted twice in the online information.
Wisconsin’s Legislative Audit Bureau recommended that WEDC improve its administration of grant, loan, and tax credit programs; its oversight of economic development programs; and its financial management. Read the report.
If there’s a report you feel merits inclusion in the next edition of Report Radar, please feel free to send it my way. Until then, have a great summer and don’t forget to use sunblock!
Chris Latta is a project manager with Pennsylvania’s Legislative Budget and Finance Committee. He can be reached at firstname.lastname@example.org.
This column is a new addition to the newsletter as a way for our members to reach out to senior level professionals in the performance audit/program evaluation community and ask questions. Our expert for this issue is Rakesh Mohan, director of Idaho’s Office of Performance Evaluations (OPE). Rakesh’s response was excerpted from articles he wrote for blogs run by Ann K. Emery and Stephanie Evergreen.
Q. As evaluators, who are your target audiences?
A. OPE targets seven different audiences. In addition to the oversight committee, OPE’s audience has to include the public and the press in order to achieve the office’s mission to promote confidence and accountability in state government. To produce useful evaluations, we have to think about the various stakeholders, such as policymakers, agency and program officials, people who are directly affected by the program or the policy, and lobbyists representing different interest groups who have a stake in the evaluation. There is one more, a kind of latent audience: our evaluation colleagues and peers. We always want to know what they think about our work, because this helps us gauge our professional credibility.
Q. Some audiences may not have enough time to read your evaluations critically. How do you reconcile writing in-depth, impactful reports, while keeping them accessible enough for your audiences to understand easily?
A. OPE’s audiences have varying levels of interest in our evaluations depending on their role and stakes in the evaluation. Not everyone is interested in technical details, although those details are the foundation of our evaluation work. In fact, five of the seven audiences identified are interested in non-technical information.
To meet these varied needs, OPE prepares different products to disseminate evaluation results. Based on experience, the more technical the product is, the less reach it has in terms of the number of people it connects with. For example, technical appendices of an evaluation report will be of interest to only a small number of people compared to the press release that is distributed to all media outlets.
Q. What other methods do you employ to help deliver complex information to your target audiences?
A. We have added three other methods to extend our reach to a larger audience and to effectively communicate our evaluation message.
For more great insight, see “Rakesh Mohan on Getting the Attention of Your Audiences” and “Embracing Data Visualization in Evaluation: A Management Perspective”.
Do you have a question you would like addressed by an expert? Let us know by sending questions to email@example.com.
This column is a new addition to the newsletter designed to highlight fun or unique fieldwork opportunities performance audit/program evaluation offices sometimes participate in to help understand agency programs and operations.
Everyone knows everything's bigger in Texas. But in case you haven't heard, in the Lone Star State, there's nothing bigger than high school football.
So, when the Texas Legislature charged the Sunset Commission with reviewing the state's interscholastic activities association—the University Interscholastic League, or UIL—it was clear the state high school football championships would be the event the review team would need to attend. And not just because the event was held at AT&T Stadium (a.k.a. Cowboys Stadium or Jerry’s World) and the review team members happen to be Dallas Cowboys fans, but because it brought together all the UIL stakeholders under one roof and provided the best opportunity to observe how UIL runs its events.
Although UIL administers various athletic, academic, and music contests for primary and secondary students, the signature event is a series of 10 high school football championships held at a single venue over a weekend in December. The event had recently become a source of controversy because it had been held at AT&T Stadium in Arlington for several years and other cities were pining to host the event, which draws crowds of over 200,000 for the weekend, contributes to the economic development of the host city, and gives local teams a home field advantage. UIL maintained no other stadium had adequate capacity, facilities, and amenities needed to host the championship games. So, as part of its field work, the review team got a backstage look at the stadium and its facilities, shadowed and interviewed UIL staff as they coordinated all aspects of the event, and met with numerous parents, coaches, athletic directors, and superintendents.
The fieldwork proved vital to the review team’s understanding of and ability to answer questions about the championships, and how UIL manages the event and selects the venue. For example, the importance of a facility having certain amenities, like four locker rooms to handle so many players, became apparent as teams rotated in and out. Perhaps the most unexpected experience the review team had was the endless number of people commenting on the intangibles associated with the stadium. After all, getting to play in a championship game in the same stadium as “America’s Team” is the highlight of every football player’s high school career.
Ultimately, the review team was satisfied with the method by which UIL chose the venue for the state football championships and, therefore, made no recommendations related to the location and selection process — a decision based, in part, on observing the inner workings of the event. But the additional attention from Sunset did contribute to UIL recognizing the value of rotating this and other UIL event venues around the state.
Emily Johnson is a policy analyst with the Texas Sunset Advisory Commission. She can be reached at Emily.Johnson@texas.sunset.gov. If you have a fieldwork experience your office would like to highlight, please email submissions to firstname.lastname@example.org.
This column is a new addition to the newsletter featuring product trends that can be useful in improving our products and productivity.
Qualitative Data Software Can Make Audit Work More Efficient and Productive | Josh Rueschhoff (Kansas)
Performance auditors regularly deal with qualitative data. From laws and regulations to interview notes and open-ended survey responses, the amount of written data gathered for audits can be daunting and presents challenges for analysis and documentation. Kansas Legislative Division of Post Audit deals with these same challenges and recently started using qualitative data analysis (QDA) software to address them.
QDA software is designed to deal with unstructured, qualitative data. Because of the nature of this data, QDA software typically works with multiple media types, such as typed interview notes or video files, and has tools that make coding for content analysis easy. During coding, individual pieces of data, such as a sentence, a piece of an image, or a phrase from an interview, are gathered together in an organized fashion around codes, the thematic ideas found in the data. Organizing coding can be difficult from an operational perspective. Previously, auditors likely used highlighting, document hyperlinks, spreadsheets, or another method to document coding and build evidence. These methods can be time consuming and introduce greater risk of error as spreadsheets grow and more links are made.
The QDA software that we have started using to tackle this challenge is NVivo from QSR International. The first performance audit we used this software on examined federal funds used by school districts in Kansas. We started with NVivo 11 Pro to deal with large volumes of open-ended survey comments given by teachers and school administrators. We read the comments and found that federal funds were used to provide many different services like school lunches and reading interventions. These responses were coded by hand to a ‘Services’ code. We broke this code down into sub-codes such as Title I services, lunches, and other concepts; then, we queried our coding to see how many respondents mentioned a specific service. Meanwhile, NVivo automatically created reference links to the comments in the dataset.
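For readers curious what this coding-and-query workflow looks like mechanically, here is a rough, hypothetical sketch in plain Python. NVivo itself is a GUI application and does far more; the comments, keywords, and code names below are all invented for illustration only.

```python
# Example survey comments (invented for illustration).
comments = [
    "Title I funds pay for our reading interventionist.",
    "Federal money covers free school lunches for many students.",
    "We used the funds for reading support and lunches.",
    "The grant mostly went to administrative costs.",
]

# Hypothetical sub-codes under a 'Services' code, each defined by keywords.
services_subcodes = {
    "Title I services": ["title i", "reading"],
    "Lunches": ["lunch"],
}

# Coding: attach each comment's index (a crude "reference link") to every
# sub-code whose keywords appear in the comment text.
coding = {subcode: [] for subcode in services_subcodes}
for i, comment in enumerate(comments):
    text = comment.lower()
    for subcode, keywords in services_subcodes.items():
        if any(k in text for k in keywords):
            coding[subcode].append(i)

# Query: how many respondents mentioned each service?
for subcode, refs in coding.items():
    print(f"{subcode}: {len(refs)} respondent(s), comments {refs}")
```

This keyword matching is far cruder than hand coding or NVivo's tools, but it shows the underlying idea: codes organize references back into the raw data, and queries then count or retrieve those references.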
We later upgraded to NVivo 11 Plus after learning more about its additional automated coding features for handling large volumes of qualitative data. One feature creates codes based on themes running through written data. Another analyzes written responses to determine whether a respondent’s sentiments were positive or negative. An interesting feature, though still experimental, allows an auditor to code a subset of open-ended survey responses and have the program finish the rest based on the coding patterns the auditor provided. A feature I find particularly useful, available in both the Pro and Plus editions, imports interview notes and automatically codes responses based on the structure of the interview document. An auditor can then compare all the respondents’ comments side by side instead of copying responses from separate documents or opening multiple documents for comparison. While they don’t eliminate the need to review coding, these tools are terrific for providing a quick analysis of large datasets for an auditor to later refine. It is easy to remove incorrectly coded references or delete irrelevant categories using simple drag-and-drop procedures.
With automated features and hands-on tools, NVivo works well with small groups of documents as well as large datasets of open-ended responses; and these are only a few of its features. NVivo can also code other data types like video and audio files, build summary matrices, and create charts among many other features. At first, I did not know what to expect, but as I’ve worked with the program, I have found it to be useful in several settings. By continuing to work with NVivo and applying some resourcefulness and creativity, I can see it providing richer analysis while also speeding up documentation.
Josh Rueschhoff is an Associate Auditor with the Kansas Legislative Division of Post Audit. He can be reached at Josh.Rueschhoff@lpa.ks.gov.
This profile of the 2017 PDS host state was included in the Fall 2016 edition of The Working Paper. In anticipation of the upcoming PDS, excerpts are reprinted below. The full article can be viewed here.
About Us: The Legislative Audit Bureau is a strictly nonpartisan legislative service agency created to assist the Legislature in maintaining effective oversight of state operations. The bureau’s 87 staff conduct objective financial audits and performance evaluations of state agency operations to ensure financial transactions have been made in a legal and proper manner and to determine whether programs are administered effectively, efficiently, and in accordance with the policies of the Legislature and the Governor. The results of the bureau’s work, including recommendations for improvements in agency operations, are addressed to the 10-member Joint Legislative Audit Committee and provided to the 132 members of the Legislature.
The head of the bureau is the Wisconsin state auditor, who is appointed by and serves at the pleasure of the Joint Committee on Legislative Organization. The current state auditor, Joe Chrisman, is only the fourth individual to serve in this capacity in the bureau’s 50-year history.
The bureau has recently received Certificate of Impact awards from NLPES for its evaluations of the Government Accountability Board, Supervised Release Program Expenditures, and the Wisconsin Economic Development Corporation.
New Initiatives: In 2016, the Bureau celebrated its 50th year of service to the Wisconsin Legislature by hosting a two-day professional development conference for all staff (at which Brenda Erickson graciously shared her NCSL wisdom), launching a newly designed website, joining the Twitterverse (@WILEGAUDIT) to announce report releases, introducing a new one-page Briefing Sheet navigational tool to accompany each report, and converting to electronic publication of all reports.
We look forward to welcoming you all to Wisconsin in September 2017! Come and learn along with us!
Joe Chrisman is Wisconsin’s State Auditor. He can be reached at email@example.com. Photo: Wisconsin Legislative Audit Bureau, 50th Anniversary.
This article is intended as a very brief, necessarily allusive guide to a class of statistical methods: permutation tests. While the theory behind permutation testing is old (see Fisher, R. A. (1935). The Design of Experiments. Oliver and Boyd, Edinburgh), in practice this class of tests has seen much less use than the parametric methods we probably recognize from stats class. I’m going to argue here that permutation methods are more apt than many parametric tests for the purposes we often face in public policy evaluation, and are quite easily performed and interpreted.
So let’s start with some background. Permutation tests are a class of methods for hypothesis testing which construct, rather than assuming, their null distribution. The theory behind them is extremely intuitive, but perhaps most easily illustrated with a preliminary contrast.
Parametric tests—such as the t-test—work because they compare an observed result to a theoretical probability distribution that we believe obtains under the null hypothesis. We ask how probable the observed result is, assuming a random draw from that distribution, and that gives us the classic p-value—interpreted as the probability that a random sample from the null distribution produces a result of equal or greater extremity than the observed result.
The question, of course, is why we believe our theoretical distribution obtains. In cases of true random sampling, the question is easily answered: thanks to the central limit theorem, the mean of a sufficiently large random sample will be approximately normally distributed, regardless of the distribution governing the original population.
So much for what I consider to be the status quo. But now consider: Much of what we find ourselves doing as public policy analysts renders both the above methods and the answers they give of dubious utility.
Let’s imagine the sort of assignment that happens often in my agency. We’ve been tasked with a summative assessment of a public program – let’s say this program is designed to improve dental hygiene outcomes in its participants, as measured by scores on a five-point ratio-scale dental evaluation. The program is finished, and we have data on all its participants and an equivalent control group; our task is not causal generalization about the program in the abstract, but evaluation of the specific instantiation of the program at a single place and time. We might just straightforwardly compare mean dental evaluation scores between the program and control groups. But of course, we want to know not just whether the groups are different, but whether the difference is mere noise or a real effect of the program.
We might think this is a classic job for a t-test, but there are problems with that strategy in this case. Notably, the assumptions of the t-test might not be met. (Note: Obviously this is a problem with any parametric test, and it may not even be a serious problem with a robust one. It’s worth noting, though, that permutation tests can be used in the place of tests far less robust than the t-test, in which cases the argument for using them would be even stronger.) In the dental hygiene case, the sample size is dictated by the environment, not the researcher, and might provide too much or too little statistical power. And, though it’s perhaps not as obvious, the inference strictly licensed by the t-test isn’t relevant to the case at hand. (Note: It’s a non sequitur in this context to discuss the odds of a random sample resulting in the observed difference of means, since no random sample was ever taken. Said differently, in the case described, we’re not generalizing from a sample to a population; we’re talking about the behavior of the actual members of the population itself.)
Let’s further assume for the sake of the example that our dental hygiene program was a small pilot, with only two completers, and we have three people in our control group. Under these conditions, doing a t-test starts to look extremely inappropriate: We have an underpowered test, thanks to our tiny sample, with dubious assumptions we can’t even effectively check (again, thanks to sample size and lack of randomization), delivering an inference that doesn’t meaningfully apply to the case at hand!
So let’s try something different for our dental hygiene evaluation. First, an observation: Under the null hypothesis, the scores of the treatment and control groups may be thought of as assigned at random, by definition. As such, we can construct a null distribution by actually performing this random reassignment of values. The number of random reassignments is equal to the factorial of the number of observed values; in the case of our small dental-hygiene sample, there are 5! or 120 such reassignments (including the actual observed case).
For simplicity let’s assume that our experimental group had scores of 5 and 3 on the dental-hygiene evaluation; our control group had scores of 1, 2, and 4. As such, the experimental group has mean 4, and the control mean 2.33, for a difference in means of 1.67 units.
When we randomly reassign scores as described above, we find that there are 24 / 120 cases in which the difference in means is equal to or greater than the observed difference of 1.67 units. In other words, if we were randomly assigning observed scores to participants, we’d equal or exceed the observed difference 20 percent of the time. This is an easy calculation, but we’ve just provided an English-language description of the results of a one-tailed significance test! This procedure is just a permutation test of means.
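The enumeration just described is easy to carry out by machine. Here is a minimal Python sketch of the exact test on the example data; the variable names are illustrative, and note that enumerating the 10 distinct treatment/control groupings is equivalent to enumerating all 120 orderings, since each grouping corresponds to 2! × 3! = 12 orderings.

```python
from itertools import combinations

# Observed dental-hygiene scores from the example in the text.
treatment = [5, 3]          # program completers
control = [1, 2, 4]         # control group
scores = treatment + control

observed_diff = sum(treatment) / len(treatment) - sum(control) / len(control)

# Relabel the five observed scores in every possible way: choose which
# two play the role of the treatment group; the other three are control.
count_extreme = 0
total = 0
for idx in combinations(range(len(scores)), len(treatment)):
    t = [scores[i] for i in idx]
    c = [scores[i] for i in range(len(scores)) if i not in idx]
    total += 1
    if sum(t) / len(t) - sum(c) / len(c) >= observed_diff:
        count_extreme += 1    # one-tailed: as extreme or more so

p_value = count_extreme / total
print(f"{count_extreme}/{total} relabelings; one-tailed p = {p_value:.2f}")
# Prints: 2/10 relabelings; one-tailed p = 0.20
# — the same 20 percent as the 24/120 orderings described in the text.
```

The fraction of relabelings at least as extreme as the observed difference is the exact one-tailed p-value.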
Notice what we’ve just achieved: A nonparametric, exact test of the null hypothesis using a sample size of five, providing relevant inferences from nonrandom samples or even entire populations. There’s much more to be said about the applicability of permutation tests to situations more complicated than the cartoon I’ve sketched here; for any standard statistical test, there is a permutation version.
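With more than a handful of observations, full enumeration quickly becomes infeasible (the count of relabelings grows combinatorially), and the standard remedy is to approximate the permutation distribution by sampling random relabelings. The sketch below is a hypothetical illustration of that approach for a difference in means; the function name and the +1 adjustment (a common finite-sample correction) are my choices, not anything prescribed by the article.

```python
import random

def permutation_p_value(treatment, control, n_resamples=10_000, seed=0):
    """Approximate a one-tailed permutation p-value for a difference in
    means by sampling random relabelings of the pooled scores."""
    rng = random.Random(seed)
    pooled = list(treatment) + list(control)
    k = len(treatment)
    observed = sum(treatment) / k - sum(control) / len(control)
    extreme = 0
    for _ in range(n_resamples):
        rng.shuffle(pooled)                  # random relabeling
        t, c = pooled[:k], pooled[k:]
        if sum(t) / k - sum(c) / len(c) >= observed:
            extreme += 1
    # The +1 terms count the observed labeling itself and keep the
    # estimate a valid p-value even when extreme == 0.
    return (extreme + 1) / (n_resamples + 1)
```

On the tiny dental-hygiene example, `permutation_p_value([5, 3], [1, 2, 4])` lands close to the exact 0.20; the approximation earns its keep when the groups are large enough that enumeration is out of reach.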
But for the time being, let’s just say this: Given that public policy research frequently presents us with nonrandom or odd-sized samples, whole populations, and unknown distributions, and is often more concerned with evaluation than causal generalization, permutation tests deserve a second look as being particularly well-suited to the task.
Kirby Arinder, Ph.D., serves as research methodologist for the Mississippi Joint Legislative PEER Committee. He can be reached at firstname.lastname@example.org.
Congratulations to the 2017 NLPES Award recipients! NLPES awards recognize exceptional performance among our offices. This year’s award winners are:
For a complete list of award winners and award winning reports, visit the NLPES awards webpage.
Special thanks to the Awards Subcommittee (Jon Courtney, Melinda Hamilton, and Marcia Lindsey) and all the judges for your hard work during this awards season!
North Carolina’s Program Evaluation Division is pleased to announce the addition of two new evaluators.
Georgia’s Performance Audit Division welcomes new management analyst, Jonathan Wilson. Jonathan is a recent graduate of the University of Georgia where he earned a B.A. in Political Science and Spanish and an MPA. We look forward to his contribution!
Please let us know if you have staff happenings to share! E-mail email@example.com.
Even before the 2017 summer pool season was in full swing, our member offices had already made a media splash. Both print and video media have helped spread the word about the results our offices reported. Check out some of the coverage linked below from October 2016 through mid-May 2017. (An aside: although I typically avoid including links to subscription sites or sites requiring a survey response, a few slipped through this time.)
And on a personal note … my upcoming retirement before our next newsletter is due means that this will be my last time authoring “Stop the Presses!” If you would like to assume the byline, please let Shunti Taylor know; her email address is at the bottom of the article. I’ve had great fun writing this article for The Working Paper since the Fall 2015 edition. The wide variety of topics on which our member offices reported and the work our offices accomplished was amazing. Our member offices’ effort to effect positive change in public agencies is, as it ever was, commendable. Adieu!
Share your coverage with us! If you would like us to highlight media attention about your reports in our next newsletter, send the hyperlinks to firstname.lastname@example.org.
Also, we hope you like the new format for “Stop the Presses”. It’s more concise, which means you can find the links you’re interested in more quickly, and it allows us to include links to more media coverage than we could under the old format. If you have an opinion about either the new or the old format, let us know!
The Professional Development Subcommittee of NLPES provides you with opportunities to enhance your audit, evaluation, and management skills through:
Professional Development Resources Page
The Professional Development Resources page provides you with access to archived PowerPoint presentations and webinars on the following topics relevant to our work as legislative program evaluators and performance auditors: Planning and Scoping, Fieldwork, Writing and Publication, Management, and Evidence-Based Policies and Programs. The Professional Development Subcommittee reviews this page annually to ensure that the materials presented remain useful and relevant.
The Subcommittee continuously strives to find additional training materials for our members. While planning and scoping is a critical component of our work, there is only one presentation available for this topic. If anyone has created a PowerPoint focusing on the steps involved in background research and scoping and their importance to the development of a project plan, or other planning-related presentations, consider submitting your material to Linda Triplett, chair of the subcommittee, for possible inclusion in the Planning and Scoping section of the Professional Development Resources page.
The development and presentation of webinars are made possible through generous funding and technical support made available through NCSL’s E-Learning Project. Since the last issue of The Working Paper, the Professional Development Subcommittee has developed and presented two webinars, both of which are available on the NLPES Professional Development Resources page.
In December of 2016, Katrin Osterhaus, IT Audit Manager for the Kansas Legislative Division of Post Audit, moderated a panel on “Confidential Data Access Issues for Auditors and Evaluators.” During this webinar, three panelists shared their experiences, issues, and solutions for accessing sensitive or confidential data: Ted Booth, General Counsel, Mississippi Joint Legislative Performance Evaluation and Expenditure Review Committee; Dr. Jon Courtney, Program Evaluator Manager, New Mexico Legislative Finance Committee; and Justin Stowe, Deputy Post Auditor, Kansas Legislative Division of Post Audit.
In May of 2017, Linda Triplett, Director of the Mississippi PEER Committee’s Performance Accountability Office, moderated a panel on “Outstanding Research Methods,” highlighting the methods used in the two reports that won NLPES Excellence in Research Methods Awards in 2016:
The following panelists explained their methods through PowerPoint presentations and answered questions from the viewing audience: Edward Seyler, Senior Research Methodologist, and Christina Wilson, Senior Auditor, from the Louisiana Legislative Auditor; and Christopher Harless, Performance Audit Supervisor, and Torry van Slyke, Legislative Audit Supervisor, from the Colorado Office of the State Auditor.
During the webinar, the panelists noted that their award-winning methods have a broader range of applicability than the specific topics discussed during their presentations. For example, Ed Seyler noted that the method that they used to simulate the effect of changes in prize payout percentages on lottery ticket sales could be applied to estimate how a change in policy will affect outcomes in a variety of settings, including economic, crime, health care, education, and environmental programs.
Following an office-wide viewing of the Outstanding Research Methods webinar, the Wisconsin Legislative Audit Bureau reported that they had a lively discussion of the methods presented. This is an excellent example of maximizing the training value of an NLPES webinar.
Professional Development Seminar
The Professional Development Subcommittee, with input from the full Executive Committee, helps the host state shape the content of our annual seminars by providing suggestions, feedback on program ideas, and lessons learned from prior events. Thanks to the outstanding initiative of recent host states in developing their own PDS themes and program content, they have required minimal assistance from the Professional Development Subcommittee. This year’s PDS in Madison, Wisconsin, will again offer a variety of exciting training opportunities. More details are available here.
If your office has other training presentations that you think would benefit our membership, please consider sharing them with the Professional Development Subcommittee to review and possibly include in our Resources page. Also, the Professional Development Subcommittee welcomes your suggestions for future webinars or podcasts. Please forward all inquiries or suggestions to Linda Triplett.
The Executive Committee met on April 22, 2017, in Madison, WI. The meeting took place at the Madison Concourse Hotel, the location of the 2017 NLPES Professional Development Seminar. The meeting covered elections, bylaws, subcommittee work, and other business. In addition, Joe Chrisman (WI State Auditor) and Dean Swenson provided the Executive Committee an update on the 2017 PDS. The meeting minutes have been approved and are available here.
NLPES website—Spend a few moments touring our NLPES website to learn more about NLPES and see what we do. You’ll find general information about NLPES, including our by-laws, executive committee membership and subcommittees, state contacts, awards, and information on peer reviews. We also have a training library and resources including past meeting minutes, newsletters, and more. Check it out!
NLPES listserv—The NLPES listserv is an email discussion group for NLPES members. By sending a message to email@example.com, you can reach all listserv subscribers simultaneously. Listserv members can query other states about evaluation work similar to their own projects, receive announcements about performance evaluation reports and job opportunities from other states, and are notified when the latest edition of this newsletter is available! To join the listserv, send an email to Brenda Erickson, NCSL liaison to NLPES, with the subject “SUBSCRIBE to NLPES Listserv.” Include your name, job title, audit agency/organization name, mailing address (including city, state, zip code), phone number and email address. A “Welcome” message will be sent to you once you are successfully added to the listserv. See the listserv link on the NLPES website for additional information on how to post messages and “netiquette” niceties, like not hitting “Reply All” when answering a query posted to the listserv.
Are you receiving our listserv emails? Some states’ systems block NLPES listserv emails. If you think you are not receiving our emails, please check your state’s security system and spam filters, and/or contact Brenda Erickson.
Legislative careers website—Know someone thinking about pursuing a career with a state legislature? Point them to the opportunities posted on NCSL’s legislative careers website. Job seekers can explore the types of work legislative staffers perform, including performance evaluation, budgeting, fiscal analysis, legal and policy research and opinions, bill drafting, public relations, library services, building security, and information technology support. Opportunities are posted by states offering positions under Legislative Jobs. Attracting young people to work as legislative staff will be increasingly important in the coming years. Even though baby boomers make up about a third of the national workforce, nearly half of legislative staff responding to a survey were 50 years old or older. Replacing those staffers will present challenges. Check out the welcome video, “A Day at the Capitol,” and learn more about the opportunities and rewards of working for state legislatures. Watch the videos under Career Paths to hear from staffers in different states describing their jobs and the satisfaction they gain from a public service career.
NLPES’ Professional Development Resources—Visit our NLPES online training library for a variety of refresher and training materials! There are nearly two dozen resources on planning and scoping, fieldwork, writing and publication, and management topics. Most are PowerPoint slides; some are narrated; a few are webinars or podcasts. Check them out.
JLARC presentation on web reporting—In 2014, the State of Washington’s Joint Legislative Audit and Review Committee (JLARC) began issuing audit reports as web pages rather than PDF documents. Reports are now readily accessible on computers, mobile devices, and tablets. JLARC has received favorable comments on the change from legislators, legislative and executive branch staff, and the public. Writing for the web requires a change in perspective, in addition to changes in technology. Visit JLARC’s website to see samples of their web reports. Contact JLARC Audit Coordinator Valerie Whitener with questions.
Ask GAO Live—AskGAOLive is a 30-minute interface where GAO staff chat about a specific report and research, and answer questions that are emailed or tweeted in. Sessions are recorded and archived on the website. You can also “follow” GAOLive to receive advance notice of chat sessions. Topics include veterans and higher education, prescription drug shortages, prison overcrowding, state and local fiscal outlook, and government contracting.
Ensuring the Public Trust—What’s the most common internal performance measure for evaluation shops? How many offices tweet? What percentage of staff has fewer than 10 years of experience? How can you contact a sister office in another state? Ensuring the Public Trust summarizes information about legislative offices conducting program evaluations, policy analyses, and performance audits across the country.
The Working Paper is published two times a year by the National Legislative Program Evaluation Society, a staff section of the National Conference of State Legislatures. NLPES serves the professionals of state legislative agencies engaged in government program evaluation. The purposes of NLPES are to promote the art and science of legislative program evaluation; to enhance professionalism and training in legislative program evaluation; and to promote the exchange of ideas and information about legislative program evaluation.
The Working Paper is produced by the NLPES Communications Subcommittee.
Patricia Berger, 2016-2017 chair (PA)
Shunti Taylor, newsletter editor (GA)
Emily Johnson (TX)
NCSL Liaison to NLPES
Brenda Erickson, 303-856-1391
NCSL Denver Office, 303-364-7700