Chair's Corner | Jon Courtney (New Mexico)
Like many of you…I grew up wanting to be a legislative program evaluator!
Those of you who attended this year's NLPES awards luncheon already heard me tell that joke, but it illustrates an interesting point: most of us found this profession well into our careers. For me, it was when two evaluators from the unit I now lead interviewed me for an evaluation while I was working at New Mexico's child welfare agency. I learned that their job was to conduct research projects (program evaluations) with the goal of improving the efficiency and effectiveness of government, and I was hooked. I was in my early 30s at the time and would spend the next decade (and counting) as a program evaluator. I am guessing that many of your evaluation/audit careers have similar origin stories. Before program evaluation, I worked as a college instructor, a research scientist and a statistician. Program evaluation wrapped all of these things into one, and I love it. And the bonus is that what we do matters!
If you are anything like me, you are finding this field to be challenging, rewarding and, most importantly, impactful. That's part of the reason we love it so much. The work you do impacts thousands, if not millions, of lives. Think about that. Why wouldn't we grow up wanting to do this?
As a field, we are growing.
And more people are joining our field. Our annual professional development seminar (PDS) this year had the highest participation of any in recent history, with 199 attendees from 37 states. I think we can attribute part of this resurgence in attendance to a strengthening economy (I didn't plot 2009 because the PDS was canceled that year due to the widespread economic downturn), but also to new offices coming online, including two new ones: the Oklahoma Legislative Office of Fiscal Transparency and the Maryland Office of Program Evaluation and Government Accountability. Our NLPES membership currently stands at 733 strong, and I expect this number will continue to grow in the coming years.
However, even though we are growing, sometimes I feel like our profession is one of the best-kept secrets out there. So, is it a reasonable expectation that the next generation will grow up wanting to do this for a living? Maybe, but probably not. I do think it is reasonable, though, to expect that we can get to the point where we are sharing our good work in new ways and attracting members who set out to be evaluators/auditors back in college. This is just one of many ways we can advance our field.
I am asking that we all examine how we can work toward advancing our field.
And that's not just because it is the first purpose listed in the NLPES bylaws. Now, what does advancing the field mean for NLPES and the executive committee, beyond growth in membership? The executive committee is moving toward advancing the field in a number of ways:
- Becoming more data-driven by using performance measures to inform activities (e.g., website hits, newsletter hits, outreach calls made).
- Continuing to find ways to engage with you to share opportunities we are offering, such as the PDS, webinars and the peer-review process.
- Sharing stories about successfully advancing the field in our home offices, and looking to engage partners that bring new opportunities to the table.
We are already making good progress on these activities. Our NCSL hero Brenda Erickson (you may know her from various emails or from the registration desk at the PDS) has helped us secure a number of data points, and subcommittee chairs are developing performance measures that will better inform executive committee activities. We are also finalizing work on an NLPES Report Library, which should be an excellent resource for those starting new projects. As for advancing the field at the office level, I have been thinking about stories from my day-to-day work where we are seeing the field advance locally. To that end, please see the column on how New Mexico's Legislative Finance Committee has been working with the University of New Mexico's Master of Public Policy program to strengthen both their capacity and ours, and to help build a pipeline of students interested in working in public policy in our state.
I am also asking that each of you continue to get involved.
Our PDS numbers are up and that is superb, but more than three-fourths of you did not attend the PDS last year, and the PDS is only one way to share our work and learn from each other. I'm asking you to think about what you are doing in your office and what opportunities there might be to help us move our field forward. How are you working with local universities to improve your operations? How are you sharing the good work happening in your office with others (is anyone publishing the results of studies in peer-reviewed journals, and if so, are you finding that to be a worthwhile endeavor)? How have you partnered with other entities to improve your methods? Consider sharing those experiences on the listserv, in a webinar or even in this newsletter (our outstanding newsletter lead, Emily Johnson, would greatly appreciate the content). Are there other opportunities to network in your state?
By the way, I actually grew up wanting to be a criminal prosecutor. I think I got lucky!
Jon Courtney is the 2019–2020 NLPES Executive Committee chair. He can be reached by email or 505-986-4539.
Fun Fieldwork
The following does not fit our typical Fun Fieldwork column, but does showcase the lighter side of our work. If you have a (more traditional) fieldwork experience your office would like to highlight in the future, please email submissions to Emily Johnson.
The Trials and Tribulations of an Olympic Bobsled Run
World-class athletes spend their whole lives preparing for a few minutes of Olympic glory. In my case, I won my Olympic bid by having my name pulled out of a bowl at the New Orleans NLPES Professional Development Seminar (PDS), spent the next year not preparing at all, and then found myself atop the bobsled run at Utah Olympic Park while attending the Park City PDS.
But maybe my evaluation work had been preparing me all along…
Like the planning and scoping phase of a project, our little team (Pat Berger from Pennsylvania's Legislative Budget and Finance Committee, Liz Thomas from Washington's Joint Legislative Audit and Review Committee, and me) looked at each other with fear and excitement as we launched into this new endeavor together. The only guidance we received was to sit up tall, hold our elbows out and relax our necks.
The ride was like the fieldwork phase. The first three turns were manageable, but then the fourth turn threw me for a loop. The g-forces made my head snap down, and it took all my effort to get it back upright. I didn't realize how hard I was going to have to work. On the next tough turn, my head hit the side of the bobsled, but I bounced back more quickly this time because I had learned and adjusted. At the end of the ride, the team was super excited we had succeeded: 71.6 mph, 0.8 miles, 1.06 minutes, 4 Gs. It's like when you finish the outline and think the hard work is behind you.
But then the writing phase hits you, or in my limited bobsled experience, the world would not stop spinning for 20 minutes. I ended up throwing up in the Olympic Park parking lot, which is not unlike how I feel when I have to present to our oversight committee. It was not fun, but I felt much better when it was over. The next morning, I woke up with bruises on my shoulders and arms. Like every evaluation, I was a little worse for the wear, but I had a great story to tell.
Kiernan McGorty is a principal program evaluator with the North Carolina Program Evaluation Division of the General Assembly. She can be reached by email or (919) 301-1393.
Research and Methodology
Defining a “Significantly Statistically Abnormal” Prescribing Pattern
In 2018, a new Tennessee law required the Comptroller's Office to complete a study of the state's opioid prescribers. Our office was specifically asked to determine the number of prescribers in Tennessee whose prescribing patterns were “significantly statistically abnormal” and to investigate what disciplinary actions, if any, the licensing boards took in response.
As we began work on the study, we took special note of the phrase “significantly statistically abnormal” because we weren't able to find a clear definition for it. This situation was both exciting and nerve-racking: it gave us considerable latitude in defining the term ourselves, but it also left room for error in interpreting legislative intent. We wanted to design a research methodology for identifying “significantly statistically abnormal” prescribing patterns that yielded the most accurate analysis while also aligning with what lawmakers were interested in finding out. In this particular case, the legislator responsible for the study was the speaker of the Senate, so it was pretty important to understand his intentions!
We sought clarity about what could be considered a “significantly statistically abnormal” prescribing pattern in many places, but the first clues came from the sections of state law surrounding the study request. The law required that, upon the completion of our study, the Tennessee Department of Health complete a similar study identifying “prescribers whose prescribing patterns… represent statistical outliers.” Based on this phrasing, we chose to define “significantly statistically abnormal” as statistical outliers.
Those of you with a statistics background, however, may know that the phrase “statistical outlier” isn't as well defined as one would hope. As we searched for the best way to measure statistical outliers, we discovered that every test had its merits and drawbacks, and that those in the statistics world love to debate the best methods. Of course, as is often the case in the social sciences, the only consistent answer was: you'll need to use your judgment.
In the end, all the searching led us to the age-old adage of knowing your audience. What did the audience (i.e., the speaker of the Senate and the Tennessee General Assembly) really want to know? It was time to ask.
As we discussed the study with the speaker, we realized there was a back story: A health-related board had made a disciplinary decision that the General Assembly, among others, believed to be too lenient. It turned out that identifying “significantly statistically abnormal” prescribing patterns was simply a means to an end. We were not to create a comprehensive list of all statistical outliers, but to see if prescribers who were obviously outside the norm had been disciplined, or even investigated.
With this realization in mind, we chose the Tukey box plot method and the adjusted box plot method as our statistical outlier tests. Though basic, these methods could be easily explained to our audience and provided a baseline for what could be considered “significantly statistically abnormal.” We had over 30,000 records, and for some metrics the outlier tests identified a few obvious outliers (as shown in the exhibit). For other metrics, though, we found hundreds of outliers. Our goal was to find the prescribers obviously outside the norm and to investigate, thoroughly, what actions were taken against those prescribers. We couldn't do that for hundreds of cases and had to narrow our focus. For these metrics, we decided to investigate only the most extreme outliers, using our judgment (following the advice of those statistics debaters) to select a cut-off that narrowed the field to under 20 prescribers per metric.
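For readers curious about the mechanics, here is a minimal sketch of the Tukey test in Python (the metric values and the extreme-outlier multiplier below are illustrative, not OREA's actual data or cut-offs):

```python
import numpy as np

def tukey_outliers(values, k=1.5):
    """Flag values outside Tukey's fences: [Q1 - k*IQR, Q3 + k*IQR].

    k=1.5 is Tukey's conventional multiplier; raising k (e.g., to 3.0)
    keeps only the more extreme outliers, which is one way to narrow
    hundreds of flagged cases down to a reviewable handful.
    """
    values = np.asarray(values, dtype=float)
    q1, q3 = np.percentile(values, [25, 75])
    iqr = q3 - q1
    return (values < q1 - k * iqr) | (values > q3 + k * iqr)

# Hypothetical per-prescriber metric; the last two values are obvious outliers.
metric = np.array([1.2, 1.4, 1.1, 1.3, 1.5, 1.2, 1.4, 9.8, 12.3])
print(np.flatnonzero(tukey_outliers(metric, k=3.0)))  # -> [7 8]
```

The adjusted box plot method works the same way but shifts the fences using the medcouple statistic so that the long tail of a skewed distribution is not over-flagged; the statsmodels Python package includes a medcouple implementation.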
Using this process, we selected 62 prescribers and investigated what disciplinary actions had been taken in response to the prescribing patterns we identified. Across the 62 cases, the disciplinary process and its results varied notably by prescriber type and prescribing pattern. We were able to identify gaps in the monitoring of certain prescribing patterns and provide the Tennessee General Assembly with an overview of how prescribing cases generally move through the disciplinary process.
As a numbers person, I wanted to find the most statistically sophisticated definition for “significantly statistically abnormal” prescribing patterns, but realized that the simple box plot methods were the right tools for this particular study. In the end, every prescriber we selected represented a significant deviation from the norm, and their cases provided us with the information necessary to identify areas of potential improvement to the prescriber discipline process in Tennessee.
To read the full report and one-page snapshot about this topic, click here.
Kristina Podesta is a legislative research analyst with the Tennessee Comptroller’s Office of Research and Education Accountability (OREA). She can be reached by email or 615-747-8795.
New Technologies
What do Charlie the Unicorn, Gangnam Style and JLARC Reports Have in Common?
The Washington State Joint Legislative Audit and Review Committee's (JLARC) office philosophy is that if our reports are not reaching our audience, they are not valuable. Our audience of busy legislators and stakeholders increasingly prefers to receive and share information in nonprint formats. Over time, we've adapted our communication products, shifting from PDF reports to interactive webpages and one-page overviews. We've heard from legislators and stakeholders that these media quickly communicate key information and actionable recommendations.
We also noticed that some members frequently produce short videos to highlight policy topics and news. Since members already communicate with their constituents via video, we decided to see if our reports could be shared the same way. This led us into a world dominated by cute animals, makeup tutorials and music videos: YouTube.
We experimented with creating video summaries for our annual tax preference performance reviews. None of the JLARC staff has a background in film, and our office doesn't own fancy camera equipment. Despite this, one of JLARC's talented administrative assistants was eager to learn new technologies and apply her graphic design expertise to a new medium. She brought in her own camera and set up a filming station in an empty conference room with natural light and a neutral background. JLARC staff wrote brief scripts highlighting report findings and recommendations and presented them to the camera. The final videos combine talking heads with animated graphics from our web reports. We uploaded the videos to our YouTube channel and embedded a link in our reports and presentations.
The pilot videos were so successful that they are now a regular part of the production process for each JLARC report. Members and stakeholders have referenced our videos during legislative committee meetings and hearings, and agency staff have offered positive feedback. To date, we have produced videos for 13 of our studies and tax preference reviews. Depending on the complexity of the report and the number of messages, the videos range from 90 seconds to just under five minutes.
Our studies can take anywhere from six to 18 months from start to finish. We estimate that each minute of video takes about five hours of staff time, from script production and filming to video editing and graphics work. While this is a significant commitment, it gives us another way to communicate to the legislature the work developed over months of fieldwork, analysis and writing. Our report analytics indicate that our pilot video was viewed over 120 times in the month after it was released. For comparison, the corresponding web report was viewed 250 times.
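(At that rate, our shortest 90-second videos represent roughly 7.5 hours of staff time, and the longest five-minute videos roughly 25 hours.)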
We learned a few things that may be helpful for other offices considering videos:
- Use the executive summary as a template for the script, as it already identifies key points.
- Once you’ve drafted a script, read it out loud. Does it sound conversational, or like a lecture?
- Record a practice run of the script read-through and share it with your supervisors for approval before you film the final version.
- Don’t proceed with final recording until you feel you can “commit to the script.” Unlike a written report, it is challenging to change a word or phrase in a video without needing to redo the entire production. Edit the script carefully and deliberately before you go to the final cut.
- Move! Watching a statue is not interesting for your audience, no matter how fascinating your report is. Don't be afraid of facial expressions or hand gestures as you are speaking. Your audience doesn't want to watch a robot; they want to see a human being.
- During filming, record multiple takes of the entire script. Additional takes can be spliced together in the final video if your first take isn’t perfect.
Interested in seeing some of our work? Catch us at our YouTube channel or @WALegAuditor.
Dana Lynn is a research analyst and Emily Martin is the committee communications administrative assistant with the Washington State Joint Legislative Audit and Review Committee.
Report Spotlight
Texas Sunset Process Overhauls Byzantine Alcohol Laws
A tiny bit of background for context: Texas, like many states, has a three-tier regulatory system that prohibits close business relationships between the manufacturing, distribution and retail tiers of the alcoholic beverage industry, relationships that led to excessive public drinking and industry corruption before Prohibition.
The Texas Legislature made sweeping, historic changes to state alcoholic beverage laws in 2019 after a Sunset staff report concluded outdated state laws and a weak regulatory agency led to an inefficient, ineffective and confusing regulatory system.
The four-member Sunset team began its eight-month review of the Texas Alcoholic Beverage Commission (TABC) in April 2018. The team started with a 287-page self-evaluation report from TABC, followed by extensive research of state and federal statutes and rules, past legislation, court cases, legal opinions, other audits and reports, trade publications and news coverage; meetings with agency leadership, staff and stakeholders; and visits to field offices and regulated businesses.
Sunset staff encountered some thorny political and analytical challenges along the way.
- Changing Texas’ three-tier system for regulating alcoholic beverages was off the table as a state policy matter outside Sunset’s scope.
- Since 1935, the legislature had taken a piecemeal approach to regulating new alcoholic beverages and evolving business models, leading to inconsistent, fragmented or duplicative laws.
- Industry interests in every tier were competitive and distrustful, complicating Sunset staff’s efforts to develop workable solutions.
- Some stakeholders raised unfounded alarms about the legality of proposals to fix longstanding problems, concerns that could be resolved but had previously been used to prevent change.
The team identified dozens of potential issues, ultimately focusing on those that would result in the most impactful changes to streamline and modernize agency operations and state regulations, without changing the three-tier system. In total, the staff report made 43 recommendations.
The Sunset Commission adopted all but two of the staff's recommendations (with some modifications) and added seven new ones. The Texas Legislature enacted all but three of these recommendations through HB 1545, the TABC Sunset bill, and made four more major policy changes beyond Sunset's scope. Fittingly, the governor signed the bill in June 2019 at an Austin craft brewery, with the Sunset chair, vice chair, staff and other prominent stakeholders in attendance. TABC's executive director and other members of the industry issued media statements praising the bill.
The 325-page bill’s reforms are so significant that the legislature gave TABC and the industry three years to phase them in. Among the biggest changes:
- Reducing the number of license and permit types from 75 to 36, eliminating 53,000 duplicative or unnecessary licenses and permits in Texas, and streamlining the state’s licensing fee structure.
- Eliminating the archaic distinction between “beer” and “ale” and uniformly regulating and taxing all malt beverages for the first time since Prohibition ended. The changes will lower state excise taxes by $300,000 a year, and require all malt beverage containers to include the alcohol content. Malt beverage brewers, distributors and some retailers will no longer need to have separate licenses and permits, keep separate books, file separate tax reports, and comply with two sets of laws for different types of malt beverages.
- Allowing small malt beverage manufacturers, such as craft breweries, to sell a limited amount of beverages for off-premise consumption, something wine and distilled spirits manufacturers could already do.
- Eliminating regulations that caused TABC to spend considerable effort and resources with little measurable impact on public safety, such as approving malt beverage labels, a duplication of federal regulation; mandatory testing for malt beverage alcohol content; and overly restrictive outdoor advertising requirements. These changes will reduce the industry’s label application fees by almost $90,000 a year.
- Increasing the package (liquor) store permit limit per person from five to 250 to accommodate expanding businesses with multiple locations across the state.
- Restructuring TABC’s license application approval, protest and appeal processes to align with best practices for regulatory agencies, improving consistency and accountability for applicants and TABC.
- Requiring TABC to establish a two-pronged approach for inspections of alcoholic beverage businesses that prioritizes public safety risks, and strengthening the agency’s enforcement authority to encourage greater compliance and better protect public safety.
- Increasing the size of the agency’s governing body from three to five members to provide more active oversight of regulatory policies and decisions, which, combined with other management actions adopted by the Sunset Commission, will help create an environment that makes the agency less susceptible to industry influence.
Merrell Foote is a senior policy analyst with Texas’ Sunset Advisory Commission. She can be reached by email or (512) 463-1929.
Special Reports
What's So Special About "Special Reviews"?
For decades, the Minnesota Office of the Legislative Auditor (OLA) has issued program evaluations and financial audits. But, at times, our office has seen a need to initiate what we call “special reviews” to address issues that are not on our regular evaluation and audit schedules. So, what are “special reviews,” and what’s so special about them?
These reviews may arise out of whistleblower complaints, allegations we receive from legislators or the general public, or issues that come to our attention in other ways. Rather than always saying, “No, we don't have the time to address that issue” or “The Audit Commission didn't direct us to look at that topic,” special reviews are our way of trying to accommodate certain issues as they arise.
The format of a special review report looks a lot like an evaluation report, but we try to make special reviews more narrowly focused. This is due partly to necessity—we have very limited staffing for this function. Since 2017, OLA has had a director of special reviews. Two other staff—the legislative auditor and OLA’s legal counsel—sometimes devote significant time to these reviews. We do not have other staff who are permanently dedicated to special reviews, although special reviews may occasionally borrow other staff from within the office.
Special reviews are part of a broader “complaint” function our office serves. The OLA website has a secure link where anyone can report what they perceive to be wrongdoing by state officials or agencies. In addition, state agencies are supposed to report to OLA instances in which their not-public data have been improperly disclosed, plus instances of theft or property loss. In 2019, OLA received and tracked about 300 complaints, allegations, data breaches and other reports.
Every two weeks, our internal “allegations team” meets to discuss new complaints, or to follow up on previous ones. The team includes the legislative auditor, the two deputy legislative auditors and the director of special reviews. For each complaint, the team asks: What are the facts we have? Do we have the jurisdiction to look at this? Are there more appropriate venues, such as grievance processes, for addressing this? How plausible and serious are the allegations, and should this be a priority for us? Ultimately, the allegations team determines collectively which issues merit further investigation—and, potentially, a special review.
We address many complaints informally, without issuing public reports. For example, OLA received multiple complaints in 2019 from a resident of the Minnesota Sex Offender Program (MSOP). (MSOP is a secure facility to which individuals may be civilly committed, and many remain there for the rest of their lives.) After twice telling the complainant that he should pursue his concerns using the facility’s grievance procedure, OLA determined that the grievance process was, in fact, problematic. After OLA conveyed its concerns to the facility’s administrators, the facility director committed to a review of the grievance policy. In late 2019, the facility adopted a revised policy that its administrators said would be less onerous.
Some special reviews are essentially investigations, and we may approach them somewhat differently than evaluations. For example, we often place interview subjects under oath. We make recordings of most in-person interviews during special reviews, and we may prepare transcripts. If necessary, we issue subpoenas to compel testimony or obtain documents.
OLA undertakes program evaluations only in response to decisions by the Legislative Audit Commission, a bipartisan body of 12 House and Senate members. In contrast, OLA usually conducts special reviews on its own initiative, as part of its general statutory obligation to monitor whether public funds are being used effectively, efficiently and consistent with legal requirements. At times, however, decisions on special reviews may be linked to those made by the Audit Commission. In 2017, the Audit Commission decided not to mandate OLA to evaluate the public fiscal impact of refugee resettlement, but with the understanding that OLA would undertake a special review to explore the availability of data on resettlement costs and benefits.
Sometimes we do special reviews to provide public accountability for high-profile state government failures, after the fact. Following the hugely disappointing implementation of the state’s new vehicle licensing and registration system (10 years in the making), we committed—at the request of key legislators—to conduct a special review that asked: “What happened?” Likewise, when the Department of Human Services disclosed that it had overpaid two Indian tribes $29 million for medications used in opiate treatment programs, we saw a need for our office to independently ask: “How did this occur?”
In 2018, right before the end of the legislative session, a local TV news report suggested the possibility that suitcases of money—perhaps fraudulently obtained from the state’s child care assistance program—were being sent overseas to support terrorist activities. This sort of provocative allegation demanded fact-finding. We initiated a special review, which found that there appeared to be significant fraud in the child care program, but we could not substantiate the allegation that money obtained fraudulently from the program was being redirected to terrorists.
Overall, special reviews are a small part of what our office does. But they are one way—in addition to traditional evaluations and audits—in which our office has tried to be as responsive as possible to legislators and the general public. We reject far more suggestions for special reviews than we accept. But we welcome the ability to occasionally initiate targeted, independent investigations that might not otherwise occur.
Joel Alter is the director of Special Reviews with the Minnesota Office of the Legislative Auditor. He can be reached by email or 651-296-8313.
What To Do When You've Been Told You Can’t Do What You've Been Asked To Do
Handling sensitive data is a part of our everyday lives as evaluators, and in our profession we know the phrases to use that will get us the information we need to perform our analysis (“de-identify the data,” "aggregate the data" or “this won't be a public record”). So then, what do you do when normal precautions, standards or statutes are questioned as being insufficient to protect the identities behind the data? The North Carolina Program Evaluation Division (PED) encountered this issue for its recent report, "Improvements Needed to Gauge Effectiveness and Expend State Funds Available for Postsecondary Financial Aid."
When my team and I were asked to evaluate the extent to which state-funded scholarships, grants and educational loan programs had met their goals, and to examine the educational and vocational outcomes of the students who received this support, we embarked on a data collection journey whose path seemed clear at the start. We would analyze benchmarks and performance measures in our state's higher education sectors. We would look at the state's return on investment. We also would provide our legislators with details on how students performed on widely accepted outcomes such as graduation and employment rates. However, shortly into fieldwork, it became apparent that obtaining the necessary data for such analysis was not going to be an easy task.
Because North Carolina does not have an active statewide longitudinal database of educational and vocational records, we discovered we wouldn’t be able to assess student outcomes without connecting the data ourselves. As such, we engaged entities based on what information we knew we would need:
- The keeper of student financial aid records, to confirm which students received which form of state-supported educational financial aid.
- The state's public and private postsecondary educational systems, to confirm the performance of the students who received a specific form of state aid.
- The state's Department of Commerce, to confirm the vocational status and details of the students upon and several years after graduation.
In total, we needed to work with five separate entities to craft a comprehensive timeline to detail when a student received state dollars, where he/she went to college, how well he/she did in college, when he/she became employed, and how well he/she was doing in the workforce.
So, as is customary, we submitted our data requests to the appropriate entities. Shortly thereafter, we received bad news. The keeper of our origin data, the student financial aid records, told us that SAIG agreements with the U.S. Department of Education prevented them from providing us with student financial aid records. Additionally, some of our postsecondary educational systems said we could not have access to student educational records because of the Family Educational Rights and Privacy Act (FERPA). Aggregate data was all that we could have, leaving us with no way to examine data at the granular level we felt we would need to perform our data joins and tell the complete story. Agency attorneys determined that the statute granting us access to confidential information did not meet the exceptions to these federal restrictions, including an audit exception outlined in federal law.
How would we be able to connect student groups to separate vocational records without a method of individual identification? Not only were our engaged entities wary of sharing data with our office, they were nervous about the legality of sharing this data among one another. Hence, simply requesting and receiving student-level information across the North Carolina school systems was not an option. The result was a proposed data flow outlined in four newly formed memoranda and four preexisting agreements, plus four months of daily phone calls and emails to manage the workflow.
In the end, these numerous agreements and our office's willingness to trust the agencies to connect the data on our behalf allowed us to obtain de-identified student-level information that used a single unique identifier for each student (~45,000 students). We were then able to assess educational and vocational outcomes through five separate batches of data. We found that, typically, students who received state-supported educational financial aid in North Carolina and attended a public institution outperformed the students of their respective institutions as a whole. We were able to demonstrate to our members that providing educational financial aid to students promotes the success of our state's citizens. Long story short, when told we couldn't do what we had been asked to do, we did one thing: persisted.
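To give a flavor of the final step, here is a minimal pandas sketch of the kind of join the de-identified data made possible (the column names, keys and values are hypothetical; the real work involved five batches of agency-connected data):

```python
import pandas as pd

# Hypothetical de-identified extracts; "student_key" stands in for the
# single unique identifier the agencies generated on our behalf.
aid = pd.DataFrame({"student_key": [101, 102, 103],
                    "aid_program": ["grant", "loan", "grant"]})
college = pd.DataFrame({"student_key": [101, 102, 103],
                        "graduated": [True, False, True]})
wages = pd.DataFrame({"student_key": [101, 103],
                      "annual_wage": [41000, 38500]})

# Left joins keep every aided student in the timeline, even when a later
# source (e.g., wage records) has no matching record for that student.
timeline = (aid.merge(college, on="student_key", how="left")
               .merge(wages, on="student_key", how="left"))
print(timeline)
```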
Adora Thayer is with the Program Evaluation Division of the North Carolina General Assembly. She can be reached by email or 919-301-1400.
Kansas Legislative Post Audit Finds Success with Telecommuting
The Kansas Legislative Division of Post Audit (KSLPA) encourages all employees to practice a healthy work-life balance. KSLPA has promoted this philosophy by offering a very progressive telecommuting (work-from-home) policy, both to retain employees and to serve as an incentive when recruiting new staff.
Early on, as policies for telecommuting began to take shape, using this benefit was the exception, not the norm. For instance, an employee might only telecommute if they had a sick child. As the office experienced success, management expanded the policy to make telecommuting available every day and for all employees.
“The increasing flexibility of our work-from-home policy did raise some initial concerns,” said Justin Stowe, the agency head. “In particular, the management team was worried that our policies might adversely affect our productivity levels. So, we evaluated our productivity during the past 15 years. The results showed that we’re as productive today as we’ve ever been.”
The option to telecommute is very popular and continues to be a great benefit that is attractive to all employees, regardless of experience or title. Our 2019 employee satisfaction survey indicated staff liked our office's work-life balance and the expanded telecommuting benefits.
"The ability to work from home was one of the things I looked for when changing jobs," said Susan Belt, an associate auditor. "I live a fair distance from the office and working from home saves a lot of commuting expense."
By working from home, our staff can save anywhere from 10 minutes to one and a half hours of drive time each day. This also means our office is reducing its carbon footprint. Collectively, staff working from home save about 50 gallons of fuel a week, or $125 in fuel costs. At that rate, our office will reduce fuel consumption by about 2,500 gallons of gasoline annually and save roughly $5,600.
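(At 50 gallons a week, about 50 working weeks gets you to the 2,500-gallon annual figure; the $5,600 in annual savings implies fuel at roughly $2.25 per gallon.)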
“We are a small division, with only 24 staff. The advantages we see, both for staff and the environment, surprised me,” said Katrin Osterhaus, a KSLPA veteran of 22 years.
Last year, the management team piloted a program for long-distance telecommuting. This allowed one of our experienced auditors to remain on staff while residing in another state.
A key to the success of telecommuting is that staff don’t let it get in the way of doing their job. Staff can’t miss an in-person meeting or decline fieldwork to work from home. Most importantly, meeting deadlines and providing quality work remains the expectation, regardless of where the work is done.
Staff at all levels of the organization support successful telecommuting by leveraging technology like Zoom for meetings and by training staff on effective work-from-home strategies.
Macie Smith is an associate auditor with the Kansas Legislative Division of Post Audit. She can be reached by email or 785-296-4329.
A Valuable Partnership
Many of our most promising resources are close to home. I would like to tell you about a unique partnership between New Mexico's Legislative Finance Committee (LFC) and our flagship higher education institution, the University of New Mexico (UNM), that is leading to exciting advances in the field of program evaluation in New Mexico. A few years back, UNM established a Master of Public Policy (MPP) program with a mission statement that sounds like a dream for anyone recruiting program evaluators (that's you and me!):
The Master of Public Policy (MPP) program will train a new generation of analysts who can serve their communities by identifying and championing data-driven policy options.
As someone who spent more than half of his life in a classroom, I can tell you coursework is vital. But putting what you learn in the classroom into practice is also key. The LFC's Program Evaluation Unit has made it a priority to partner with UNM's MPP program and has been able to do so in a number of ways:
- We have partnered with the MPP program’s Evaluation Lab, connecting students with a service provider who was evaluated by our office to provide an ongoing look at process and outcomes.
- We have partnered with the MPP program to pilot a Policy Laboratory where we are collectively working on a project examining important child welfare issues the state is facing.
- The LFC Program Evaluation Unit hosts an intern from the MPP program each year, assigning him or her to an evaluation and providing experience presenting to a legislative committee.
- I currently sit on the Community Advisory Board for the MPP program with a group of brilliant public policy professionals.
- Each year the MPP program invites speakers (including someone from LFC) as guest lecturers for an MPP class.
Each of these activities gives students a glimpse into the world of public policy from the viewpoint of a program evaluator. Policy analysis, data analysis, interviewing stakeholders, developing findings and recommendations, and lawmaking are all key activities we can familiarize students with. Imparting such knowledge through experience accomplishes a number of objectives, key among them ensuring students know about the options available to them when they finish school.
And the road of knowledge is not one-way. We learn a great deal from MPP students, and they make significant contributions to our work. Just last year, one of our MPP interns analyzed criminal case outcomes in Albuquerque as part of an evaluation of New Mexico's largest criminal justice system. The analysis found most felony case dismissals in the region were due to issues with evidence collection. This finding led to a recommendation to expand a program at the Albuquerque Police Department (APD) that had been shown to improve evidence collection. APD adopted the recommendation, expanding the program by 300% earlier this year. MPP students bring new methods and ideas from the classroom into the lab, and New Mexico is already benefiting from this next generation of public policy thinkers.
I believe all participants in this partnership are helping to advance the field of program evaluation. While it might be unrealistic to expect that any given 9-year-old will want to be a program evaluator instead of a fighter pilot or a criminal prosecutor, it is realistic to expect that we can partner with those close to home to build a solid pipeline of motivated young professionals. It is incumbent on all of us to continue to look for these opportunities and continue to grow our field with the goal of improving outcomes for all of us.
Jon Courtney is the 2019–2020 NLPES Executive Committee chair. He can be reached by email or 505-986-4539.
Report Radar | Chris Latta (Pennsylvania)
Good day, fellow seekers and all those who aspire to truth, justice and the American way. Welcome to the winter edition of the Report Radar. I, your humble scribe, have searched across the fruited plain for studies from our brothers and sisters in program evaluation. Featured in this edition are reports on medical assistance, financial investigations, economic development, foster care service providers, and regenerative medicine.
First in the hopper is a report from our friends at the Badger State's Legislative Audit Bureau on the Wisconsin Department of Health Services' (DHS) oversight of hold times at its medical assistance call centers. The department contracted out management of the call centers, specifying various performance requirements for operations, one of which required average hold times not to exceed five minutes.
The Audit Bureau learned that DHS was aware the contractor had failed to meet some performance requirements, including those for call hold times. To improve the contractor's performance, DHS requested a performance plan to correct the problems. The Audit Bureau also learned that the department's continued monitoring had identified further instances of noncompliance, and DHS subsequently reduced payments to the contractor by nearly $873,000 for performance issues from April through August 2019. The Legislative Audit Bureau reports that performance improved after the fines. To view the complete report, click HERE.
Next up is a report from the Arizona Auditor General’s (AG) Office. In 2019, the office conducted six financial investigations that led to 65 criminal charges against six individuals. The charges related to theft, misuse of public monies, fraudulent schemes, computer tampering and forgery.
For example, the Thunderbird Irrigation Water Delivery District office administrator allegedly embezzled $278,371 by paying for personal charges on district lines of credit and credit card accounts. The AG’s office also found that the business manager of the Ray Unified School District may have embezzled $38,333 from the school and $900 from a nonprofit youth sports organization. Finally, the office found that the business manager of the Valley Academy for Career and Technical Education may have embezzled $30,597 to use for personal purchases on an academy credit card.
If you would like to read the full report, you may find it by clicking HERE.
Next we travel east to Florida and the Office of Program Policy Analysis and Government Accountability (OPPAGA). In December 2019, the office released its latest report on economic development program evaluations. Under review were tax credit, tax refund and cash grant incentive programs. OPPAGA found that contracted projects receiving payments during the review period created 26,000 new jobs and made $1.5 billion in capital investments. The total number of jobs exceeded the number of committed jobs for all incentive programs. Florida’s Qualified Target Industry tax refund projects created the highest number of committed and confirmed jobs.
OPPAGA also found that many of the businesses reviewed experienced employment and wage growth. Employment increased by 27% for the business locations reviewed, an increase higher than the statewide growth rate for the same period.
For the entire report, click HERE.
Moving on to Kansas and the Legislative Division of Post Audit, we found an interesting report on consistency in foster care service providers. In its review, the division found that most foster families reported inconsistencies in some child placement agency services. In Kansas, the Department for Children and Families contracts with private organizations to provide foster care services. The department has policies and procedures to help ensure a minimum standard of service, but child-placing agencies can offer more or better services if they choose. The inconsistencies found in the report centered on how well the agencies helped foster families navigate the foster care system and the quality of interactions with staff.
The report identified an issue for further evaluation. Because of the limited scope of the audit, the Kansas Legislative Division of Post Audit was not able to test whether the department's policies and procedures were effective in practice. The division believes that certain policies and procedures meant to help ensure consistency in services may lack effective enforcement mechanisms.
If you would like to read the report in its entirety, please click HERE.
Finally, we take a look at a report from the Minnesota Office of the Legislative Auditor. In January of this year, it released a report on regenerative medicine—specifically grant funding for the University of Minnesota and the Mayo Clinic. The program, called Regenerative Medicine Minnesota, received $8.85 million over two fiscal years. The OLA found that internal controls within the scope of the audit were generally not adequate to safeguard the funds and ensure compliance with legal requirements. Specifically, the audit identified internal control weaknesses related to proposal evaluations, project awards and grant project oversight. For example, OLA found one proposal evaluator had a conflict of interest, grant proposals did not always receive an equal level of scrutiny, and grant reimbursement requests lacked sufficient documentation to justify costs. The full report can be found HERE.
There you have it, folks: our highlighted reports for the winter of 2020. If there's a report you feel merits inclusion in the next edition of Report Radar, please feel free to send it my way. Until then, stay warm.
Chris Latta is a project manager with Pennsylvania’s Legislative Budget and Finance Committee.
News and Snippets
#Winning: Awards News
With the 2020 awards season around the corner, now’s the time to think about which award(s) your office will contend for! As a reminder, your office may submit applications or nominations for awards in four categories.
- Certificates of Impact: Awarded to legislative offices that released reports documenting public policy impacts within their respective states.
- Excellence in Research Methods: Awarded to legislative offices that have produced a report developed through the use of exemplary research methods.
- Excellence in Evaluation: Awarded to one legislative office that has made significant contributions to the field of legislative program evaluation during a four-year period.
- Outstanding Achievement Award: Presented to one individual who has made outstanding contributions to the field of legislative program evaluation (nominations, rather than applications, are sought for this award). Please note, nominees do not have to be retirees.
We Need Judges!!!
If you are interested in judging one of the award categories, please contact a member of the Awards Subcommittee. No prior judging experience is required. Visit the NLPES Awards Program page for more information about award categories. Also, be on the lookout for more details via the listserv in Spring 2020!
Awards Subcommittee: Kiernan McGorty and Mary Jo Koschay
Staff Happenings
Kansas
Kansas Legislative Post Audit recently said goodbye to its longest-tenured employee. Ralph Richard “Rick” Riggs retired from the division in early June after devoting more than 30 years of service to the agency. Rick is a Topeka, Kan., native who graduated from Topeka West High School, received bachelor's degrees in psychology and theatre from Washburn University, and earned a master's in public administration from the University of Kansas. He began his state service in 1975 as a correctional officer and later a parole officer for the Department of Corrections. He also served as a management analyst for the Kansas Corporation Commission from 1981 to 1984.
Rick started his career at Legislative Post Audit in February 1984 as a staff auditor, participating in many performance audits. Some of his audit war stories include counting the number of cows in a field, sketching out how much food prisoners left on their plates at a Kansas correctional facility, learning all about noxious weed laws, and identifying the financial impact of a property tax exemption for church parsonages. He served as the division's administrative auditor beginning in 1998, assisting the legislative post auditor with the division's policies and procedures and the hiring process, and overseeing auditor training and development, compliance with federal auditing standards, and a host of other tasks and assignments.
During his 35 years in the audit profession, Rick built up an expansive understanding of government performance auditing. He served as the administrative auditor for three different post auditors and was relied on heavily by all three to help administer the division in an efficient and effective manner and to ensure a smooth transition from one to the next. A two-year stint as a radio broadcaster prepared him to lead the division in its latest accomplishment: publishing podcasts of completed audit reports.
Fun facts: Rick is married to Perrin Riggs and dad to many golden retrievers over the years. He and his wife started a dog training company (called Happy Training!) in 2000. Rick’s most common lunch has historically been something warmed up in the microwave, topped with … a can of room-temperature green beans.
Rick will be sorely missed by his colleagues, especially Chris Clarke, who now becomes the “oldest” one in the office. All of us at Legislative Post Audit wish him the best and hope he is fully able to enjoy the many benefits of retired life! Congratulations Rick!
Minnesota
In December 2019, the Minnesota Legislative Audit Commission appointed Jim Nobles to his seventh six-year term as Minnesota legislative auditor. Nobles became deputy legislative auditor for Program Evaluation in 1978, and he was first appointed legislative auditor in 1983.
Websites, Professional Development and Other Resources
Under Construction on the NLPES Home Page
Keep an eye on the NLPES home page, where you will see the new Report Library (also found under About Us), designed to help you find recent reports from other states. The Report Library lists the titles of recent reports by subject area that may be of interest in your work. This is a work in progress, and we are testing it with reports released by the agencies represented on the NLPES Executive Committee. If your response to the Report Library is positive, we will expand it to include all offices and all reports released in the last five years.
NLPES Listserv
The NLPES listserv is an email discussion group for NLPES members. By sending a message to NLPES Listserv, you can reach all listserv subscribers simultaneously. Listserv members can query other states about evaluation work similar to their own projects, receive announcements about performance evaluation reports and job opportunities from other states, and are notified when the latest edition of this newsletter is available! To join the listserv, send an email to Brenda Erickson, NCSL liaison to NLPES, with the subject “SUBSCRIBE to NLPES Listserv.” Include your name, job title, audit agency/organization name, mailing address (including city, state, zip code), phone number and email address. A “Welcome” message will be sent to you once you are successfully added to the listserv. See the listserv link on the NLPES website for additional information on how to post messages and “netiquette” niceties, like not hitting “Reply All” when answering a query posted to the listserv.
Are you receiving our listserv emails? Some states' systems block NLPES listserv emails. If you think you are not receiving our emails, please check your state's security system and spam filters, and/or contact Brenda Erickson.
Legislative Careers Website
Know someone thinking about pursuing a career with a state legislature? Point them to the opportunities posted on NCSL's legislative careers website. Job seekers can explore the types of work legislative staffers perform, including performance evaluation, budgeting, fiscal analysis, legal and policy research and opinions, bill drafting, public relations, library services, building security, and information technology support. Opportunities are posted by states offering positions under Legislative Jobs. Attracting young people to work as legislative staff will be increasingly important in the coming years: while baby boomers make up about a third of the national workforce, nearly half of legislative staff responding to a survey were 50 years old or older. Replacing those staffers will present challenges. Check out the welcome video, “A Day at the Capitol,” and learn more about the opportunities and rewards of working for state legislatures. Watch the videos under Career Paths to hear from staffers in different states describing their jobs and the satisfaction they gain from a public service career.
NLPES's Professional Development Resources
Visit our NLPES online training library for a variety of refresher and training materials! There are nearly two dozen resources on planning and scoping, fieldwork, writing and publication, and management topics. Most are PowerPoint slides, some are narrated, and a few are webinars or podcasts. Check them out!
Ask GAO Live
AskGAOLive is a series of 30-minute live chats in which GAO staff discuss a specific report and its research and answer questions emailed or tweeted in. Sessions are recorded and archived on the website. You can also “follow” GAOLive to receive advance notice of chat sessions. Topics have included veterans and higher education, prescription drug shortages, prison overcrowding, state and local fiscal outlook, and government contracting.
Ensuring the Public Trust
What’s the most common internal performance measure for evaluation shops? How many offices tweet? What percentage of staff has fewer than 10 years of experience? How can you contact a sister office in another state? Ensuring the Public Trust summarizes information about legislative offices conducting program evaluations, policy analyses and performance audits across the country.