
The Working Paper | Summer 2019

July 29, 2019

Chair's Corner | Shunti Taylor (Georgia)

In the words of the classic aria from “Porgy and Bess,” “Summertime, and the livin’ is easy…” I love summer and everything that comes with it—sunny weather, relaxed schedules, and as many trips to sandy beaches as I can fit in. But, until I retire or win the ever-elusive lottery, there is still work to be done. No matter the season, the Executive Committee continues to work on your behalf, striving to deliver all the benefits of membership in NLPES. Here’s a quick recap of the Executive Committee’s work over the last several months.

  • Due to the efforts of our Professional Development Subcommittee, NLPES continues to be a frontrunner of e-learning opportunities through the provision of webinars. The webinars are made possible by e-learning grants available from the NCSL Foundation. There’s at least one more webinar planned before year-end, so stay tuned. And if you have an idea for a webinar, please contact Brenda Erickson for details.
  • During our 2019 election cycle, member offices elected four members to the Executive Committee. Two current members of the committee were reelected to serve another three-year term. In addition, two new members, Paul Navarro (California) and Mary Jo Koschay (Michigan), were elected. They will begin their terms at the fall PDS.
  • As you know, the Executive Committee is considering proposed changes to the NLPES bylaws. Now that the review and comment period has passed, we hope to finalize any changes to the bylaws prior to the Executive Committee’s fall meeting.
  • The Communications Subcommittee continues to bring you exciting news you can use through this amazing newsletter. In addition, keep an eye out for the launch of the NLPES report library and an updated survey of member offices.
  • I would like to congratulate all the 2019 NLPES Award winners. I’d also like to send a very special thanks to those who judged awards this year. The NLPES awards are an opportunity to recognize the accomplishments of our member offices and individual staff. As usual, our awards program generated lots of interest by our member offices, which resulted in a very competitive process. Award winners will be recognized during the NLPES Awards Luncheon held during the Professional Development Seminar this fall.

The Executive Committee held its spring meeting in Park City, Utah, where it reviewed the status of ongoing subcommittee work and discussed plans for the upcoming Professional Development Seminar (PDS), also in Park City. Utah’s Office of the Legislative Auditor General is busy planning a great three-day program taking place Sept. 15-18, 2019. Registration and details about the PDS are available on the NLPES website. Many thanks to our friends in Utah for the time and effort spent planning the PDS and for their all-around support of NLPES!

As I near the end of my term as chair, I want to thank the members of the Executive Committee for their support and commitment to NLPES. I especially would like to express my sincere appreciation to Greg Fugate (Colorado) and Melinda Hamilton (Michigan), both of whom resigned from the Executive Committee earlier this year. During her two and a half years on the committee, Melinda served on the Awards Subcommittee and held the position of Secretary of NLPES. Since Greg’s election to the Executive Committee in 2008, he served on every subcommittee, and assumed various leadership positions, including a term as NLPES Chair. In whatever role he played, Greg continually demonstrated his commitment to improving the organization and ensuring NLPES remained a valuable resource to the membership. Thanks to both Greg and Melinda for their service.

I have had the pleasure of working alongside Jon Courtney (New Mexico), who will assume the position of NLPES Chair at the PDS in Park City. I look forward to working under Jon’s leadership and with the rest of the Executive Committee as we continue to advance the mission and goals of NLPES. If you haven’t had an opportunity to serve in this important leadership role, I encourage you to consider running for a seat on the Executive Committee next year. At a minimum, I hope you’ll continue to take advantage of the many resources NLPES and NCSL have to offer.

I have enjoyed the privilege of serving as NLPES Chair and I hope to see you at the PDS in Park City!

Shunti Taylor is the 2018–2019 NLPES Executive Committee Chair.

PDS Preview—See You in Park City! | Wayne Kidd (Utah)

The Utah Legislative Auditor’s Office is excited to host the 2019 PDS in Park City, Utah, Sept. 15-18. Park City is a great place to visit in September: the weather will be pleasant, the leaves will be beginning to change color, and many outdoor activities will be available. Plus, Park City has many great restaurants.

We will be offering four tracks this year:

  • A basic skills track for newer program evaluators
  • Two general tracks that will cover skills enhancement, audit-related topics (including public education, higher education, and transportation), and managing our offices more effectively
  • A data toolkit track that will be a “hands-on” experience. Participants will have the opportunity to learn and practice simple scripting, preparing data for analysis, analyzing large data sets, and using visual analytic tools

Sunday and Monday evenings will be open for you to enjoy Park City. On Tuesday evening, however, we will be going to the Utah Olympic Park for a tour, dinner, and an aerial show. You won’t want to miss that event. We look forward to seeing you!

Fun Fieldwork

If you have a fieldwork experience your office would like to highlight, please email submissions to Emily Johnson.

Our Work and Real Life | Rick Jones (Pennsylvania)

After more than 20 years at this type of work, I have come to realize that, of all the projects that have passed through this office on seemingly countless policy topics—some thought-provoking and others more routine—studies with moments to be remembered in a personally meaningful way are few and far between. The Pennsylvania LBFC’s (Legislative Budget & Finance Committee) 2007 report on the state’s organ donor program was, for me, such a project.

House Resolution 698 of 2006 directed the LBFC to undertake a performance evaluation of the state’s Organ and Tissue Donor Awareness Program. Thus, for the better part of a year the project team did what our staff has a solid reputation for doing—we reviewed agency efforts, surveyed other states, interviewed officials and stakeholders, used state and federal data to examine donation and transplantation activity, analyzed the fund’s structure, reviewed contracts and reports, and gave fresh eyes to the relationship between state and federal law regarding the transfer of human organs for valuable consideration, among numerous other things. In the end, we produced a well-researched, detailed report that capably addressed seven key findings and made 26 recommendations. This was all very typical.

Along the way, however, I had one not-so-typical meeting that reminded me that the work we do in evaluating the efficient and effective performance of state government programs is not simply a detached policy exercise but relates to real life and real people more than is perhaps plainly apparent. I suppose I knew this in my head but was allowed a moment to perceive it just a little better.

Two of the 58 certified OPOs—organ procurement organizations—nationwide are based in Pennsylvania. OPOs generally are responsible for increasing registered donors through outreach campaigns and also for coordinating the actual organ donation process. We visited two Pennsylvania-based OPOs over the course of the study, and while being toured through an OPO’s associated organ donation facility, I was granted my not-so-typical moment. Maybe it cannot be regarded as ‘epiphanal,’ but then again it might be as close as it comes. As our team neared the end of our visit, a staff person entered pushing a cart on which lay a human kidney. It was somewhat unexpected to see an actual human organ a mere arm’s length away, but I found myself even more amazed to see it was pulsing—continually moving ever so slightly out and then in as if alive. It was, of course, pulsing because it was attached to a machine that moved fluids through it for preservation purposes. In that moment, though, that kidney was remarkably real and its movement—though mechanically produced—underscored, for me, its connection to real life.

Over the years, I have visited rural as well as city offices, met with police, liquor enforcement, executive senior staff and the judiciary. I have had explained to me the science of biometric smart cards, viewed blighted property, studied the use of fingerprinting, explored the potential for commercial advertising on public property and considered (beyond what I would have thought possible) various aspects of game, fish, and boats, including the merits of agency mergers and the social aspects of hunting on Sundays. There were interscholastic athletics, vocational rehabilitation, debate over the statutory requirement of a physician certification for drug and alcohol treatment, collection and dissemination of Criminal Justice Record Information, the procurement code, opportunity zones, probation caseloads and county recording fees. I have addressed issues with constitutional implications and spent much time analyzing the importance of the meaning of words in statutory interpretation and construction. I have watched people convicted of drunk driving cry openly as they listened to survivors share their stories of extraordinary pain and loss and I have even trudged across foul-smelling farm fields treated with human biosolids—all in the name of program performance evaluation. In retrospect, though, having a human kidney pulsing on a cart directly in front of me without doubt was unlike any other project-related activity before and since.

In all seriousness, a man’s life had ended. As I looked at his kidney in front of me, I found myself thinking that this kidney had been filtering blood, removing waste and doing what kidneys are designed to do, helping to keep a man alive, just hours earlier. It also came to mind that real persons—family and friends—were in the process of coping with this man’s loss and their changed lives. Finally, I guess it struck me a bit oddly that this random, anonymous man’s kidney reached the end of its usefulness on display to help with our study. It had not qualified for actual transplantation and so its final use was to make an impression on us. Hearing this, maybe I needed to attach greater meaning to the moment. This was neither a program file nor an array of data nor a stakeholder’s feedback nor a state agency legal opinion—it was a real, physical part of a real human life linked to other real persons with real lives. That moment helped me not only visualize the importance of organ donation but also gain a healthier understanding of what we do and how it connects to real persons and real life.

At this point in retelling this incident, I could easily turn to address the ongoing issues of organ donation. For example, today about 114,000 people await donation, up to 20 of those waiting die daily, and a single donor—each single donor—has the potential to save up to eight lives. In closing, however, I just want to genuinely thank that anonymous gentleman for giving his kidney. While it could not be transplanted, facing it that day at the OPO did allow me deeper insight regarding this work. I hope and pray his family and friends are well.

Rick Jones is Counsel for Pennsylvania’s Legislative Budget and Finance Committee.

Technology

Using Tableau to Enhance Our Work | Bart Liguori (Kentucky)

The Kentucky General Assembly’s Education Assessment and Accountability Review Subcommittee prescribes education research that is conducted by Kentucky’s Office of Education Accountability (OEA). OEA produces approximately four reports each year on topics in public K-12 education. OEA’s reports are designed to be printed and bound. The reports are also published on the General Assembly’s website as PDFs. In an effort to engage readers and personalize reports, OEA decided to include interactive features as part of its reports.

OEA considered business intelligence tools to help produce interactive features and purchased Tableau in the late fall of 2016.

In 2017, OEA began using Tableau to create interactive features. So far, OEA has used Tableau dashboards to supplement its report on School Attendance in Kentucky and its 2017 and 2018 District Data Profiles. The purpose of these interactive features was to complement the print copies of the reports, not supplant them. The features also give readers more opportunities to interact with the data.
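For readers curious about the data preparation behind dashboards like these, Tableau generally works best when the source data is in a “long” (tidy) layout, with one row per observation. Below is a minimal pandas sketch of that reshaping step; the file and column names are hypothetical and are not OEA’s actual data.

    import pandas as pd

    # Hypothetical wide extract: one row per district, one column per year.
    wide = pd.DataFrame({
        "district": ["District A", "District B"],
        "attendance_2017": [94.1, 93.5],
        "attendance_2018": [93.8, 94.0],
    })

    # Melt to long form: one row per district-year observation.
    long_form = wide.melt(id_vars="district", var_name="year",
                          value_name="attendance_rate")
    long_form["year"] = long_form["year"].str.replace("attendance_", "").astype(int)

    # Write a CSV that Tableau can connect to directly.
    long_form.to_csv("district_attendance_long.csv", index=False)

Once the data is in this shape, a single worksheet can filter, color and facet by any column without further restructuring.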

Based upon OEA’s implementation of Tableau, there are some tips and pointers I would like to share:

There were minor issues getting the graphical interface to work with other programs that were previously installed. After working with Tableau’s customer support numerous times, we were able to diagnose and correct the issue. The customer support team at Tableau was helpful; for those who prefer a live person, be aware that much of the support was via email.

Tableau allows users to create interactive visualizations as worksheets, dashboards and stories. Users are able to share the visualizations as read-only documents, which can be viewed with Tableau Reader. Users of Tableau Reader have to install and update the software regularly. For those who do not want to use Tableau Reader, visualizations can be published to the Internet with “hidden,” but live, links. The links can be “unhidden” once publication is authorized.

While learning Tableau was intuitive for some staff members, others faced more of a learning curve. As is the case when introducing any new piece of software, there may be some initial reluctance to adopt the new technology, and time should be built into analysts’ schedules to learn how to use the software. Once analysts do, however, the Tableau user community is active and there is a lot of online support available.

In creating interactive dashboards for reports, there are some considerations to remember:

  • Not all reports lend themselves to interactive features.
  • Make sure that the visualizations are mobile friendly.
  • Remember to choose the right chart type. 
  • Do not make the charts look too cluttered or add too many filters.

Tableau was recently acquired by Salesforce. An FAQ about the merger can be found here.

Dr. Bart Liguori is the Research Division Manager at the Office of Education Accountability at the Kentucky Legislative Research Commission.

Report Spotlight 

Editor’s Note: Similar to last edition’s Report Spotlight, this contribution highlights an office’s evaluation of its state’s tax expenditures. As more offices are conducting these types of evaluations, we decided having multiple articles on the subject would be beneficial to the membership.

Colorado Shining a Light on State's Numerous "Tax Expenditures" | Kevin Amirehsani (Colorado)

As part of a growing trend of states more intentionally and thoroughly evaluating their tax expenditures (i.e., tax credits, exemptions, deductions and rate reductions), which often aren’t widely understood and whose individual revenue impacts are rarely publicized, the Colorado General Assembly tasked our office in 2016 with publishing periodic reviews of our state’s tax expenditures. Since then, we have identified 216 such expenditures, evaluated 61 of them in 39 separate reports and made a number of presentations to different legislative committees. Our work does differ significantly from that of typical performance audits, but many of our takeaways apply to all types of legislative program evaluations.

Evaluation teams in other states often only review those tax expenditures that are economic development incentives geared toward encouraging businesses to locate there and/or increase employment, while we evaluate all expenditures, including sales and excise tax exemptions, as well as those expenditures seen as more “structural.” And although our Department of Revenue has published biennial reports on Colorado’s tax expenditures since 2012, they mostly focus on providing revenue estimates and tax/tax expenditure incidence tables across income groups. They’re also limited to tax expenditures administered by the Department of Revenue and don’t include, for example, the insurance premium tax, which in Colorado is administered by our state’s Division of Insurance. State statute asks us to go into far more detail for each individual expenditure, including determining their purpose(s) and beneficiaries, crafting performance measures to evaluate them, performing in-depth data analysis and economic outreach in order to determine whether they’re meeting their purpose(s), examining their economic costs and benefits, comparing them to similar expenditures or programs in Colorado and other states and, whenever possible, offering the General Assembly policy considerations to improve the administration, effectiveness and efficiency of each expenditure.

Let me briefly go over three takeaways from our evaluation process that your office might find useful.

State tax expenditures often interact with local and federal tax policies, which we try to elucidate through tables and graphics, such as an exhibit in our evaluation of the State Income Tax Refund Deductions showing how one particular income tax expenditure works.

First, diving deep into the legislative history of an expenditure is usually integral to figuring out why it was created in the first place and how to actually evaluate it (don’t get me started on late 19th-century bills…). But so is conducting historical research and chatting with industry stakeholders who have been in the business for years. While you might think it’s obvious why some tax expenditures were created – say, to incentivize charitable donations or to provide tax parity between different types of businesses – others are more vague, and small differences matter. For example, we inferred that a certain insurance premium tax deduction was created to lower tax-exempt employers’ costs of providing insurance to their employees. However, if our research had instead led us to believe that it was enacted to incentivize the provision of these insurance plans – a small but significant difference – that would have meant a very different type of analysis, and potentially even a different conclusion on the expenditure’s effectiveness.

We sometimes include simple infographics in our evaluations, like one on “tax pyramiding” within our evaluation of Colorado’s agricultural inputs sales tax exemptions.

Laying out each expenditure’s economic costs and benefits in a fair way is also crucial to allowing the General Assembly to assess it. Even though we always start with our Department of Revenue’s revenue impact estimate for each expenditure (if it published one), the additional time we have for each expenditure allows us to pull in other data sources, incorporate stakeholder feedback and, often, revise the revenue impact estimate. Similarly, we do our best to figure out how (if at all) the tax savings are spent by the claimants. Are they reinvested into their business or used for consumption, which would imply more of an in-state economic impact? Or are they kept as profits and put into savings or investment accounts (in which case, there would be less in-state economic impact, at least in the short term)? Are businesses passing the tax savings on to their customers entirely? Here, we have sometimes additionally found it valuable to conduct an “opportunity cost” analysis (using the IMPLAN economic impact analysis software) and compare an expenditure’s estimated economic impact to the estimated statewide impact if the General Assembly decided to eliminate the expenditure and spend the money on typical General Fund priorities instead.
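As a stylized illustration of the arithmetic behind such an opportunity-cost comparison, consider the sketch below. The multipliers and dollar figures are invented for the example; they are not IMPLAN outputs or Colorado estimates.

    # Hypothetical figures only: not IMPLAN outputs or Colorado estimates.
    revenue_forgone = 10_000_000      # annual cost of the tax expenditure

    # Assumed statewide output multipliers per dollar, under each use of the money.
    multiplier_if_kept = 1.4          # claimants spend or reinvest the tax savings
    multiplier_if_repealed = 1.6      # state spends it on General Fund priorities

    impact_if_kept = revenue_forgone * multiplier_if_kept
    impact_if_repealed = revenue_forgone * multiplier_if_repealed

    print(f"Estimated statewide impact if kept:     ${impact_if_kept:,.0f}")
    print(f"Estimated statewide impact if repealed: ${impact_if_repealed:,.0f}")
    print(f"Difference (repealed minus kept):       ${impact_if_repealed - impact_if_kept:,.0f}")

In a real evaluation, the multipliers would come from a modeled input-output analysis and would differ by industry and by how the General Fund money is assumed to be spent.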

IMPLAN has helped us provide scenarios of different economic impacts associated with certain tax expenditures, as well as what those impacts might look like if the State eliminated the expenditure and kept the money, such as in our evaluation of the Historic Property Preservation Credit.

Finally, the General Assembly has found our reports valuable enough to convene an interim study committee to meet and draft bills based on them this summer and fall, while the Legislature is out of session. This may be because we always try to link how each expenditure is currently being interpreted, administered, claimed and understood by stakeholders back to the General Assembly’s original intent in creating it. If there appears to be a disconnect or major issue, or if changing circumstances might merit another look at the expenditure, we briefly mention it at the end of our reports as a “policy consideration.” Unlike audit recommendations, which are based on strict criteria and recommend specific actions based on the reasons those criteria aren’t being followed, we’ve used policy considerations in our tax expenditure reports to identify issues legislators may wish to address without providing a recommendation on what should occur. This approach allows us to outline the issues and potential options legislators may want to consider while leaving the policy choices to them.

You can check out all of our published tax expenditure evaluations here and here.

Kevin Amirehsani is a senior legislative performance auditor/independent contributor with the Tax Expenditures team at the Colorado Office of the State Auditor.

Report Radar | Chris Latta (Pennsylvania)

“Men occasionally stumble over the truth, but most of them pick themselves up and hurry off as if nothing ever happened.” – Winston S. Churchill

Good day fellow seekers and welcome to the latest edition of the Report Radar. Here we have your one-stop shop for interesting searches for the truth (otherwise known as performance evaluations) from across the fruited plain. I, your humble scribe, shall act as Sherpa on our journey.

First in the hopper is a report from the Keystone State, where the Pennsylvania legislature’s Legislative Budget and Finance Committee (LBFC) recently released a performance audit of two permitting programs within the Pennsylvania Department of Environmental Protection. The permitting programs under review were the Erosion and Sediment Pollution Control (ESPC) program and the Water Obstruction and Encroachment (WOE) program.

LBFC found that stakeholders, both internal and external, expressed a wide variety of concerns about the management of the permit application process. The two issues raised most frequently were inconsistencies in the interpretation of the permitting programs’ regulations and requirements between and among DEP regional offices and County Conservation Districts, and the length of time for permit applications to be reviewed and disposed of.

The committee also found that DEP does not systematically collect, compile, analyze and report data to measure the performance of the County Conservation Districts or DEP regional offices that review permit applications. LBFC staff conducted a file review of DEP permits to gauge whether the County Conservation Districts and DEP consistently adhere to the required review process and internal controls established by the Department. After reviewing 440 files, LBFC found that DEP internal controls were so poorly followed they were rendered totally ineffective.

Finally, LBFC found that DEP does not document whether its ESPC and WOE programs are protecting the environment.

The report in its entirety can be found HERE.

Moving on to the Kansas Legislative Division of Post Audit, we found an interesting report evaluating Kansas’ Animal Facilities Inspection Program. The division found that although the program’s policy was insufficient to ensure inspections were consistent, it did not identify any significant problems in the inspections it observed. Specifically, the program’s policy adequately addressed 5 of 16 main categories of facility requirements or best practices identified by the division. However, the inspection program had no policy, or an inadequate policy, to address the remaining 11 areas. These areas included requirements for adequate temperature, space and feeding.

The Kansas Legislative Division of Post Audit also found that the inspection program’s policy did not address best practices, and penalties were not always consistent or appropriate. The division further found that while the Agriculture Department needed to use discretion, the agency lacked policy guidelines to ensure that discretion was consistent, appropriate and progressively severe across facilities.

Finally, our brothers and sisters in Kansas found the department needs to improve key processes related to oversight and training. They found the program does not set performance metrics for inspectors; processes need to be improved to ensure inspections are timely; and a lack of oversight most likely contributed to several late inspections and inconsistent penalties against facilities.

A complete copy of the report can be found HERE.

Next we take a trip down to the Old Dominion and the Virginia Joint Legislative Audit and Review Commission (JLARC), which conducted an evaluation of the implementation of the STEP-VA program. STEP-VA is a long-term initiative to improve community-based behavioral health services. JLARC found that while the implementation of same-day access to behavioral health assessments showed positive results, the program’s goals have not been achieved. Specifically, while consumers’ needs are assessed more rapidly, they are not receiving needed follow-up services more quickly.

JLARC also found that the second step of STEP-VA, providing primary care screening to consumers at higher risk for physical health issues, is on track. However, administrators are concerned that after this step is fully operational, it will detract from other, higher priority STEP-VA services, such as expanded outpatient and crisis services. A complete copy of this report can be found HERE.

Heading west to Washington’s Joint Legislative Audit and Review Committee (also JLARC), we found an interesting report on Tax Preference Performance Reviews. JLARC staff reviewed the performance of 17 tax preferences and compiled their findings into nine reports.

The largest of these is the aerospace tax preferences, at $569 million. This particular program applies to businesses that manufacture commercial airplanes, develop aerospace products or repair aircraft. The preferential tax treatments apply to business and occupation (B&O) tax rates, B&O tax credits, sales and use tax exemptions, property tax exemptions and leasehold excise tax exemptions. JLARC found that this tax preference program cut the effective tax rate for hypothetical large and small aerospace manufacturing firms by over 50 percent and improved Washington’s competitive position.

JLARC also conducted a review of a hog fuel tax preference. Hog fuel is wood waste or other residuals from lumber mills, construction or demolition sites. It also includes parts of trees and other woody debris from timber harvesting or forest thinning. The committee found that beneficiaries of the tax break are exceeding their goal of retaining 75 percent of jobs at their facilities. Sixteen facilities used the tax preference in 2017. Of those, 47 percent of employees earned annual wages of $60,000 or more. The average annual wage in the counties where these facilities are located is $51,000.

A complete copy of this report can be found HERE.

Well, there you have it, another edition (or maybe addition) of the Report Radar. If you have any suggestions for reports, studies, or reviews that are deserving of attention, please send them to me.

Chris Latta is a Project Manager with Pennsylvania’s Legislative Budget and Finance Committee.

Special Contributions

Assessing the Impact of Early College High Schools in New Mexico | Alison Nichols (New Mexico)

In 2019 the New Mexico Legislative Finance Committee (LFC), the state’s non-partisan evaluation and fiscal analysis body, conducted a quasi-experimental study to determine whether an innovative educational model – the early college high school – improves college outcomes. Using data from individual schools, the state’s education department and the National Student Clearinghouse, LFC evaluators found promising but preliminary effects from early college high schools, suggesting a need for further experimental research.

Addressing poor educational outcomes in New Mexico through the early college high school model. New Mexico ranks low nationally on many key indicators of high school and postsecondary success, with significant achievement gaps between different groups. The state had the second-lowest four-year high school graduation rate in the country in 2017 at 71 percent, and English learners and economically disadvantaged students had lower rates. The state also faces achievement gaps in college degree attainment, with 37 percent of white residents earning a bachelor’s degree or higher, compared to 13 percent and 9 percent, respectively, of Hispanic and Native American residents. Low high school graduation and college degree attainment rates have effects on residents’ job outcomes, lifetime earnings and other important indicators.

One model designed to increase students’ chances of postsecondary success is the early college high school (ECHS). ECHSs combine high school coursework with structured college coursework, industry certifications and work experience, allowing students to earn college credits and complete some or all of an associate degree while still in high school, or earn an industry-recognized workforce certification. ECHSs target students underrepresented in higher education, such as low-income students or would-be first-generation college students.

The ECHS model has expanded in New Mexico over the past decade, with approximately 3,000 students in 20 schools. While research from other states shows positive effects on student high school and college achievement, there had not been any rigorous analysis of New Mexico outcomes. The LFC wanted to answer the following question: Does attending an ECHS lead to better postsecondary outcomes for students, compared to similar students who did not attend an ECHS?

Do early college high schools in New Mexico work? In order to understand whether ECHSs lead to better college outcomes for New Mexico students, the LFC developed a quasi-experimental research study in partnership with the Abdul Latif Jameel Poverty Action Lab (J-PAL) at the Massachusetts Institute of Technology. Randomized admissions lotteries at some ECHSs created an opportunity to track the outcomes of similar students – that is, students who wanted to attend an ECHS, but who were randomly accepted or not accepted for admission. The LFC identified two schools that implemented randomized lotteries and retained records, and used those records to create a treatment group (ECHS students) and a control group (non-ECHS students). Using college records from the National Student Clearinghouse, LFC evaluators identified degree attainment for students in each group and analyzed differences in outcomes, controlling for low-income and English learner status, to further ensure that the two groups were similar in terms of key demographic indicators. J-PAL provided technical support from its network of academic researchers.
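For readers interested in the mechanics, a comparison like the one described above boils down to regressing an outcome on a lottery-win indicator plus covariates. The sketch below uses synthetic data; the variable names and the linear probability model are illustrative assumptions, not the LFC’s actual code or results.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 500
    df = pd.DataFrame({
        "echs": rng.integers(0, 2, n),             # 1 = won the admissions lottery
        "low_income": rng.integers(0, 2, n),       # covariate: low-income status
        "english_learner": rng.integers(0, 2, n),  # covariate: English learner status
    })
    # Synthetic outcome: degree attainment with a small built-in ECHS effect.
    p = 0.10 + 0.04 * df["echs"] - 0.03 * df["low_income"]
    df["degree"] = rng.binomial(1, p.clip(0, 1))

    # Linear probability model: the coefficient on echs estimates the difference
    # in attainment between lottery winners and losers, holding covariates fixed.
    model = smf.ols("degree ~ echs + low_income + english_learner", data=df).fit()
    print(model.params["echs"], model.pvalues["echs"])

With only two schools and modest samples, standard errors matter as much as point estimates, which is one reason the LFC characterizes its findings as promising but preliminary.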

Promising degree attainment outcomes. The analysis found statistically significant effects on degree attainment for students who attended an ECHS, compared to those who did not attend. Key findings included the following:

  • Almost 15 percent of students who attended an ECHS attained a bachelor’s degree, compared to 11 percent of control group students, while 9 percent of students who attended an ECHS attained an associate degree, compared to only 5 percent of control group students.
  • Students who attended an ECHS for the full three-year duration of the program were more likely to attain a degree than those who attended for just one or two years.

Further research on early college high schools in New Mexico needed. The effects from the LFC study are promising and provide evidence that may support expanding the ECHS model further in New Mexico. However, conclusions from the study are limited by a few factors. The study only examined two schools, and while using lottery data reduces the likelihood of meaningful differences between the treatment and control groups, there may be other differences that do not show up in student records.

The LFC recommends further, rigorous evaluation of the ECHS model in order to assess the broader impact of these schools in the state. For example, requiring schools to implement and keep records on randomized admissions lotteries for a certain period of time could allow researchers to assess the effects on larger, randomized treatment and control groups.

Alison Nichols is a Senior Fiscal Analyst with New Mexico’s Legislative Finance Committee.

A "New" Perspective | Emily Johnson (Texas)

In an effort to engage newer staff in NLPES activities – beyond attending the PDS – I interviewed four individuals new to the field of program evaluation to hear what they think of the job so far and get their perspective on a variety of topics.

Q: You all indicated our offices don't always do the best job communicating what exactly performance auditing/program evaluation is. Is that something we should improve and if so, how?

Jacob (North Carolina): First, though each of us uniquely found our way into the program evaluation universe, it would be without aggrandizement to suggest that increased advertisement of the position – both in its importance to governing and government and in its applicability to a wide range of specialties – would have helped grease the wheels, so to speak. The next generation of program evaluators, trained in modern techniques and replete with capabilities to take program evaluation and mold it to new challenges in the field, will only benefit state legislatures across the country.

Second, many reports conducted by our offices are not just used as legislative white papers and sealed in a legislative vacuum where only legislators can access them with a biomarker designed for legislators. Public engagement with our reports, improved by a wider and improved communication strategy, would increase the readership of the work. From academics to industry groups, private citizens to concerned voters, program evaluation can be a tool to shed light on possibly byzantine subjects. We should not keep this tool in the dark.

Mikayla (Texas): Yes, I get the impression from the public and even some of the agencies we review that our mission, function and value to the state are often misunderstood. I believe organizations in our field can communicate our value more effectively by creating or enhancing public engagement across multiple mediums, such as our websites, social media platforms and webinars, or by hosting talks or seminars. We must also increase the accessibility of our work, including providing documents and resources in other languages. We should make more of an effort to measure our own successes more holistically and advertise our successes.

Shanika (Pennsylvania): Yes, communicating what exactly performance auditing/program evaluation is should start with the initial job posting. It would be most helpful for state entities to invite applicants to their website or social media page(s) and tell them where they can find examples of previous audits/program evaluations the office has completed. This in itself will show the potential applicant the vastness of the projects.

Also, mentioning the specific degrees that are most suitable/relatable to the position (i.e., Master of Public Administration) will allow a career seeker to understand how their skills may be transferable to auditing/program evaluation. In addition, adding key words such as research, statistics, etc., would be more helpful than just a job title per se, which may be vastly different from their previous job title(s).

Tanner (Kansas): I believe that communicating the truth of what our work requires will always be difficult and perhaps impossible to do perfectly. I also believe we should continue trying to craft how we characterize our work to prospective auditors. At my office, that means that we have our core mission statement and we build around that, then we emphasize how the work can vary from audit to audit. For now this seems sufficient to me, but it is important to continuously re-examine the process to be sure that how we describe the work to new auditors is helpful to their early development.

Q: Were there any expectations you had about the profession that you've since realized were completely unrealistic or didn't match up with reality? 

Jacob: I was told repeatedly during the interview process that one must become immediately comfortable with the realization that each day will not be the same. This dynamic workstyle is unique, and what has taken me over a year of on-the-job experience to understand somewhat is this strange conclusion about this profession: one never truly masters the job. The processes, project and time management, communication skills, summation and analysis skills, even writing will improve with time and experience. However, as we are constantly evaluating differing topics, from the needle to the entire haystack so to speak, the learning process is endless. 

Mikayla: Since starting my role, I have been surprised with how inconsistent (or non-existent) data about agencies and programs actually is. I was also surprised to find that because of the nature of our work, we try to grow as generalists, not necessarily as subject matter experts on any one sector. We value the ability to adapt and learn about many subjects in a short amount of time.

Shanika: Having never worked in state government, I had no idea how it all worked. I was extremely naive to think resources to meet an objective(s) were limitless and that we all work in unison. What I quickly realized is that all state entities are in a sense separate, and there are processes in place to govern each individually. In addition, I expected the working relationships to be easy and information forthcoming, and that is not always the case for one reason or another.

Q: What do you rely on the more senior auditors/evaluators in your office for most?

Jacob: As a neophyte to government service, I am constantly looking to the experienced evaluators in my office for advice and guidance. This mainly includes strategies for working with agencies to receive critical data or necessary information to inform our evaluations. But additionally, this includes my attempt to collect as much information on anything related to our state government apparatus. Often the missing link in forming evaluations of a particular agency or policy requires the possession of some desultory piece of knowledge that would have otherwise been impossible to come across had one not been present at that moment. For example, didn’t you remember that that agency was originally under this department in 2009, but after the reorganization was moved under this other department? This is not information one learns in graduate school, or in reading training manuals, but rather is learned on the job, in working with more experienced peers. 

Mikayla: I look to the senior staff for mentorship and feedback. It is helpful to know that I can go to them, ask for honest feedback about my progress, and understand how I can continue to grow. I also go to them to ask questions about how we approach our work. They are a great resource within the office.

Shanika: To help me navigate the auditing/evaluation process – whether it be interpreting legislation, understanding the objectives of a project, policies and procedures and/or to just simply brainstorm.

Tanner: I rely on Senior Auditors primarily for an example to follow. I watch for how they conduct themselves in an in-person interview, how they respond to certain auditee behavior and I even watch how they structure their calendars and manage their time.

Q: What one piece of advice would you give someone just entering the profession?

Jacob: Suspecting that many new evaluators may feel some unsettling combination of confusion at how to contribute and a lack of experience, I would suggest they take ownership of some unique corner of their first project. It will likely be a subject not researched previously. But it will benefit the project if the new member can become a resource on a particular aspect, be it the financial data, the methodology, the literature review, the legislative history, the related state and federal agencies if applicable and so forth.

While this strategy will benefit the project as it gains a valuable specialized resource, it will also provide a buoy of confidence for the new evaluator. To play out this metaphor, this will avoid a feeling of being lost at sea in the first few weeks or even months. As reports are generated and data is analyzed, the evaluator will have a steady workflow and will have produced an outcome that demonstrably aids in his/her first project.

Mikayla: Learn as much as you can and ask a lot of questions. Be patient with yourself and understand it takes time to adjust to such a unique role.

Shanika: ALWAYS remain objective when approaching every audit/program evaluation.

Tanner: Always take time to review every piece of work as if you were your own supervisor. It helps you view your work from a more neutral perspective and you will be more likely to catch areas that need improvement.

News and Snippets

The Election Results Are In! | Linda Triplett (Mississippi)

As the 2019 NLPES elections chair, I'm pleased to announce that the winners of this year’s election are Emily Johnson, Mary Jo Koschay, Kiernan McGorty, and Paul Navarro. Please join me in congratulating our two returning members of the NLPES Executive Committee (Emily and Kiernan) and two new members (Mary Jo and Paul).

This year’s election was very competitive. A special thanks to everyone who voted and ran for office. We received a total of 276 ballots from 19 states. Given the high vote counts for each of the six candidates on this year’s ballot, I would strongly encourage those who didn’t make it this year to run again next year.

Your 2019-2020 Executive Committee will begin its work at the NLPES Professional Development Seminar in Park City, Utah, with the following 12 members:

  • Erik Beecroft, Virginia Joint Legislative Audit and Review Commission
  • Patricia Berger, Pennsylvania Legislative Budget and Finance Committee
  • Jon Courtney, New Mexico Legislative Finance Committee
  • Emily Johnson, Texas Sunset Advisory Commission
  • Mary Jo Koschay, Michigan Office of the Auditor General
  • Karen Leblanc, Louisiana Legislative Auditor’s Office
  • Kiernan McGorty, North Carolina Program Evaluation Division
  • Paul Navarro, California State Auditor’s Office
  • Kristen Rottinghaus, Kansas Legislative Division of Post Audit
  • Shunti Taylor, Georgia Department of Audits and Accounts
  • Eric Thomas, Washington Joint Legislative Audit and Review Committee
  • Linda Triplett, Mississippi Joint Legislative Committee on Performance Evaluation and Expenditure Review


We work to support you in your important work as legislative program evaluators and performance auditors. Please share your suggestions and ideas as to how we can better serve you in the year ahead.

And the Winner Is... | Shunti Taylor (Georgia)

Congratulations to the 2019 NLPES Award recipients! NLPES awards recognize exceptional performance among our offices. This year’s award winners are:

  • Kevin Levine (Texas) for the Outstanding Achievement Award;
  • Louisiana Legislative Auditor for Excellence in Evaluation;
  • Kansas Legislative Division of Post Audit, Louisiana Legislative Auditor, and Virginia Joint Legislative Audit and Review Commission for Excellence in Research Methods; and
  • Certificates of Impact were awarded to 28 offices in 26 states.

For a complete list of award winners and award-winning reports, visit the NLPES awards webpage. All awards will be presented during the awards luncheon at the NLPES Professional Development Seminar in Park City, Utah in September.

Thanks to all the offices that submitted applications for awards. And a special thanks to our judges—your contribution to the success of this year’s awards cycle is greatly appreciated.

Stop the Presses! Performance Reports in the News | Joshua Hooper (California)

Program Evaluation Leading the Way!

2019 started off with no shortage of examples of program evaluation in the news. Here is just some of the media coverage our offices received from the first half of 2019. These articles show how program evaluation is making a difference throughout the country.

ARIZONA: Office of the Auditor General

Audit finds classroom spending up in AZ, teacher pay below national average

Auditor General: Hit by Recession, Arizona's Water Department 10 Years Behind

Arizona not inspecting marijuana edibles kitchens, audit finds

__________________________________

ARKANSAS: Division of Legislative Audit

Audit of health marketplace finds clean transfer to agency

__________________________________

CALIFORNIA: State Auditor

Audit: California State University Stashed Away $1.5 Billion, Raised Tuition

Investigations of alleged misconduct by California judges fall short, audit finds

Assemblymember Rudy Salas holds joint oversight hearing on Medi-Cal

___________________________________

COLORADO: Office of the State Auditor

Colorado state auditor: Inspection program must prioritize high-risk wells

State Audit Finds Critical Problems With Colorado Department of Transportation’s Budget

Audit: Youth Services inflated results

___________________________________

CONNECTICUT: Performance Audit Unit

Auditors: State may be handing out Medicaid to ineligible people

___________________________________

DISTRICT OF COLUMBIA: Office of the D.C. Auditor

D.C. Auditor Report Suggests Government Money Intended for Specific Purposes Has Been Misused

Conditions In The D.C. Jail Are Unsafe And Unsanitary, D.C. Auditor Says

Audit: Bowser administration awarded millions to developers with low-ranking proposals

___________________________________

FLORIDA: Office of Program Policy Analysis & Government Accountability

New report shows spike in sex offenders, predators in Florida

Legislative report identifies prison population reduction strategies

___________________________________

GEORGIA: Georgia Department of Audits and Accounts

Online charter schools continue to struggle, state reviews show

Audit Finds University System Shorted the Teacher Retirement Fund

Audit: Georgia DOT should do more to justify road projects

___________________________________

HAWAII: Office of the Auditor

Audit finds instances of potential fraud at recycling redemption center

State Auditor Blasts DAGS For Failure to Verify Rail Invoices

___________________________________

ILLINOIS: Office of the Auditor General

Audit finds DCFS overwhelmed, underperforming as lawmakers promise to fix it

Audit: Slow Legionnaires’ response at Illinois veterans home

State health portal costing, not saving, money, audit finds

___________________________________

KANSAS: Legislative Division of Post Audit

Audit: Tweaking Kansas sales tax law may secure $70 million in new revenue

___________________________

LOUISIANA: Legislative Auditor

Legislative Auditor: Louisiana needs easier way to dissolve dying cities

Louisiana Has Fragmented Response to Elderly Financial Abuse

Auditor: LSU may have violated state Constitution in handling of contract with private company

___________________________________

MAINE: Office of Program Evaluation and Government Accountability

Report ordered after deaths of 2 girls highlights challenges of child protective caseworkers

___________________________________

MARYLAND: Office of Legislative Audits

Audit finds conflict of interest in Maryland Transit Administration contracts

Audit: Firms of State Venture Fund's Advisers Got $21M

Auditor’s report critical of Maryland’s State Highway Administration

___________________________________

MICHIGAN: Office of the Auditor General

Auditor General Report Questions Michigan Strategic Fund and MBDP Jobs Program

State gave out $2.3M in ‘inappropriate’ public assistance, child support audit finds

State partly complies with post-Flint recommendations

___________________________________

MINNESOTA: Office of the Legislative Auditor

Auditor: No evidence MN child care aid funded terrorism

Legislative auditor: Agency leaders fell short launching MNLARS

Legislative auditor calls for shift in Minnesota's child-care fraud investigations

___________________________________

MISSISSIPPI: Legislative PEER Committee

PEER examines daily cost to house inmates and it has real implications for private prison operators

Legislative assessment of juvenile justice programs prompts changes

__________________________________

MONTANA: Legislative Audit Division

Audit Finds Montana Secretary of State Misused State Vehicle

Audit spurs skepticism of university system spending

__________________________________

NEBRASKA: Legislative Audit Office

Audit: Two tax credits aren't working as intended

__________________________________

NEVADA: Legislative Counsel Bureau Audit Division

Nevada missed $500K in tax revenue from marijuana industry, audit shows

___________________________________

NEW MEXICO: Legislative Finance Committee

Report Says Shrinking Ridership Hampers New Mexico Commuter Rail

___________________________________

NORTH CAROLINA: Program Evaluation Division

WATCHDOG REPORT: NORTH CAROLINA HURRICANE FUNDS DELAYED

N.C. state lawmaker talks changes to ABC system after new report released

State evaluators want North Carolina's economic development agency to make changes

___________________________________

OKLAHOMA: Office of State Auditor and Inspector

Investigative Audit Shows Grady County Officials Were Overpaid For 11 Years

Audit Questions $10 Million in Fund Use by Oklahoma DHS

State Auditor Office Releases Chickasha Grade Tampering, Misuse Of Funds Findings

___________________________________

OREGON: Audits Division, Secretary of State

Oregon Human Services in-home care still lags behind audit benchmarks, report says

Oregon marijuana regulators fail to meet even basic standards, state audit finds

Oregon child welfare agency fails to complete most recommendations in 2018 state audit

___________________________________

RHODE ISLAND: Auditor General

Auditor: State Paid $11 Million in Medicaid to Dead People

___________________________________

SOUTH CAROLINA: Legislative Audit Council

A Charleston resident won the lottery 125 times. State auditors think something’s up

___________________________________

TENNESSEE: Administration and Performance Audit Division of State Audit

Audit: Leaks Found at New $160M Tennessee State Museum

___________________________________

TEXAS: State Auditor’s Office

Manley Says Austin Police Agree With Much Of State Audit Of How It Classifies Rape Cases

114 years of waiting: Callers kept on hold by Texas state agencies

___________________________________

TEXAS: Sunset Advisory Commission

TEXAS IS ABOUT TO BECOME THE WILD WEST OF PLUMBING: 'WE CAN ALL BECOME PLUMBERS'

Lawmakers Need to Decide Nature of Texas Windstorm Insurer, Sunset Commission Says

___________________________________

UTAH: Office of the Legislative Auditor General

'Broken system' puts University of Utah lab workers at risk, audit says

State audit concludes Tooele County mishandled sale of the late Larry Miller’s old raceway, cost taxpayers millions

Utah’s suicide hotline sees an increase in calls but not funding

___________________________________

VIRGINIA: Joint Legislative Audit and Review Commission

JLARC review sharply critical of STEP-VA implementation

Tax break helps make Virginia the leader in data centers; JLARC suggests tweaks, not big changes

___________________________________

WEST VIRGINIA: Post Audit Division

Audit: West Virginia government doesn’t know how many guns it has

Audit shows Jobs Act has done little for state unemployment

___________________________________

WISCONSIN: Legislative Audit Bureau

Audit finds ongoing problems with WEDC, tax credits for jobs created in other states

Audit: UW Tuition Revenue Grew $366.6M Over Last Decade

Audit Bureau: DOT Still Not Complying With The Law

___________________________________

Share your coverage with us! If you would like us to highlight media attention about your reports in our next newsletter, send the hyperlinks to Emily Johnson.

Check It Out: Websites, Professional Development and Other Resources

Coming Soon to the NLPES Home Page—Keep an eye on the NLPES home page as you will soon see the appearance of the Report Library to assist you in finding recent reports from other states. The Report Library lists the titles of recent reports by subject area that may be of interest in your work. We will be testing it with reports released in 2018 by the agencies represented on the NLPES Executive Committee. If your response to using the Report Library is positive, we will expand it to include all offices and all reports released in the last five years.

Ready for your close-up? Continue to check the NLPES home page for the new photo ribbons from the New Orleans PDS. Thank you to all who participated.

NLPES Listserv—The NLPES listserv is an email discussion group for NLPES members. By sending a message to the NLPES listserv, you can reach all listserv subscribers simultaneously. Listserv members can query other states about evaluation work similar to their own projects, receive announcements about performance evaluation reports and job opportunities from other states, and are notified when the latest edition of this newsletter is available! To join the listserv, send an email to Brenda Erickson, NCSL liaison to NLPES, with the subject “SUBSCRIBE to NLPES Listserv.” Include your name, job title, audit agency/organization name, mailing address (including city, state, zip code), phone number and email address. A “Welcome” message will be sent to you once you are successfully added to the listserv. See the listserv link on the NLPES website for additional information on how to post messages and “netiquette” niceties, like not hitting “Reply All” when answering a query posted to the listserv.

Are you receiving our listserv emails? Some states’ systems block NLPES listserv emails. If you think you are not receiving our emails, please check your state’s security system and spam filters, and/or contact Brenda Erickson.

Legislative Careers Website—Know someone thinking about pursuing a career with a state legislature? Point them to the opportunities posted on NCSL’s legislative careers website. Job seekers can explore the types of work legislative staffers perform, including performance evaluation, budgeting, fiscal analysis, legal and policy research and opinions, bill drafting, public relations, library services, building security, and information technology support. Opportunities are posted by states offering positions under Legislative Jobs. Attracting young people to work as legislative staff will be increasingly important in the coming years: even though baby boomers make up about a third of the national workforce, nearly half of legislative staff responding to a survey were 50 years old or older. Replacing those staffers will present challenges. Check out the welcome video, “A Day at the Capitol,” and learn more about the opportunities and rewards of working for state legislatures. Watch the videos under Career Paths to hear from staffers in different states describing their jobs and the satisfaction they gain from a public service career.

NLPES’ Professional Development Resources—Visit our NLPES online training library for a variety of refresher and training materials! There are nearly two dozen resources on planning and scoping, fieldwork, writing and publication, and management. Most are PowerPoint slides; some are narrated; a few are webinars or podcasts. Check them out.

Ask GAO Live—AskGAOLive is a 30-minute interface where GAO staff chat about a specific report and research, and answer questions that are emailed or tweeted in. Sessions are recorded and archived on the website. You can also “follow” GAOLive to receive advance notice of chat sessions. Topics include veterans and higher education, prescription drug shortages, prison overcrowding, state and local fiscal outlook, and government contracting.

Ensuring the Public Trust—What’s the most common internal performance measure for evaluation shops? How many offices tweet? What percentage of staff has fewer than 10 years of experience? How can you contact a sister office in another state? Ensuring the Public Trust summarizes information about legislative offices conducting program evaluations, policy analyses, and performance audits across the country.
