The Working Paper is the official newsletter of the National Legislative Program Evaluation Society. NLPES serves the professionals of state legislative agencies engaged in government program evaluation. The purposes of NLPES are to promote the art and science of legislative program evaluation; to enhance professionalism and training in legislative program evaluation; and to promote the exchange of ideas and information about legislative program evaluation.
Summer is upon us, marking the beginning of well-deserved vacations, more time with family, and at least for some of us, unbearable humidity! For NLPES, summer is also the start of our most intensive training activities. For those of you attending the NCSL Summit in Chicago Aug. 8-11, look for the many events and sessions designed specifically for legislative staff, including a timely workshop on evaluating tax incentives, which is co-sponsored by NLPES.
We are also coming up on the annual Professional Development Seminar (PDS), which will take place Sept. 25-28 in Jackson, Miss. Our friends in the Mississippi PEER office have worked hard to develop interactive sessions that will provide new evaluators with basic skills, strengthen our technical skills, and help us realize our potential. The deadline to register is Sept. 16, and hotel rates are guaranteed through Sept. 5. We hope you will join us in Jackson to learn from our shared experiences, network with your peers across the country, and importantly, understand that we are not alone in dealing with complex problems facing state government.
This is the time of year when we pause to recognize the exceptional work that you and your offices do for the public good. The winners of the 2016 NLPES awards will be announced shortly, and awards will be presented at the PDS in September. Thank you to all the judges for their time and effort. And thanks to each of you for your dedication to improving state government.
Elections took place this spring to fill four upcoming vacancies on the NLPES Executive Committee. Congratulations to Greg Fugate (Colorado) on being re-elected to the Executive Committee, and welcome to Melinda Hamilton (Michigan), Emily Johnson (Texas), and Kiernan McGorty (North Carolina). We look forward to the talent, enthusiasm, and fresh ideas they will bring to the committee.
As I conclude my term as chair, I want to thank you for the opportunity and tell you what a pleasure it has been to work with you. It has also been a privilege to work alongside Executive Committee members whose dedication, initiative and knowledge are truly inspiring. During our PDS, Greg Fugate from the Colorado Office of the State Auditor will take over as chair. I have had the distinct pleasure of working with Greg for several years, and am delighted that he will bring his natural leadership, enthusiasm, and wealth of knowledge and experience to bear to an even greater degree in this new role.
I hope to see many of you in Jackson for the NLPES Professional Development Seminar.
Nathalie Molliet-Ribet is the 2015–2016 NLPES Executive Committee chair. She can be reached at email@example.com
Summer salutations, auditors and evaluators of the world! At this time of year the Report Radar inbox regularly gets clogged with pleas for summer reading recommendations. All of us know that there is nothing more relaxing on a long beach vacation than immersing yourself in a great audit/evaluation report from one of our member offices. Featured in this edition are reports on some of our favorite topics, including Medicaid, the judicial branch, workforce development, education, and taxation. Whether you read one or all of these marvelous reports, your time away from the office will have been well-spent.
We begin with a couple of reports looking at Medicaid issues. First up is a February report from our people in Florida, which discusses reorganization of the state’s Medicaid program integrity functions. The report focuses on efforts to improve oversight and combat fraud and abuse in Medicaid managed care plans, and highlights activities relating to recovering overpayments and investments in technology systems and data analytics. If Medicaid is in your future, we would also recommend a March report from North Carolina. This report looks at Medicaid eligibility determinations, with a focus on the timeliness of eligibility decisions and relationships between timeliness and workload.
Next is a duo of interesting reports addressing different functions within the judicial branch of government. If your office has authority or responsibility relating to the state bar, then you will be interested in a May report from California discussing the organization that licenses and regulates those practicing law. This report discusses various issues with financial reporting and executive salaries, as well as needed improvements in transparency. Our other judicial branch report comes from Colorado and addresses the state’s Independent Ethics Commission, which deals with ethics complaints against public officials. Many states have these kinds of entities to address ethics complaints, although not all are located within the judicial branch. It is an interesting report and well worth a look if you are going to review a similar organization.
Our workforce development section for this issue features three different reports, all released in April. Washington had a look at the state’s Training Benefits Program, which provides additional unemployment benefits for people training for high-demand jobs. Part of the report included analysis of the program’s effects on participants’ earnings over time. Outcomes from workforce investment programs were also a focus of a report from New Mexico. This report is the fourth in a series and addresses local boards that are responsible for implementing state job training and employment programs. Further emphasis on the importance of outcomes-based evaluation of workforce programs comes in a report from New Hampshire, which assessed the WorkReadyNH program, offered through the state’s community colleges to provide skills training. This performance audit found a lack of consistent data and unclear statutory definitions for the program’s mission, goals, and objectives.
This month’s education section features four different reports on diverse subjects. In March, our friends in Minnesota released a report looking at teacher licensure, which identified concerns with confusing and poorly defined statutory standards and inconsistent decision-making in the process for licensing teachers and other education professionals. Also released in March was a report from Arizona that looked at the state’s K-3 Reading Program, designed to improve reading proficiency for students through the third grade. Recommendations in this report addressed the need for improved program implementation and consideration of statutory changes relating to program oversight. We also have two education reports to recommend that discuss issues relating to virtual schools, which primarily offer classes or content online. In February, Georgia released a report addressing the state’s virtual charter schools, which discusses various issues, including measures of academic performance and different funding models for virtual charter schools. The Tennessee Office of Research and Education Accountability has also tackled the issue of virtual schools and released a report in March discussing how school districts operate and fund these schools.
We finish this edition with a look at some taxation-related reports. Both California and Louisiana recently reported on the issue of tax expenditures and exemptions/exceptions, and how legislative scrutiny of measures that effectively reduce tax revenues can be improved. The California report was released in April and focuses on corporate tax expenditures, which are tax benefits for qualifying corporations that can have the same effect as government spending programs. These expenditures totaled more than $5 billion, and the report discusses improvements in legislative oversight of certain credits or exemptions to ensure they are being evaluated for effectiveness. Similar themes emerge in the Louisiana report, which discusses a wide variety of exemptions or exceptions in the state’s tax code that, in aggregate, were projected to exceed revenue collections in FY 2015. The Louisiana report also addresses the potential for improved legislative oversight and discusses the experience of other states that have implemented more structured approaches to these tasks.
Great reports, people! After all this hard work, we deserve some relaxation, so Report Radar wishes everybody a restful summer. We will see you back here, recharged and ready to go in the fall edition.
Angus Maciver is the Legislative Auditor for the Montana Legislative Audit Division. He can be reached at firstname.lastname@example.org
Our Goal: To provide members of the Pennsylvania General Assembly with accurate and unbiased information and analysis to inform their policy decisions.
Our Governing Body: The Legislative Budget and Finance Committee (LBFC) is a bipartisan, bicameral legislative service agency consisting of 12 members of the General Assembly. The membership is divided equally between the House and the Senate and Republicans and Democrats. The members elect their own officers who represent the four caucuses. Members are appointed by the leadership in both chambers for a two-year legislative session.
Our Charge: The LBFC was established by statute in 1959 to conduct studies and make recommendations aimed at eliminating unnecessary expenditures; promoting economy in the government of the commonwealth; and ensuring that state funds are being expended in accordance with legislative intent and law. From 1981 to 1991, the LBFC was also responsible for conducting Sunset Audits under the Sunset Act. More recently, our projects have veered into prospective policy analysis and policy evaluations.
Projects are assigned:
Our Staff: The executive director is appointed by the committee and is responsible for the direction of the committee's staff and activities. In addition to our executive director, we have 10 staff, including people with graduate degrees in public administration, business administration, and law, and undergraduate degrees in accounting, economics, history, political science, and labor studies. We function primarily as generalists, and the staff's experience covers a wide range of topics, including, among others, health and welfare, transportation, economic development, and law enforcement.
Our Process: After we receive a project, it is assigned to a project team based on subject matter expertise/familiarity, subject matter interest, or (my personal favorite) staff availability. Once a draft report is completed, our statute requires the draft to be provided to the auditee for its review and comment. These comments are included as an appendix to the final report and run the gamut from a polite policy statement highlighting the positive aspects of our review to detailed disagreement with our analysis and recommendations, although they are generally somewhere between the two extremes.
Our reports remain confidential until they are released at a public meeting of our committee, at which the agency under review is invited to participate. After that, our reports are available to the public on our website or by contacting our office.
Our Aftermath: The release of a report ends our direct involvement with the agency or program that was the subject of the report. At this point, any additional actions related to the project are responsive to requests. For example, we testify at committee meetings, respond to press requests, speak to interest groups, and review draft legislation related to our findings and recommendations. We do not, however, “lobby” for our report recommendations.
Our NLPES Involvement: Our staff attend and present work at NLPES’s annual professional development seminars, have served (and are serving) on the NLPES Executive Committee, and hosted the annual professional development seminar in 2005. Several of our reports have received NLPES awards.
Our Secret Weapon: We have subpoena power . . . but we have never had to use it.
Patricia Berger is senior counsel and project manager at LBFC. She can be reached at email@example.com.
Photo credit: The photo was taken by Doug Gross, Pennsylvania Senate Republican Communications.
The Role of Research Methodologist | Edward Seyler (Louisiana)
As research methodologist for the Louisiana Legislative Auditor (LLA), my goal is to help teams analyze issues in state government that require quantitative research methods. All government programs can be analyzed according to some kind of logic model that explains how they turn their inputs into the desired outputs or eventual outcomes. For most programs, auditors do not need to use advanced statistical techniques to put together a logic model that explains how a program could be improved. Nevertheless, some programs operate according to logic models that can only be fully understood with the use of advanced quantitative methods. We can now use these methods to assist in answering our audit objectives.
One such example is our 2015 performance audit on the Louisiana Lottery Corporation. The Louisiana Legislature asked us to evaluate whether the lottery was operating in a manner that would maximize proceeds to the state. After we began our audit, we discovered that the lottery had reduced its costs by switching vendors and had reinvested the savings by embarking upon a five-year plan to increase scratch-off prizes. Auditors wanted to know whether this had actually increased proceeds for the state. To answer this question, we developed an economic model that could explain how demand responds to changes in prizes, and we used a regression analysis to measure how much demand would increase for a given increase in prizes. We found that higher prizes had indeed led to higher proceeds for the state, and that further increasing prizes could increase proceeds for the state by another $3.8 million each year.
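The article does not reproduce LLA’s actual model, but the core technique, a log-log regression whose slope estimates the elasticity of ticket demand with respect to prizes, can be sketched in a few lines of Python. Everything below (the payout range, the elasticity value, the noise level) is synthetic and purely illustrative:

```python
import numpy as np

# Illustrative only: synthetic data standing in for lottery sales periods.
# "payout_ratio" is the share of sales returned as prizes; "sales" is ticket
# revenue. The true LLA model and data are not reproduced here.
rng = np.random.default_rng(0)
payout_ratio = rng.uniform(0.50, 0.65, size=40)      # hypothetical payout range
true_elasticity = 1.8                                # assumed for the demo
sales = 100.0 * payout_ratio**true_elasticity * np.exp(rng.normal(0, 0.02, 40))

# Log-log OLS: ln(sales) = a + b * ln(payout_ratio), so the slope b is the
# elasticity of demand with respect to the prize payout.
X = np.column_stack([np.ones_like(payout_ratio), np.log(payout_ratio)])
coef, *_ = np.linalg.lstsq(X, np.log(sales), rcond=None)
elasticity = coef[1]

# Net proceeds = sales * (1 - payout_ratio); when the elasticity is high
# enough, raising prizes lifts sales by more than the extra prize cost.
print(f"estimated elasticity: {elasticity:.2f}")
```

An elasticity estimated this way is what lets an analyst project whether a further increase in prizes would raise or lower net proceeds to the state.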
In a more recent example, our 2016 performance audit on Louisiana’s inventory tax credit found that inventories in Louisiana have, in recent years, grown faster than what can be explained by underlying economic conditions, such as growth in inventory-intensive industries, growth in national inventories, or changes in oil prices. This excess growth helps local governments, which collect ad valorem property taxes on inventories; but it poses risks to the state, which grants businesses a refundable tax credit against their state income tax for the amount of ad valorem property taxes paid on inventories. For the 2014 tax year, our regression analysis estimates that the discrepancy between actual and expected inventories cost the state $47.8 million in lost income tax revenues. The audit also points out specific weaknesses in the administration of the credit that create opportunities for businesses to reclassify property as inventory so that they can claim the credit.
My experience has been that I can be most useful to teams by being able to wear many hats, operating sometimes as a research methodologist, at other times as an auditor. The fact that I worked as a staff auditor for a year—and as an intern for a year before that—helps in this regard. Sometimes, teams need for me to be a research methodologist and run regressions or use calculus to derive first-order conditions for an optimization problem. At other times, teams need for me to be an auditor and document what I did in a workpaper or develop language for the final report. Audit teams typically conduct their background research, identify issues, and produce a draft audit plan before formally meeting with me. We make adjustments to the audit plan as needed, but from that point forward I function as a team member.
Our goal for the future is to continue to think of new ways to use quantitative research methods in performance audits. Auditors who work on projects that use quantitative methods gain familiarity with using these methods and suggest new areas that could benefit from quantitative analysis. We hope to build on this momentum moving forward so that we can provide the Legislature and the public with a deeper understanding of the most important issues facing the state.
Edward Seyler is the research methodologist at LLA. He can be reached at ESeyler@LLA.La.gov.
College Readiness of Tennessee Students | Russell Moore (Tennessee)
Has your office ever set out to study an issue only to find out during your research that fundamental data were not available? It happened to Tennessee’s Office of Research and Education Accountability (OREA) last year during a study of college readiness and remediation in higher education. This edition of “What We Do” explains how OREA tackled this problem by developing a research methodology to produce formerly unavailable comprehensive statewide data on the percentage of students who were not academically ready at each public community college and university in Tennessee.
College readiness and remediation in higher education was identified by OREA last year for an office-generated project based on sustained interest from state legislators and policymakers. National research shows large numbers of students graduate from high school not academically ready for postsecondary education and require remediation coursework. An early, basic step in the project, so we thought, was to obtain data on the number of students who required remediation in Tennessee’s community colleges and public universities. We were surprised to learn, however, that comprehensive statewide remediation data were simply not available.
This was obviously a problem but it also presented us with an opportunity to design a research methodology to answer the following fundamental question: What percentage of freshmen at each of Tennessee’s community colleges and public universities are academically ready for college? The research approach we took to answer this question rested on two pillars: the college readiness criteria at each higher education institution and ACT scores.
College readiness criteria were uniform across all the state's community colleges and most of its public universities because these institutions were all governed by the same higher education governing body, the Tennessee Board of Regents, which used the same criteria for each institution. This was not the case, though, for the state's other higher education governing body, the University of Tennessee system. Each of the three university campuses in the UT system set its own criteria, so we worked with the academic staff at these campuses to determine where each of them set the bar to indicate readiness for college-level courses.
The exhibit below provides the college readiness thresholds for the Tennessee Board of Regents schools and the three University of Tennessee schools, as well as the college readiness benchmarks set by ACT.
[Exhibit: college readiness thresholds by college course or area, showing the corresponding ACT subject-area test, the ACT college readiness benchmarks, the Tennessee Board of Regents criteria, and the placement criteria for UT Knoxville, UT Chattanooga, and UT Martin.]
(a) UT Knoxville has indicated that 25 is the minimum ACT math subject score needed for placement in calculus.
(b) NA indicates not applicable, i.e., no criteria exist at those institutions.
We were now ready for the second stage in our methodology: obtaining and analyzing the ACT scores of first-time freshmen at each institution. We requested and received from the Tennessee Higher Education Commission the ACT scores for the fall cohort of first-time freshmen at each institution for each of the past five years. (In the end, we used only the fall 2014 cohort data for this report, given the sheer amount of information involved, available staff, and our timeline.)
A number of steps were necessary to rigorously sift through this ACT dataset before we could use it for our purposes. First, we cleaned the data by excluding students who lacked ACT subject-area scores (e.g., math, English, or reading). Even if these students had a composite ACT score, they were removed from the dataset because Tennessee’s public higher education institutions use only subject-area ACT scores for remediation placement. At the conclusion of this process, a high percentage of the total cohort remained, ranging from 79 percent for community colleges to 99 percent for the University of Tennessee system. Then, we used Excel functions to identify each student’s need for remediation by coding the student’s subject-area scores against their institution’s college readiness criteria. This allowed us to estimate the number and percentage of students at each institution whose scores indicated a need for remediation in each subject area.
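In spirit, the coding step boils down to comparing each student’s subject-area scores with his or her institution’s cutoffs and flagging any subject below the bar. Here is a minimal Python sketch of that logic (OREA did this work with Excel functions; the schools, thresholds, and scores below are invented for illustration):

```python
# Hypothetical readiness thresholds per institution; the real criteria came
# from the Tennessee Board of Regents and the UT campuses.
criteria = {
    "Example State CC": {"math": 19, "english": 18, "reading": 19},
    "Example University": {"math": 22, "english": 21, "reading": 21},
}

# Hypothetical student records; students missing any subject-area score
# would already have been excluded in the cleaning step.
students = [
    {"school": "Example State CC", "math": 17, "english": 19, "reading": 20},
    {"school": "Example State CC", "math": 21, "english": 20, "reading": 21},
    {"school": "Example University", "math": 23, "english": 22, "reading": 25},
]

def needs_remediation(student):
    """Flag each subject where the score falls below the school's cutoff."""
    cutoffs = criteria[student["school"]]
    return {subj: student[subj] < cutoff for subj, cutoff in cutoffs.items()}

# A student is "not college ready" if below the cutoff in any subject.
flags = [needs_remediation(s) for s in students]
not_ready = sum(any(f.values()) for f in flags)
print(f"{not_ready} of {len(students)} freshmen below threshold in a subject")
```

Aggregating these flags by institution yields the counts and percentages of students needing remediation in each subject area.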
With the cleaning and coding completed, we were ready to produce a college readiness percentage for each public higher education institution in Tennessee.
OREA’s report listed the percentage of students who did not meet each institution's threshold for the three ACT subject areas of math, English, and reading, as well as an overall college readiness percentage based on the percentage of students below the threshold for any of the three subject areas. Here is a portion of a key exhibit from the report with the college readiness percentages for Tennessee’s community colleges.
Percentage of first-time freshmen not meeting college readiness criteria at Tennessee’s Community Colleges, 2014
[Table: for each college, the number of first-time freshmen and the percentage not meeting college readiness criteria. Colleges listed: Chattanooga State, Cleveland State, Columbia State, Dyersburg State, Jackson State, Motlow State, Nashville State, Northeast State, Pellissippi State, Roane State, Southwest Tennessee, Volunteer State, and Walters State community colleges, plus a total for all TBR community colleges.]
Our report also describes recent changes made to postsecondary remediation and outlines a few key progress measures, such as the percentage of students assigned to remediation that:
The heart of the report, however, was the comprehensive statewide college readiness data we produced for each public higher education institution in Tennessee. This information served as a proxy measure for the need for remediation among Tennessee public college freshmen. In addition to filling a fundamental gap in the college readiness and remediation data, our research methodology better positioned OREA to track these trends going forward on behalf of Tennessee’s state legislature.
Russell Moore is director of OREA. He can be reached at Russell.Moore@cot.tn.gov.
The following article appeared in the February 1999 issue of the NLPES News. The article was an interview of three program evaluators who had migrated to “the other side”—i.e., executive branch agencies. Based on their experiences in both branches of government—legislative and executive—the evaluators offered the following insights.
Recommendations Are Easier to Write Than to Implement. It’s exciting to identify a complex problem, but a good, workable recommendation that specifies how to fix it is likely to prove elusive, given the complexity of possible solutions (and evaluators’ lack of expertise in the subject). “Fix it” is not a good recommendation. If it were that easy, it would have been fixed already.
Focus on the Real World. Many in program evaluation have never worked in an executive agency and have no idea how difficult it can be to get the job done “efficiently and effectively.” There is a theory of how things should work and the reality of what it takes to make things work on a day-to-day basis. Aim for the ideal, but make recommendations based on a cultivated sense of the “real world.”
Relevancy Matters. “Too little,” “too late,” and “too negative” are oft-heard phrases about evaluations. Most people want to improve, but they don’t feel communication with legislative program evaluators is a two-way street. They give, give, and give information into a vacuum, and then a report (usually negative) spits back out at them months later. More immediate and more direct feedback is preferred, as well as more useful and relevant recommendations rather than pie-in-the-sky ideas.
Show Some Appreciation. Trying to implement program suggestions and audit findings is much more difficult than you imagine. Agencies are trying to do a lot of things and you might be pleasantly surprised at how interesting some of them are.
Sharpen Your Perspective. The executive branch is neither a “united front” nor one big happy family. Information provided from various sources during the course of an evaluation will conflict and it should. All sorts of internal agency politics need to be kept in mind during the course of an evaluation.
Bad Apples (Don’t) Break Programs. Recruiting and retaining good managers on state government salaries is difficult. Even when an agency is lucky enough to keep outstanding managers, it takes only a couple of bad ones to detract from the agency’s accomplishments.
About Legislative Intent. What happens in the capitol is often not understood within the executive branch. It is not that legislative intent is ignored, but elements can be lost in translation. Usually, willful misinterpretation is much less at fault than human error. Versions of legislative intent differ wildly and are not necessarily at the forefront of the minds of people who are implementing programs.
This is the sixth in a series of ‘oldie but goodie’ articles culled from past issues of the NLPES newsletter.
Every Student Succeeds Act | Russell Moore (Tennessee)
We all know that state legislators are busy. Between committee hearings, meetings with lobbyists, talks with constituents, and floor sessions, they don’t always have time to read the latest policy reports. So, the Office of Research and Education Accountability (OREA)—the research, evaluation, and analysis unit housed in the Tennessee Comptroller’s Office—faced an even bigger challenge than usual last December.
We wanted Tennessee’s legislature to have all of the important facts about the new, 400-page federal education law, the Every Student Succeeds Act (ESSA). We also wanted to explain ESSA’s predecessor, the 700-page No Child Left Behind Act (NCLB), and the state’s two NCLB waivers, both of which were challenging enough on their own to explain in concise language. But how were we going to cram 1,000+ pages of information into one document that would stand a chance of gaining the attention of time-pressed legislators?
We decided we needed an innovative format. Rather than only offering our ESSA analysis as a lengthy (50+ pages) PDF document on our website, we created an online, “choose your own adventure” policy portal.
We adopted what might be called an “iceberg approach” to organizing the policy portal so that readers determine how much (or how little) they’d like to learn about the topic; it’s up to them. Legislators who don’t focus on education—or who are running to a meeting—can read just the “tip of the iceberg,” a few sentences outlining the highlights of each issue, all on one page. But those legislators and stakeholders who focus on education can click on “Read More” to get their feet wet with a more in-depth summary of each topic. And education fanatics can strap on their scuba gear and dive right to the bottom of the iceberg to reach the nitty gritty of each section, complete with citations to the law itself.
We also split our ESSA policy portal into six key, distinct categories of education law: accountability, testing, intervention, teachers, funding, and innovation. By doing so, readers can go straight to the category they’re interested in. Number crunchers can dive into the dollars and cents without reading the details about school performance, and those following the testing debate can read all about the new testing requirements without slogging through a history of teacher qualifications.
By organizing information from most general to most specific and using a limited number of key categories, we aimed to make our analysis of this complex federal education law as easy to navigate and digest as possible. We hope to keep the interest of harried legislators long enough for them to absorb the important points, while still providing a detailed resource for stakeholders and policymakers.
The ESSA policy portal also represents our first union of multiple formats: we’ve combined text, graphs, charts, and infographics that explain complicated concepts visually. In the near future, we hope to replace our static graphs with interactive, clickable charts to provide even more information about school funding and trends in a visually appealing format that holds the reader’s attention.
In the past, OREA has published downloadable PDF reports on our website (the ESSA analysis will also be available as a PDF for those who prefer a 50+ page document, and some readers do), but we deliberately produced our ESSA project in HTML in recognition of the increasing use of mobile devices by legislators and legislative staff.
By presenting the ESSA analysis in HTML, adopting the “iceberg method” of organization, and integrating text and graphics, we now have a new method of communicating our work to busy legislators. The response to our ESSA policy portal has been positive so far, and we intend to use this format for future OREA research, evaluation, and analysis projects.
OREA’s policy portal can be accessed here.
Russell Moore is director of OREA. He can be reached at Russell.Moore@cot.tn.gov.
Whiteboard Sessions | Valerie Whitener (Washington)
Have you ever become so immersed in the details of your study that you lost sight of the bigger picture? Has a colleague ever suggested an alternative study approach or potential data source that was really helpful … but you received it after you already completed the report?
JLARC study teams address these potential pitfalls with “whiteboard sessions.” These sessions give analysts an opportunity to get advice and ideas from peers and management during the study process. Whiteboard sessions are a key piece of our office’s quality control process and are consistent with a key office principle that studies are products of the entire office.
Timing, purpose, and approaches vary:
Timing: Teams have the flexibility to schedule one or more sessions as needed. In general, however, teams hold whiteboard sessions during preliminary scoping and planning, or near the end of field work.
Purpose: Some teams use whiteboards to help narrow a study’s scope, identify themes within their initial findings, or determine the best way to tell the audit story. Sessions tend to be most successful when the audit team clearly identifies the purpose and questions for their colleagues to explore at the whiteboard. That said, the best input can come from where the team least expects it. On a recent study, a simple hand-drawn process map sparked a whiteboard discussion that became the basis for a study message comparing the agency process to best practices.
Approach: Some teams literally use a whiteboard for group brainstorming, while others use it to test presentations or generate discussion. Regardless, the focus is on having a respectful, supportive, and inquisitive discussion in an informal setting. Analysts have learned that peers are more likely to fully participate when the team provides snacks!
Analysts find these sessions to be a useful complement to their supervisor’s more formal review process. With careful planning and sufficient snacks, we think you’ll find them useful too.
Valerie Whitener is an audit coordinator at JLARC. She can be reached at firstname.lastname@example.org.
As summer approaches, please take a moment to mark your calendars for the 2016 NLPES Professional Development Seminar. From Monday, Sept. 26, through Wednesday, Sept. 28, join us for a “Deep Dive in the Deep South: Exploring State of the Art Tools of the Trade” in Jackson, Mississippi’s capital city. Legislative program evaluation and audit professionals from across the country will convene at the beautifully restored King Edward Hotel, now operating as the Hilton Garden Inn Jackson Downtown, a historic landmark that opened in 1923 in downtown Jackson.
Hosted by NLPES and the Mississippi PEER Committee staff, the seminar will provide staff of all experience levels with new and useful skills through the following three session tracks:
The seminar session tracks are designed to qualify for 16 hours of CPE credit.
Networking activities throughout the seminar will give evaluation and audit professionals the opportunity to keep learning outside the session rooms. Situated in the heart of Mississippi, the City with Soul represents the best of Mississippi’s culture and blues tradition. Jackson is where Southern charm meets city life. With a population under 200,000, Jackson is small enough to feel like a close-knit community, but large enough to host abundant choices in entertainment, dining, shopping, and more.
Registration is open! In addition, information is posted on the NLPES website. Enjoy the rest of summer and then join us in Jackson in September. It promises to be a great year to take a Deep Dive in the Deep South.
Two long-serving senior staff left Washington JLARC in the first months of 2016 for the greener pastures of retirement.
Georgia’s Performance Audit Division is pleased to announce the addition of three talented new staff members. Rhett Garland joins us from Florida, where he spent 10 years as an economist analyzing a range of labor market and workforce development topics. Kristopher Bonnejonne and Sam Johnson are recent graduates of Georgia Tech’s International Affairs program and the University of Georgia’s MPA program, respectively. Both have experience in research and analysis. We look forward to the many positive contributions they will make!
Please let us know if you have staff happenings to share! E-mail email@example.com.
Judging by the press coverage their reports received, our member offices were certainly busy during the first few months of 2016. Way to go!
The following summaries describe but a few of the reports the media covered from January 2016 through May 2016 (plus or minus a day or so on either end): potential misspending of cruise ship tax revenues, weak reading improvement program, expanding pay gap between male and female county workers, vets home abuses not investigated, skyrocketing Medicaid costs, replacing old computer systems, overpaid vendors, and ignoring water pollution rules.
Share your coverage with us! If you would like us to highlight media attention about one of your reports in our next newsletter, send the hyperlinks to Shunti Taylor at firstname.lastname@example.org.
Audit says some Alaska towns need to track cruise ship passenger taxes better
April 15, 2016 – Alaska Dispatch News – Some Alaska communities have spent money raised from a cruise ship passenger tax without proper documentation showing it was used in accordance with state law. That's one of the main findings of an audit released Thursday by the state Division of Legislative Audit.
Auditor General: Arizona's Move On When Reading program lacks structure, oversight
March 30, 2016 – AZ Central – The reading program intended to help struggling Arizona students become better readers by the time they reach fourth grade lacks structure and oversight, according to state auditors.
Office of the Auditor General [Click for full report]
Pay gap growing between men and women working for L.A. County
May 31, 2016 – Los Angeles Times – A state audit released Tuesday found a significant – and growing – gap in the average pay for men and women working for several large California counties, including Los Angeles. Women in Los Angeles County’s workforce made on average 76% of what their male counterparts made last year, down from 80% in the 2011 fiscal year. The audit by the California State Auditor also looked at the pay gap in Orange, Fresno and Santa Clara counties between the 2011 and 2015 fiscal years.
State Auditor [Click for full report]
Audit: Ga. Regents unclear on whether tuition waivers achieving goals
May 18, 2016 – AJC.com – The state’s Board of Regents has made some improvements in handling out-of-state tuition waivers for Georgia’s public colleges since a state audit reviewed its policies in 2013. Despite those improvements, auditors say the Board has not determined whether those waivers — which offer thousands of dollars in tuition savings for some students — are achieving their intended purpose.
Department of Audits and Accounts, Performance Audit Division [Click for full report]
Audit: Hawaiian Home Lands Improving
April 29, 2016 – Honolulu Civil Beat – The Department of Hawaiian Home Lands has taken steps towards acting on 15 of the 20 recommendations made by the state auditor in 2013, but has not actually implemented any of them, according to a follow-up report released Friday.
Office of the Auditor [Click for full report]
Idaho Medicaid-Funded PSR Costs Skyrocketed Ninefold in 10 Years
Jan. 20, 2016 – Boise Weekly – The official title is Psychosocial Rehabilitation Counselor, but they're better known as PSR workers. They work with clients diagnosed with mental health or emotional disorders, but instead of clinical or inpatient treatment, they focus on their clients' well-being in social situations such as work, school or while shopping. However, according to a new report from the Idaho Legislature's Office of Performance Evaluations, some Idaho caregivers are accused of "overreliance on or misuse of PSR," and it is a "major concern" for the Idaho Department of Health and Welfare. A decade-long analysis of Medicaid-funded care reveals Idaho costs for PSR increased ninefold—from $8.3 million in 2001 to $76.1 million in 2012.
Office of Performance Evaluations [Click for full report]
Audit Report Blasts Management Of Anti-Violence Programs
April 19, 2016 – NPR / Illinois – Illinois' Auditor General Frank Mautino said millions of dollars meant for state anti-violence programs are unaccounted for. Gov. Bruce Rauner's administration said the problem happened under former Gov. Pat Quinn and the state plans to recover the money. The anti-violence initiative started in 2010 to curb gun violence in Chicago. But there was little oversight of the roughly $94 million in grant money. The critical audit covered the program funding through the 2014 budget year.
Auditor General [Click for full report]
Legislative Auditor calls for better oversight of state's work release programs
April 18, 2016 – WWLTV.com – The Louisiana Legislative Auditor is urging the state Department of Corrections to increase oversight of work release programs in a new report released Monday. The release comes on the fifth anniversary of the death of inmate Jonathan Dore of a heroin overdose in a nondescript trailer behind a convenience store in Madisonville. Dore was serving part of his sentence in one of St. Tammany Parish's two work release programs at the time.
Legislative Auditor [Click for full report]
Watchdog: Riverview failed to adequately track staff behavior
April 8, 2016 – BDN Maine – A report released Friday by the Maine Legislature’s watchdog group found that Riverview Psychiatric Center hasn’t tracked staff behaviors that could undermine safety at the facility. It was among minor problems flagged at the hospital, which was decertified by the federal government in 2013 after violations including restraint and use of a stun gun on patients and has been struggling with staffing issues that a retired judge overseeing the state’s mental health system has called potentially dangerous.
Office of Program Evaluation and Government Accountability [Click for full report]
Audit: Veterans home did not properly investigate abuse
Feb. 19, 2016 – The Detroit News – State legislators plan to hold investigative hearings on “a troubling pattern of mismanagement” at the Grand Rapids Home for Veterans, which a new state audit finds failed to properly investigate allegations of abuse and neglect. A report from the Office of the Auditor General revealed troubling issues at the state-run veterans’ home, a partially privatized facility where state and contract workers care for more than 430 residents.
Office of the Auditor General [Click for full report]
Audit: Minnesota DNR should develop deer plan
May 28, 2016 – Duluth News Tribune – Management of white-tailed deer by the Minnesota Department of Natural Resources is sound, according to a report released Thursday by the Minnesota Office of the Legislative Auditor. But the audit also calls upon the agency to develop a formal deer management plan and improve its resources for estimating deer populations, including more field research.
Office of the Legislative Auditor [Click for full report]
PEER scrutinizes state credit card use
Jan. 19, 2016 – The Clarion-Ledger – A Performance Evaluation and Expenditure Review Committee report says the watchdog agency found that purchasing guidelines were not followed 23 percent of the time in a check of purchases made with state-issued procurement cards/credit cards at three state agencies. The credit cards are issued to public employees to make purchases on behalf of the state. More than 28,000 cards have been issued to state employees, according to PEER.
Joint Committee on Performance Evaluation and Expenditure Review [Click for full report]
Audit finds MSU, UM run on smaller staffs, leaner overhead
March 6, 2016 – Bozeman Daily Chronicle – Do Montana’s two biggest universities have too many employees and spend too much on administrative overhead? The answer is no, according to the state’s Legislative Audit Division. Montana’s two flagship schools “compare favorably to their peers in basic measures of administrative efficiency,” the auditors concluded.
Legislative Audit Division [Click for full report]
Charter schools cost more for similar results, report says
Jan. 18, 2016 – Albuquerque Journal – Rapidly expanding charter schools in New Mexico are spending more per student with academic results similar to those of traditional public schools, state program analysts told lawmakers on Monday. The evaluation of six selected schools out of 97 in the state by staff at New Mexico’s Legislative Finance Committee warned that charter schools are diluting the amount of funds available at all schools as charter schools continue to be authorized independently of the state’s budget process.
Legislative Finance Committee [Click for full report]
NC process in awarding private contracts found lacking
Jan. 12, 2016 – The News & Observer – North Carolina’s routine in awarding private contracts doesn’t ensure taxpayers are getting the best for their money, says a new report that is prompting a call for legislative tightening. Agencies aren’t considering all competition before awarding such contracts, aren’t documenting the reasons for using private providers, aren’t charting the results and don’t consistently word their agreements to ensure the contractor delivers a quality performance, said the 51-page report from the Program Evaluation Division.
Program Evaluation Division [Click for full report]
Audit Says Oregon Employment Department Computer Systems Should Be Replaced
Dec. 31, 2015 – Willamette Week – A new state audit says the Oregon Employment Department needs to replace the computer systems that process unemployment benefits and taxes. The audit focuses on two systems, the Oregon Benefit Information System (OBIS), which processes unemployment benefits, and the Oregon Automated Tax System (OATS), which deals with unemployment tax reports from employers. The audits division says OBIS and OATS are "inflexible, poorly documented, and difficult to maintain," and that their programming language is "outdated." Both programs have been in use since the early 1990s.
Secretary of State Audits Division [Click for full report]
Some vets face waiting lists at state-run medical facilities
May 19, 2016 – witf – There's a waiting list for beds at some state-run veterans medical facilities – but also more than 150 vacant beds at others, according to a Legislative Budget and Finance Committee report. There are six state-run veterans medical facilities in Pennsylvania, and Phil Durgan, Executive Director of the Legislative Budget and Finance Committee, says there are 163 vacant beds, and a waiting list of 99 veterans.
Legislative Budget and Finance Committee [Click for full report]
Audit criticizes how infrastructure bank shared road funds
May 27, 2016 – Independent Mail – A recent audit of a South Carolina agency offered harsh criticism. The State Transportation Infrastructure Bank awarded funds to projects without an application, used no formal policies to award funds and failed to publicly announce the availability of funds, according to a Legislative Audit Council report. The council even suggested that the agency merge with the state Department of Transportation.
Legislative Audit Council [Click for full report]
Study finds many Tennessee freshmen not prepared for college
Jan. 27, 2016 – The Commercial Appeal – While Tennessee has reduced the percentage of its public college freshmen needing remedial help, large numbers of high school graduates still arrive inadequately prepared for college work, a report released Wednesday by the state comptroller's office concludes.
Comptroller of the Treasury, Offices of Research and Education Accountability [Click for full report]
Audit Finds DPS Overpaid Vendors
May 6, 2016 – WBAP – An audit of the Texas Department of Public Safety said the agency may be overpaying some of its vendors. The State Auditor’s Office reviewed contracts the department has with its vendors. The report said DPS overpaid U.S. Bank $500,000 to provide employees with cards to pay for gas.
State Auditor's Office [Click for full report]
Audit: State doing poor job tracking job training and education cases
May 11, 2016 – The Salt Lake Tribune – The state agency that helps disabled Utahns get job training or an education is doing a poor job of managing and tracking its cases, according to a new audit released Wednesday. The report by the Utah legislative auditor general reviewed a sample of vocational rehabilitation cases and expressed concerns about the documentation for 77 percent of those reviewed. The auditor also said there is inadequate quality control or case review within the program. As a result, auditors said it is difficult to measure the effectiveness of the program, and in some cases clients may remain on the program longer than necessary, increasing the cost to taxpayers.
Office of the Legislative Auditor General [Click for full report]
State audit finds DNR ignoring own rules on water pollution
June 2, 2016 – Wisconsin State Journal – Wisconsin’s water quality regulators failed to follow their own policies on enforcement against polluters more than 94 percent of the time over the last decade, the state’s nonpartisan Legislative Audit Bureau said in a report released Friday. From 2005 to 2015, there was a general decline in state Department of Natural Resources enforcement activity to protect lakes, streams and groundwater from large livestock farms, factories and sewage treatment plants that discharge liquid waste, according to the bureau’s 124-page report.
Legislative Audit Bureau [Click for full report]
The NLPES Executive Committee held its annual spring meeting in Jackson, Miss., the site of the upcoming Professional Development Seminar.
The committee received reports on activities of the various subcommittees, including awards, communications, and professional development. Minutes of the spring meeting have been approved and are available online.
NLPES website—Spend a few moments touring our NLPES website to learn more about NLPES and see what we do. You’ll find general information about NLPES, including our by-laws, executive committee membership and subcommittees, state contacts, awards, and information on peer reviews. We also have a training library and resources including past meeting minutes, newsletters, and more. Check it out!
NLPES listserv—The NLPES listserv is an email discussion group for NLPES members. By sending a message to firstname.lastname@example.org, you can reach all listserv subscribers simultaneously. Listserv members can query other states about evaluation work similar to their own projects, receive announcements about performance evaluation reports and job opportunities from other states, and be notified when the latest edition of this newsletter is available! To join the listserv, send an email to Brenda Erickson, NCSL liaison to NLPES, with the subject “SUBSCRIBE to NLPES Listserv.” Include your name, job title, audit agency/organization name, mailing address (including city, state, zip code), phone number and email address. A “Welcome” message will be sent to you once you are successfully added to the listserv. See the listserv link on the NLPES website for additional information on how to post messages and “netiquette” niceties, like not hitting “Reply All” when answering a query posted to the listserv.
Are you receiving our listserv emails? Some states’ systems block NLPES listserv emails. If you think you are not receiving our emails, please check your state’s security system and spam filters, and/or contact Brenda Erickson (email@example.com).
Legislative careers website—Know someone thinking about pursuing a career with a state legislature? Point them to the opportunities posted on NCSL’s legislative careers website. Job seekers can explore the types of work legislative staffers perform, including performance evaluation, budgeting, fiscal analysis, legal and policy research and opinions, bill drafting, public relations, library services, building security, and information technology support. States post open positions under Legislative Jobs. Attracting young people to work as legislative staff will be increasingly important in the coming years: even though baby boomers make up only about a third of the national workforce, nearly half of legislative staff responding to a survey were 50 years old or older, and replacing those staffers will present challenges. Check out the welcome video, “A Day at the Capitol,” and learn more about the opportunities and rewards of working for state legislatures. Watch the videos under Career Paths to hear from staffers in different states describing their jobs and the satisfaction they gain from a public service career.
NLPES’ Professional Development Resources—Visit our NLPES online training library for a variety of refresher and training materials! There are nearly two dozen resources on planning and scoping, fieldwork, writing and publication, and management topics. Most are PowerPoint slides; some are narrated; a few are webinars or podcasts. Check them out!
JLARC presentation on web reporting—In 2014, the state of Washington’s Joint Legislative Audit and Review Committee (JLARC) began issuing audit reports as web pages rather than PDF documents. Reports are now readily accessible on computers, mobile devices, and tablets. JLARC has received favorable comments on the change from legislators, legislative and executive branch staff, and the public. Writing for the web requires a change in perspective, in addition to changes in technology. Visit JLARC’s website to see samples of their web reports. Contact Valerie Whitener, Audit Coordinator, JLARC, firstname.lastname@example.org, with questions.
Ask GAO Live—AskGAOLive is a 30-minute live web chat in which GAO staff discuss a specific report and its underlying research and answer questions submitted by email or Twitter. Sessions are recorded and archived on the website. You can also “follow” GAOLive to receive advance notice of chat sessions. Topics have included veterans and higher education, prescription drug shortages, prison overcrowding, the state and local fiscal outlook, and government contracting.
Ensuring the Public Trust—What’s the most common internal performance measure for evaluation shops? How many offices tweet? What percentage of staff has fewer than 10 years of experience? How can you contact a sister office in another state? "Ensuring the Public Trust" summarizes information about legislative offices conducting program evaluations, policy analyses, and performance audits across the country.
The Working Paper is published three times a year by the National Legislative Program Evaluation Society, a staff section of the National Conference of State Legislatures.
The Working Paper is produced by the NLPES Communications Subcommittee.
Dale Carlson, 2015-2016 chair (CA)
Shunti Taylor, newsletter editor (GA)
NCSL Liaison to NLPES
Brenda Erickson, 303-856-1391
NCSL Denver Office • 303-364-7700