The Working Paper is the official newsletter of the National Legislative Program Evaluation Society. NLPES serves the professionals of state legislative agencies engaged in government program evaluation. The purposes of NLPES are to promote the art and science of legislative program evaluation; to enhance professionalism and training in legislative program evaluation; and to promote the exchange of ideas and information about legislative program evaluation.
Lisa Kieffer (Georgia)
As I write this, I have to remind myself that it is actually spring, even though the temperature gauge in my car said 34 degrees on my way to work this morning. I know some of the rest of you have had a similar spring—or a lot colder! But while the weather may not reflect the springiness of the season, your Executive Committee (EC) is echoing its spirit, bursting with energy and taking on new projects.
We met in Raleigh in April to preview the location for the 2014 Professional Development Seminar (PDS). Our North Carolina hosts are doing a great job of planning sessions that will be interesting and interactive. The PDS will be held in the heart of downtown, with plenty of options for networking with the folks you will meet during your stay. As planning continues, North Carolina and the Executive Committee will be seeking your input and assistance regarding individual sessions; please be an active participant in this process. Your ideas and participation are what make the PDS successful.
Our spring EC meeting marks the halfway point in our NLPES year. Because of that, we looked back to see what we needed to do to wrap up some current endeavors, and forward to identify new areas that require our focus.
We continue to work on providing relevant, quality training through our website. Currently, we are hard at work on several new additions to our training portal. Florida’s OPPAGA has kindly offered content on developing evaluation plans, which will be added to our Training Products Matrix webpage under Planning and Scoping soon—check it out. And our Professional Development Subcommittee is also developing a webinar on scoping and planning, so keep an eye out for that as well. Finally, Stan Stenerson’s writing webinar is available on our website again. As some of you already know, this training is a valuable resource, providing a framework for drafting reports and sending clear and concise messages. We will keep you updated as additional content is added to our training matrix webpage.
Another opportunity for training and networking is NCSL’s Legislative Summit. Hosted by Minnesota this year, the Summit will be held from Aug. 19-22 in Minneapolis, Minn. The Executive Committee is exploring opportunities for sponsoring or co-sponsoring several sessions targeted to NLPES’ interests and needs. Stay tuned for more information on that topic.
We’re also in awards season. Please consider applying for one or more of the NLPES awards: Certificate of Impact, Research Methods, and Excellence in Evaluation. Our offices produce important and influential products—often using unique and inventive research methodologies—that change and shape the way our state governments operate. Share your successes with the award committee!
Finally, the Executive Committee is taking on some new projects, such as looking at improving our peer review services. If you’ve participated in an NLPES Peer Review, you know how valuable they are—for both the entity being reviewed and the reviewers. Serving as a reviewer offers an unparalleled opportunity for professional development; you work with peers to delve into the inner workings of a member office and understand how they have interpreted and implemented the standards they seek to employ. As evaluators and performance auditors, you know about assessing effectiveness and efficiency of operations; this is what the Executive Committee will be working on in relation to providing peer reviews. Specifically, we will seek to streamline some administrative processes for peer reviews while maintaining the flexibility and uniqueness that makes an NLPES Peer Review so effective.
That’s what we’ve been up to since I wrote to you last! Thank you for your continued interest and involvement in NLPES and I look forward to seeing you at the PDS in Raleigh, N.C., Oct. 5-8, 2014.
Lisa Kieffer is the NLPES Executive Committee Chair for 2013–2014. She can be reached at Kiefferl@audits.ga.gov.
Angus Maciver (Montana)
Happy Spring, auditors and evaluators! Welcome to another exciting edition of Report Radar, your source for the latest and greatest in audit and evaluation reporting from our nation’s state legislatures. Following our last edition of The Working Paper, we received some rather pointed criticism about the lack of humor in the Report Radar column. We have therefore ensured this edition contains numerous opportunities for witty commentary by including sections on state Medicaid programs, education, crime and punishment, disaster and emergency management, and life and death. Might this be too much fun? Well, maybe, but let’s continue anyway.
Medicaid is a perennial favorite for Report Radar. In this edition we highlight no fewer than six reports addressing different aspects of the program: In November of last year, Virginia released a report assessing the impacts of Medicaid rates on access to health care, including provider payment policies and how these affect access to services for Medicaid enrollees. In January of this year, two states released reports looking at Medicaid costs: Connecticut focused on how use of hospital emergency rooms impacts Medicaid budgets, and Hawaii compared its Medicaid costs to other states’ and identified issues related to fraud detection and enforcement activities. Medicaid program integrity was also the focus of another January report, from Florida, which addressed recovery of overpayments for managed care plans. Implementation of the federal Affordable Care Act (ACA) is also drawing renewed attention to state Medicaid programs: Arkansas released a report in January taking an initial look at the state’s expansion of Medicaid under the ACA; and Maryland reported in April on preparations made for the launch of the state’s exchange established as part of the ACA. As more of us are tasked with reviewing state exchanges or Medicaid expansions under the ACA, both these reports could be useful starting points.
Turning now to education, we have a number of recent reports addressing a variety of issues. Mississippi released a report on the adoption and implementation of the new common core standards (something more of us may be dealing with soon). Colorado addressed school meal programs; Pennsylvania reported on special education services for gifted students; and Tennessee published its findings on the effects of extended learning times (longer school days versus more school days). If you are interested in a broader look at educational administration and funding, you would do well to read North Carolina’s April report identifying cost savings in administrative program monitoring. You could also read Utah’s February report reviewing best practices in the administration and management of the state’s school districts.
Report Radar may not be as readable as Dostoyevsky, but we can still talk Crime and Punishment with the best of them. For example, in March Idaho released a report that assessed the financial costs of the death penalty, and in February Louisiana reported on the oversight of capital cases by public defenders. Other reports on the justice system that might interest you include a February release from Minnesota, which looked at the delivery and cost of health care in state correctional facilities, and a November report from New Hampshire addressing transitional housing and work release programs for the recently incarcerated.
Government responses to disasters and emergencies are often subject to retrospective scrutiny. Our member offices are frequently called on to audit or evaluate agencies with responsibilities in this area. Report Radar highlights three examples of this work, the first of which comes from our friends in Alaska, who reported in October on the state’s land mobile radio communications system. Interoperable communications have been a big focus of improvements in disaster response capabilities, and this report will be of interest to any of you planning work in this area. Other offerings come from Missouri, which released a report in December on the state’s efforts to improve training on handling incidents involving hazardous materials, and Nevada, which published a January performance audit of the state’s Division of Emergency Management.
Let us now turn our attention to issues of life and death—specifically, the systems and processes states use to record the only two events none of us can avoid. In the two minutes you have just spent reading Report Radar, approximately 15 new Americans entered the world, and approximately 10 departed. Each of these events is supposed to be recorded in one way or another in state-administered systems and officially certified. If you want to know how these systems work (or don’t work), you had better read January reports from both Montana and Michigan, which address issues relating to the management of electronic birth and death records and the information systems used to maintain these data.
That’s it for this edition. Report Radar hopes you had fun (but not too much) reading all about these wonderful audits and evaluations. Keep up the good work and see you next time.
Angus Maciver is the Deputy Legislative Auditor for the Montana Legislative Audit Division. He can be complimented or heckled (your choice) at email@example.com.
Easy Enough, Right? Finding a Good Fit at the Joint Committee on Performance Evaluation and Expenditure Review (PEER)
Max Arinder and James Barber (Mississippi)
Following our recent reorganization, in which we shifted three senior staffers to a new performance budgeting unit and filled a vacant position, we welcomed on board four new analysts.
MeriClare Steelman came to us with a BS in business administration and an MBA with an emphasis in accounting; Jenell Ward has a BS in nutrition and an MPH with an emphasis in health policy and administration; Ray Wright comes to us with a BSBA in accounting and an MBA with an emphasis in finance; and Sarah Williamson has a BA in English and philosophy and a JD. Only one of the four had state government experience—and that was not in auditing or evaluation.
To help us with future recruitments, we decided to learn whether we had accurately portrayed in the interview process an analyst’s duties and the qualities needed to work at PEER. To do so, we asked our newbies about their initial expectations of the job, what they have learned from a “reality check” of a few months on board, and what intangible qualities they now know are needed to work as a PEER analyst.
Initial expectations: Two of the four had expected an analyst would need a wide range of skills, including analytic and communication skills. One expected a “fluid work environment” and one expected that education and work experience would be the major influence in that person’s work on PEER projects.
Post-“reality check”: After a few months with us, two analysts said that the duties and skills of the position were similar to their initial expectations. One said that although education and experience were good starting points, the skills developed along the way in academia and in previous positions were more critical to success as an analyst. One said, “When I was first hired, I did not understand the importance of the work that I do at PEER. Realizing how much the Legislature relies on our information and the shape that work has on Mississippi policy has been a very rewarding realization for me.”
Advice hereafter: Based on their first few months, our new analysts reported that the intangible qualities needed to work as PEER analysts are:
This exercise was indeed reflective for our new analysts, but it really opened our eyes as managers to what we can add to the interview process to help ensure incoming staff have realistic expectations of the job. As one analyst said regarding their initial expectations: “I knew I needed to be able to digest and relay information, feel comfortable with public speaking, and know how to turn on and operate a computer…easy enough, right?”
New Hampshire Office of the Legislative Budget Assistant, Audit Division
Stephen Fox (New Hampshire)
New Hampshire’s first Legislative Budget Assistant was appointed in 1947 to analyze the state’s financial condition and provide information to the Legislature. In 1969, the LBA’s duties were expanded to include post audits of the accounts and records of any State department, board, institution, commission, or agency. Authority to conduct financial and compliance audits of federally funded programs was granted in 1983, and in 1987 the Audit Division was formally established, with authority to conduct program results audits (performance audits).
Oversight of LBA: Performance audits are chosen by the joint Legislative Performance Audit and Oversight Committee (LPAOC), which consists of five senators and five representatives. Audit topics and final reports are approved by the joint Legislative Fiscal Committee (five senators, five representatives) before their release.
Audit types: We perform financial and performance audits according to GAO’s Government Auditing Standards and contract with independent CPA firms to conduct the state’s Comprehensive Annual Financial Report (CAFR), Single Audit of Federal Financial Assistance Programs, and 529 College Tuition Savings Plan. Our recent financial audits include the State Treasury, Turnpike System, and Lottery Commission. Our recent performance audits include the State Veterans’ Home, Community Development Finance Authority, Community Corrections, and Electronic Benefit Transfers. (Reports are available on our website.)
Staff: We have 25 staff, including the director, two audit supervisors, 21 audit staff (12 financial auditors and nine performance auditors), and one administrative assistant. Our financial audit staff have accounting backgrounds while our performance audit staff have public administration and social science backgrounds. Despite our state’s small size—we’re 46th in area, 42nd in population—we manage to attract staff from all over the country. We’ve had staff from Colorado, Pennsylvania, Virginia, Texas, Minnesota, Missouri, and Illinois as well as Japan, Mexico, and Canada. Not all our transplants stay; some go home, while some go even farther away, like to Washington or New Mexico!
What makes the Granite State special?
Natural beauty: New Hampshire has only about 13 to 18 miles of coastline (reports differ), but that’s more than our neighboring state, Vermont! Mt. Washington, which stands at 6,288 feet, is not the tallest mountain in the Appalachian chain, but from 1934–2010 it held the record for the highest recorded wind speed in the world, at 231 mph (it took a 254 mph cyclone in the south Pacific Ocean to beat this record). New Hampshire has 48 peaks above 4,000 feet, and “peak baggers” aim to climb all of them—sometimes more than once, or in unique ways, like all in winter, or all in one month. And of course, we have a lot of granite!
Elective representation to the max: The General Court, as the state’s Senate and House of Representatives are collectively known, is the third largest elective body in the English-speaking world. At 424 members, it is smaller than only the U.S. Congress and the British Parliament. Members receive a yearly salary of $100; presiding members receive $125 annually. And while the General Court is large, we have only two federal representatives. New Hampshire is currently the only state whose entire congressional delegation (two representatives and two senators) is female. The Governor and the Speaker of the House are also female.
Did you know?
Using Spans and Layers to Measure the Efficiency of Higher Education Support Functions
Drew Dickinson (Virginia)
The Joint Legislative Audit and Review Commission (JLARC) is currently reviewing the cost efficiency of Virginia’s public four-year higher education institutions through a series of studies. Our review of support functions—e.g., human resources, information technology, and procurement—at Virginia institutions will be released in October 2014.
A number of existing studies have found that higher education institutions could streamline their organizational structures to improve their support functions. Many of these studies used spans and layers analyses to assess the efficiency of their organizational structures. “Spans and layers” refers to the average span of control (i.e., number of employees who directly report to a single supervisor, known as “direct reports”) and the number of layers in an organization.
Spans and layers can have a substantial impact on efficiency. When managers have too few direct reports, an organization’s structure is compressed vertically—tall and narrow, with too many management layers. Processes tend to be overly redundant and too much time is spent on supervisory activities, such as communicating up and down a management chain. Tall, narrow organizations tend to have an excess number of employees with management-level salaries, which can increase costs and decrease efficiency.
On the other hand, an organization with fewer layers and wider spans emphasizes service delivery and high-value, productive activities (e.g., customer service or transaction processing) over lower-value supervisory activities (like requiring several approvals to act). A flat, wide structure offers front-line employees the opportunity to focus on high-value activities, which promotes more effective and efficient service delivery.
What we will do: We plan to perform spans and layers analyses for Virginia’s 15 public four-year institutions to assess whether their support functions are organized efficiently.
Using organizational charts to manually identify spans of control and number of layers is one way to perform these analyses; however, we ruled out this method because Virginia’s institutions are large organizations, and this effort would be very time consuming.
Instead, we will collect data from each institution’s human resources department, including each employee’s unique identifier, position title, and direct supervisor, as well as his or her salary and benefits information.
Span of control—Using Excel pivot tables or SAS frequency counts, we will identify each manager’s span of control by counting how many times that manager appears in other employees’ records as the direct supervisor.
Number of layers—This aspect is a bit more challenging. We will use SAS to identify each institution or department’s top manager (this is the first layer), then identify the number of employees that manager supervises. We will then determine which of those employees are also managers (second layer), and so on, until all layers have been identified. Finally, we will use managers’ salary and benefit information to determine the cost of each layer.
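The two steps above can be sketched in a few lines of code. This is an illustrative sketch only—our actual analyses will use Excel pivot tables and SAS, and the employee records below are hypothetical—but the logic is the same: count supervisor appearances to get spans, then walk down from the top manager to label layers.

```python
from collections import Counter, defaultdict, deque

# Hypothetical employee records: (employee_id, direct_supervisor_id).
# The top manager has no supervisor (None).
records = [
    ("E1", None),                              # top manager (layer 1)
    ("E2", "E1"), ("E3", "E1"), ("E4", "E1"),  # layer 2
    ("E5", "E2"), ("E6", "E2"),                # layer 3
    ("E7", "E5"),                              # layer 4
]

# Span of control: how often each employee appears as someone's supervisor.
spans = Counter(sup for _, sup in records if sup is not None)

# Number of layers: breadth-first walk from the top manager,
# labeling each successive level of direct reports.
reports = defaultdict(list)
for emp, sup in records:
    reports[sup].append(emp)

layer_of = {}
queue = deque((top, 1) for top in reports[None])
while queue:
    emp, layer = queue.popleft()
    layer_of[emp] = layer
    for direct_report in reports[emp]:
        queue.append((direct_report, layer + 1))

print(spans["E1"])             # E1 directly supervises 3 employees
print(max(layer_of.values()))  # this organization has 4 layers
```

With each employee’s layer identified, attaching salary and benefit data to `layer_of` gives the cost of each layer.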
Interpreting the results: There is no one-size-fits-all answer to an organization’s correct span of control or number of layers. The appropriate numbers of spans and layers depend on a variety of factors, such as the risk and complexity of a function, the geographic proximity of employees to a supervisor, and the size of the organization. The consulting firm that performed the majority of the studies we reviewed recommends that higher education institutions have a minimum of 6–7 direct reports per manager for expertise-based functions (e.g., human resources, information technology, and procurement services) and 11–13 direct reports for task-based functions (e.g., maintenance, groundskeeping, and custodial services).
Because there is no single right answer for the ideal number of spans and layers, we will compare Virginia’s institutions with benchmarks from existing studies and other literature to determine whether our institutions are organized efficiently. Each institution will be compared to other Virginia institutions with similar workloads, such as those within the same broad Carnegie classification (meaning whether it awards doctoral, master’s, and/or baccalaureate degrees). We will perform additional research on institutions that have either low average spans of control or high numbers of layers relative to other Virginia institutions or benchmarks. Such institutions are the most likely to have opportunities to streamline their organizational structures to improve their efficiency and effectiveness.
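A screen like the one described above can be as simple as flagging institutions whose average span falls below the benchmark floor. This sketch uses hypothetical institution names and averages; only the 6-direct-report floor for expertise-based functions comes from the benchmark cited above.

```python
# Benchmark floor for expertise-based functions (6-7 direct reports).
EXPERTISE_MIN_SPAN = 6

# Hypothetical average spans of control for expertise-based support
# functions at four institutions.
avg_spans = {
    "Institution A": 4.2,
    "Institution B": 6.8,
    "Institution C": 5.1,
    "Institution D": 7.3,
}

# Institutions below the floor are candidates for additional research.
flagged = sorted(name for name, span in avg_spans.items()
                 if span < EXPERTISE_MIN_SPAN)
print(flagged)  # ['Institution A', 'Institution C']
```

The same pattern works in reverse for layers: flag institutions whose layer counts are high relative to peer institutions or benchmarks.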
Drew Dickinson is a senior associate legislative analyst with Virginia’s Joint Legislative Audit and Review Commission (JLARC). He can be reached at firstname.lastname@example.org.
Executive Committee Elections
Karl Spock (Texas)
The annual election to fill four expiring terms and complete the NLPES Executive Committee’s roster for 2014–2015 is over! Executive Committee members each serve for three years, and terms are staggered so that four seats are up for election in any given year. Our election cycle begins in February, with a membership-wide call for nominations. Candidates submit biographical statements of no more than 200 words, which are included with the ballot and sent to NLPES offices across the nation. Ballots are returned by mid-April to our NLPES liaison at NCSL, who tallies the votes.
As elections chair, I'm pleased to announce this year’s four seats will be filled by Wayne Kidd (Office of the Legislative Auditor General, Utah), Katrin Osterhaus (Legislative Division of Post Audit, Kansas), Marcia Lindsay (Legislative Audit Council, South Carolina), and Linda Triplett (Joint Committee on Performance Evaluation and Expenditure Review, Mississippi). Congratulations to all!
The complete 2014–2015 Executive Committee also includes Dale Carlson (State Auditor's Office, California), Greg Fugate (Office of the State Auditor, Colorado), Rachel Hibbard (Office of the Auditor, Hawai‘i), Lisa Kieffer (Department of Audits and Accounts, Georgia), Angus Maciver (Office of the Legislative Auditor, Montana), Nathalie Molliet-Ribet (Joint Legislative Audit and Review Commission, Virginia), and Charles Sallee (Legislative Finance Committee, New Mexico).
Have a good year, all!
Karl Spock is the Executive Committee’s immediate past chair and current chair of the Elections Subcommittee. He can be contacted at email@example.com.
Marcia Lindsay (South Carolina)
It’s time to prepare your submissions for the 2014 NLPES awards! The deadline to apply for an award is Friday, May 9, 2014. Please visit the NLPES website to review the descriptions and guidelines—and consider applying!—for the following awards:
Each award has three judges; remember to provide your submission to all three judges. All of the judges’ information is listed on the NLPES website. If you have any questions about the application process, please contact Marcia Lindsay at firstname.lastname@example.org or (803) 253-7612.
Jim Nobles was appointed in November 2013 to his sixth six-year term as Minnesota's Legislative Auditor. Jim has worked for the Minnesota Legislature since 1972 and has been Legislative Auditor since 1983.
Please let us know if you have staff happenings to share! Email email@example.com
NLPES website—Learn more about NLPES and see what we do by spending a few moments touring our NLPES website. You’ll find general information about NLPES, including bylaws, Executive Committee membership and subcommittees, state contacts, awards, and information on peer review. We also have a training library and resources including past meeting minutes, newsletters, and more. Check out our website resources!
NLPES listserv—The email address for the NLPES listserv has changed. The new address is firstname.lastname@example.org. If you keep an email address book, please update it with this address.
Not already a subscriber? The NLPES listserv is an email discussion group for NLPES members that allows you to send a message to all NLPES listserv subscribers simultaneously. You and your office can participate by posting items such as announcing report publications and posing questions regarding audits, whether practical or philosophical in nature.
Join our NLPES listserv and:
To join the listserv, send an email to Brenda Erickson, the NCSL liaison to NLPES, with the subject “SUBSCRIBE to NLPES Listserv.” Include in your email your name, job title, audit agency/organization name, mailing address (including city, state, zip code), phone number and email address. A “Welcome” message will be sent to you once you are successfully added to the listserv.
See the listserv link on the NLPES website for additional information on how to post messages to the listserv and “netiquette” niceties, like not hitting “Reply All” when answering a query posted to the listserv. You’ll be glad you joined!
Legislative careers website—Know a young professional thinking about pursuing a career with a state legislature? Point them to the opportunities posted on NCSL’s legislative careers website. Job seekers can explore the types of work legislative staffers perform, including performance evaluation, budgeting, fiscal analysis, legal and policy research and opinions, bill drafting, public relations, library services, building security, and information technology support. Opportunities are posted by states offering positions under Legislative Jobs. Launched by NCSL in June 2012, the site is a great resource. According to NCSL, attracting young people to work as legislative staff will be increasingly important in the coming years: even though baby boomers make up about a third of the national workforce, nearly half of legislative staff responding to a survey were 50 years old or older. Replacing those staffers will present challenges. Check out the welcome video, “A Day at the Capitol,” and learn more about the opportunities and rewards of working for state legislatures. Watch the videos under Career Paths to hear from staffers in different states describing their jobs and the satisfaction they gain from a public service career.
NLPES Online Training Library
New: Developing Evaluation Plans
As sessions adjourn, many of us will start to plan new projects. A key part of that process is laying your ideas out on paper. Developing Evaluation Plans describes how to develop a plan to incorporate your research questions/issues, methods, staff assignments, deadlines, and communication strategies into one document that will help your team stay on task and on time.
For more refresher and training materials, visit our NLPES online training library, where there is a wealth of resources on critical thinking, finding savings, interviews, quantitative methods, sampling, survey development, reviewing contracts, effective presentations, report writing and various management topics.
Legislative Summit 2014—will be held on Aug. 19-22 in Minneapolis, Minn. The NLPES Executive Committee will meet during the Summit. For more information on the Summit, contact email@example.com.
2014 Professional Development Seminar—will be held on Oct. 5-8, 2014 at the Sheraton Raleigh Hotel in Raleigh, N.C. in conjunction with NLSSA. For more information, contact Brenda Erickson.
The Working Paper is published three times a year by the National Legislative Program Evaluation Society, a staff section of the National Conference of State Legislatures.
Visit the NLPES website
2013–2014 NLPES Communications Subcommittee:
Dale Carlson (CA)
Charles Sallee (NM)
Rachel Hibbard, newsletter editor (HI)
NCSL Liaison to NLPES:
Brenda Erickson, (303) 856-1391
NCSL Denver Office • (303) 364-7700