The Working Paper

 

chair's corner

Chair's Corner | Wayne Kidd (Utah)

Greetings, auditors and evaluators!  It is a new year for NLPES and autumn cheered us on with scarlet, orange, and gold leaves by the thousands. With the rustling of leaves, there was also some rustling on the executive committee. In October, the committee’s officers changed; I will be serving as chair, Nathalie Molliet-Ribet (Virginia) is the new vice chair, and Marcia Lindsay (South Carolina) is the new secretary.

Within the executive committee, the Professional Development Subcommittee will be headed by Katrin Osterhaus (Kansas) this year. Nathalie Molliet-Ribet will chair the Awards Subcommittee, and Dale Carlson (California) will continue to chair the Communications Subcommittee. Lisa Kieffer (Georgia) will chair the Peer Review Subcommittee and also serves as the immediate past chair. We would also like to extend a warm welcome to Linda Triplett (Mississippi), our newest executive committee member.  We and the other executive committee members look forward to serving you, our NLPES membership.

Autumn’s colorful celebration was not the only event taking place in October. NLPES held its annual Professional Development Seminar (PDS) in Raleigh, North Carolina.  We had 115 registered members from 27 states attend the event.  It was an excellent PDS with informative sessions, insightful discussions, and lots of networking. The executive committee would like to thank the North Carolina General Assembly’s Division of Program Evaluation for hosting the PDS, and especially Carol Shaw for her hard work.

Looking ahead to the upcoming year, the Professional Development Subcommittee broadcast a webinar by the Florida Legislature’s Office of Program Policy Analysis and Government Accountability on Excel pivot tables, on December 4, 2014—if you missed it, you can access the archived version on our website.  The subcommittee will also be busy helping plan the PDS for 2015, to be held in Denver, Colo. We appreciate the Colorado Office of the State Auditor for hosting! We don’t have an exact date yet, but as soon as we do, we will post it on the NLPES website. We hope to see you in Denver!

As for the other subcommittees, the Awards Subcommittee is beginning to look for judges to help review submissions and select winners for next year’s awards. If you are interested in being a judge, please contact Nathalie. Also, please consider applying for an award next year; it is not too early to begin thinking about which report(s) you want to submit. The Communications Subcommittee continues to produce this stellar newsletter, The Working Paper, three times a year; please contribute to the newsletter. Plus, the subcommittee stays busy overseeing the NLPES listserv and maintaining the accuracy and currency of the NLPES website. The Peer Review Subcommittee will help to promote the benefits of peer review and help with the peer review process.  If you have ideas or input in any of these areas, please feel free to contact the subcommittee chairs.

Please take advantage of the resources that NLPES has to offer and the opportunities to get more involved with NLPES.  It is an incredible organization and we are fortunate to be able to associate with outstanding colleagues who are dedicated and want to add value to the legislative process. I am proud to be a part of it!

Wayne Kidd is the NLPES Executive Committee chair for 2014–2015.  He can be reached at wkidd@le.utah.gov.

 

features


Report Radar | Angus Maciver (Montana)

Seasonal greetings and welcome to another action-packed edition of Report Radar, the world’s number one resource for audit and evaluation professionals! While everybody else has been busy shopping/eating their way through the holiday season, the dedicated staff of Report Radar has been trawling the nation’s state legislatures to bring you the latest and greatest in audit and evaluation reporting. This edition groups reports into:  reduce/reuse/recycle, long-term care for the elderly, law and order, information technology, and (the always popular) very interesting.

Reduce, reuse, recycle:  Programs and policies designed to encourage recycling are a common feature in many of our states, and who doesn’t want to reduce/reuse/recycle more? We begin with a November report from our friends in California, which addresses operations of the state’s beverage container recycling program. As with some other recycling programs, this report highlights concerns over financial sustainability, as well as issues relating to fraudulent activity in the program. Recycling vehicle tires is another environmentally-friendly program offered in some states, and we have two reports to share that address this issue.  In June, Colorado released a report on the Waste Tire Processor and End User Program, which discusses efforts to reduce the state’s stockpile of over 60 million waste tires. Similarly, Louisiana released a report in July addressing management of its Waste Tire Management Program that identified issues relating to fee collections for waste tire disposal.  We also have inside information that Hawai‘i will soon be releasing a report on glass recycling; look for it on their website.

Long-term elder care:  If your office has been asked to look at long-term care for the elderly, you are not alone. We have four reports to highlight from member offices that have recently addressed different aspects of this issue.  In May, New Mexico released a report on the Aging and Long-Term Services Department, which addressed resource allocation, cost, availability, and effectiveness of the state’s area agencies on aging. Illinois released a report in July focusing on efforts to expedite eligibility determination and enrollment in Medicaid-funded long-term care facilities.  We also have two reports from October addressing other issues relating to long-term care for the elderly: North Carolina has an evaluation of a new adult day care program designed to provide overnight respite for caregivers, and California looked at the management of investigations into complaints about long-term care facilities.  Each of these fascinating reports addresses a different aspect of an issue that is only likely to become more important in our aging society, so get reading.

Law and order:  First, we have a couple of reports relating to police agencies or services. In September, our people in Pennsylvania took a look at the potential impacts from consolidation of municipal police departments across the state. We also recommend an October report from Maryland, which addressed the Department of State Police Aviation Command, including the use of police helicopters for emergency medical transportation, aerial law enforcement, search and rescue, homeland security, and disaster assessment services. In December, Nevada released a report on the fiscal costs of the death penalty. Alert readers of Report Radar will remember a similar report from Idaho earlier this year; both these reviews could prove useful if your office is asked to look at capital punishment issues.

Information technology:  The fast-changing landscape of Information Technology in state government is our next subject. We begin with those fine people in Kansas and their October report on an IT modernization project for the state’s Department of Motor Vehicles; many states are planning for or in the process of upgrading IT systems used in vehicle title and registration and driver licensing functions, so if you are interested, pick up this valuable report. Another area of interest in the IT arena is efforts at warehousing and sharing some of the large data sets typically maintained by different agencies; if this is where things are going in your state, you should read a report released in August by Michigan, which looked at data reliability and access control for the state’s multi-agency enterprise data warehouse. Yet another area of frequent inquiry is the governance models used in decision-making and managing IT resources; a very satisfying report from Virginia in September provides a great overview of this subject.

Very interesting:  Last, but almost certainly not least, is our collection of miscellaneous reports from around our member offices. Minnesota released a report in October looking at the state’s use of federal funding to implement a health insurance exchange under the Affordable Care Act; good reading for those of us anticipating future work in this area. Connecticut made an initial report in October on regulation of drone or unmanned aerial vehicles. And finally, next time you are feeling sorry for yourself (or somebody else) because of the sheer number of recommendations your report contained, have a look at the October report from Georgia on the state’s Government Transparency and Campaign Finance Commission, which contained 42 recommendations; this is an admirably thorough and detailed piece of work and very worthy of your time.

That’s all folks. All the staff at Report Radar extend their warmest seasonal and/or holiday greetings. Stop by in the New Year for another serving of audit and evaluation excellence.

Angus Maciver is the deputy legislative auditor for the Montana Legislative Audit Division. He can be reached at amaciver@mt.gov.


State Profile:  Mississippi Joint Legislative Committee on Performance Evaluation and Expenditure Review (PEER) | James Barber (Mississippi)

PEER is a standing committee of the Mississippi Legislature composed of 14 members, seven from each house of the Mississippi Legislature. Created by statute in 1973, it was one of the first entities of its type in the nation.

Because Mississippi does not have partisan or research staff, PEER’s work extends beyond traditional legislative evaluation. For 41 years, PEER has provided short-term informational assistance as well as in-depth evaluations of government operations to support the function of legislative oversight.

The PEER Committee currently employs 25 staff, the majority of whom are classified as Analysts. PEER has five main activities:

  • Conducting audits and evaluations. PEER has statutory authority to review any state or local entity that receives public funds, including contractors.  PEER’s reviews may have multiple objectives and can be one of many formats, such as descriptive summary, investigation, compliance review, management review, economy and efficiency review, program evaluation, or policy analysis. The committee publishes results of these reviews in reports that are distributed to the Legislature and the public. PEER has released 586 formal reports since its inception.
  • Responding to legislative requests. By law and committee rules, PEER provides assistance to any legislator or legislative committee upon request. These may range from simple informational requests to complex direct assistance for committees or subcommittees. Legislative assistance work products include memoranda, bill drafts, bill summaries, fiscal notes, briefings or presentations, facility inspections, maps, public hearings, consultations with state agencies, and administrative assistance to task forces or study committees.
  • Performing background investigations on gubernatorial and other appointees. The state’s Senate has the power to confirm the appointments of executive board members and, in some cases, agency directors. Although not required to do so by statute, Senate committee chairs routinely ask PEER to conduct background investigations of appointees to assess individuals’ qualifications and general fitness for office.
  • Assisting with Mississippi’s performance budgeting effort.  In 2011, legislative leadership asked PEER to help the Legislature revitalize performance budgeting and thereby help ensure public dollars are efficiently expended on programs and activities that are proven to achieve desired outcomes.  PEER has also been involved with the Pew-MacArthur Results First Initiative, a cost-benefit and evidence-based method for analyzing state government spending.
  • Performing audits and inspections of the state’s correctional system. PEER Committee staff includes the state’s Corrections Auditor, who is required by statute to audit accounts of the Department of Corrections, report on the letting of bids by the department, and make a periodic inventory of departmental assets.  The Corrections Auditor provides monthly reports to legislative leadership and assists with corrections-related legislative assistance and projects.

Since its creation, PEER has been integrally involved with NCSL and NLPES. John Turcotte, a former PEER Executive Director, was a charter member of the then-Legislative Program Evaluation Society and assisted in drafting the bylaws. Three current or former PEER staff—Max Arinder (current director), John Turcotte (now the Director of North Carolina’s Program Evaluation Division), and Steve Miller (now Chief of the Wisconsin Legislative Reference Bureau)—have served as NCSL Staff Chair. PEER staff have also served on the NLPES Executive Committee and as attendees or panelists for NLPES professional development seminars.

Now for the really important information about the PEER Committee staff!

  • Best answer to the question “What does PEER do?”: “We answer complex questions in ten words or maybe less.”  (count ‘em)
  • Most interesting backhanded compliment ever received: “Those folks don’t mind amessin’ [not a literal translation] with people!”
  • Phrase often heard from legislators: “I know you’re busy and it’s five minutes ‘til five, but…”
  • Phrase most welcomed by staff: “By a quorum vote of the Committee, the report is approved for release.”
  • Phrase most dreaded by staff: “The editor wants to see you.”
  • Most honest comment at an entrance conference: “Look, I know you’re not glad to be here and you’re certainly not here to help, so just cut to the chase.”
  • Most honest comment at an exit conference: “All I want to do is finish and get the hell out of here.”
  • Most chilling agency comment at an exit conference:  “What a relief!  You didn’t find half of what I thought you would…”
  • Most disconcerting evaluation experience: Being locked in the gas chamber at the state prison.  (They were just joking around…we don’t even think there was cyanide in the canister.)
  • A top 10 evaluation finding:  When reviewing meal receipts, finding that someone had requisitioned the time of day printed on the receipt rather than the amount of the meal — and got paid for it.  ($12.15 to be exact.)
  • Best incoming phone call:  “I’m doing a research report and would like to request some information on peer pressure.”
  • Most frequent wrong number call received by PEER: Many, many calls for PERS—the Public Employees Retirement System. (Maybe these people are trying to tell us something…)

We love our work.  We love our Legislature.  And we love being a part of the legislative program evaluation community of this great country.  Y’all come see us!

James Barber is the deputy director for PEER and former editor of this newsletter.  He can be reached at james.barber@peer.ms.gov.


Research and Methodology: Using GIS to Enhance Evaluation Reports | Ian Green (Oregon)

Last year, the Oregon Audits Division began expanding the use of Geographic Information System (GIS) software—a tool used to create maps—to improve the quality of our reports.

Visuals help engage readers

In our modern world, policy makers face a steady barrage of reports from multiple sources.  No one has enough time to read every page that passes by his or her desk. The reality is that legislators and their staff skim reports. As program evaluators, we should be aware of this fact.

We must strive to capture the attention of our audience. The use of striking visuals is a great way to do so. A wonderful visual can engage readers, signaling them to read the accompanying text of the report.

An important caveat: the quality of the text should match the quality of the visual. You can have the best graphic in the world, but if the text is dense and lifeless, you will lose the reader.

Maps are a great example of a striking visual. Everyone can easily relate to and interpret most maps. Not only are maps easy for people to understand, but legislators often focus on issues in the districts they serve. By creating maps, you can effectively engage policy makers with the information they want.

Maps can tell stories

The adage goes, “a picture is worth a thousand words,” but maps can also tell a persuasive story. Figure 1 is an example from our audit of Oregon’s Temporary Assistance for Needy Families program.

The program is very sensitive to economic downturns. When millions of people lost their jobs during the great recession, they sought public assistance in record numbers. That should not come as a surprise, since helping people during tough times is the raison d’ĂȘtre of these safety net programs.

By highlighting unemployment rates, we reminded our readers of the economic conditions of the time. This important context was essential for our readers to consider when evaluating our findings.

Maps can effectively highlight issues

One of the best benefits maps provide is the opportunity to spot problems that would be invisible without a geographic perspective.

Below is a classic map.  It is 1854 and London is facing a daunting epidemic of cholera. At the time, no one knew how the disease spread. Day by day, more Londoners succumbed to the disease.

John Snow, a local physician, set out to identify the cause of the outbreak. John believed that once the cause was identified the city would be able to develop a plan of action to stop the outbreak.

John examined the relationship between the location of individuals who had died and the location of their water source. It was clear from his analysis that the Broad Street Pump in the center of the map was the source of the outbreak. He convinced local officials to disable the pump based on this information and helped quell the epidemic.

Fraud detection

Recently, GIS has found applications in fraud detection. GIS supplements other fraud tools by looking at fraud from a spatial point of view.

Consider for example the Medicaid program: The program has both clients and providers. It may be suspicious if you find a provider who is serving multiple clients who live hundreds of miles away. By looking at average driving distance per provider, you might uncover some fraud that would otherwise be missed. This concept can extend to any program where you can analyze some geographic component.
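The provider-distance screen described above can be sketched in a few lines. This is a hypothetical illustration only, with made-up coordinates and a made-up record layout (the `claims` list and the `THRESHOLD_MILES` cutoff are assumptions for the example), not real Medicaid data or a prescribed audit method:

```python
import math
from collections import defaultdict

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in miles."""
    lat1, lon1, lat2, lon2 = map(math.radians, (lat1, lon1, lat2, lon2))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 3959 * 2 * math.asin(math.sqrt(a))  # Earth radius ~3,959 miles

# Hypothetical claims records: (provider id, provider location, client location)
claims = [
    ("P1", (45.52, -122.68), (45.50, -122.65)),  # client a few miles away
    ("P1", (45.52, -122.68), (45.55, -122.60)),  # client a few miles away
    ("P2", (45.52, -122.68), (42.33, -83.05)),   # client roughly 1,900 miles away
]

# Collect each provider's client distances (straight-line here, as a
# stand-in for driving distance), then average them per provider.
distances = defaultdict(list)
for provider, (plat, plon), (clat, clon) in claims:
    distances[provider].append(haversine_miles(plat, plon, clat, clon))

THRESHOLD_MILES = 100  # arbitrary screening cutoff for illustration
flagged = {p: sum(d) / len(d) for p, d in distances.items()
           if sum(d) / len(d) > THRESHOLD_MILES}
# 'flagged' now holds only providers whose average client distance
# exceeds the cutoff and who may merit a closer look.
```

In practice you would geocode client and provider addresses first and pick a cutoff suited to your state's geography; a large rural state might tolerate far greater distances than a compact urban one.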

Free tools galore

A quick internet search for “open source GIS” will yield multiple products to try. QGIS and GRASS GIS are two great open source tools.

Google also has some easy-to-implement interactive web maps. The U.S. Census Bureau is an excellent source of data to map; it recently put together an easy-to-use Census Data Mapper.

Sometimes you need to pay the price

For some, the options mentioned above will meet your needs. For others, you may have to open your checkbook to get the right tool.

The simplest to use, and one of the least expensive, commercial options is SocialExplorer. It takes only a few seconds to create beautiful maps using the intuitive interface. SocialExplorer even has a free version with limited functionality. One drawback to SocialExplorer is that you cannot import your own data and are mostly limited to census surveys. It also lacks some customizable features found in other programs. If you want to start somewhere on your path to GIS, I recommend starting here.

ArcGIS and Tableau are two other power players in the data visualization market. The features and analytical capabilities of their software, and in particular ArcGIS, are amazing. The drawback is the cost of licenses and the amount of staff time you need to invest to learn the software. However, the benefits can far outweigh the costs when you leverage the tool properly.

Good luck on your journey with GIS!

Ian Green is a senior auditor with the Oregon Secretary of State. He can be reached at ian.m.green@state.or.us.


Staff Profile:  Meghan Westmoreland (Colorado)

I had heard about the Colorado Office of the State Auditor (OSA) as I was finishing up my final semester of college at the University of Colorado in Boulder.  I was always interested in public policy and the legislative process, but was focused on my degree in International Affairs and economic development in Africa.  After spending a semester abroad in Senegal, a French speaking country in West Africa, and learning about public health, economic development, and the environment and culture, I decided to pursue a career in international non-profit work.

However, 2009 was not a great year to be a recent graduate!  In the meantime, I moved to Huntsville, Ala., home of many of the federal government’s defense contractors and NASA programs.  While in Huntsville I worked at a variety of jobs—everything from teaching LSAT prep courses to Starbucks.  I learned a lot in my time at Starbucks about managing people, running a business efficiently, and tackling a constant variety of problems, like 4 a.m. phone calls about broken pipes or running out of espresso.  In the midst of daily chaos, I focused on “What can we do to make our work more efficient and outcomes for customers even better?”  Apparently I was auditing before I even realized it!  I also had the chance to plan several community service events for our employees and customers, including a food bank drive, yard cleanups for the elderly, and cooking meals for families with extended stays in a nearby hospital.  I enjoyed my job, its challenges, and being involved in the community but I knew I wanted a career where I could have a greater impact on the community.

Then my sister emailed me about an internship opportunity at OSA.  So, four years after hearing about the OSA, I applied for a job as a performance auditor.

What initially drew me to the position was the accountability and oversight the office provides to the public for Colorado’s government agencies and programs.  At the OSA we get to look at the facts, numbers, and laws and determine from an objective standpoint what can help the agency or program function in a more efficient and equitable way.

After working here for over a year, what is most rewarding about performance auditing is that it gives me the opportunity to delve into a variety of programs and state issues for a brief period of time.  I have always had a lot of varying interests in subjects from public health to the environment, to economic and social issues; it was hard to choose which path I wanted to follow, but at the OSA I get to do it all.  I have been able to work on auditing human services programs, health care programs, and even a program for victims of crime.  I also enjoy that the type of work is constantly changing, whether it is data analytics, agency site visits, interviewing staff, writing and editing, or giving public presentations to legislators; the work is always changing and I have the opportunity to learn and practice new skills.

Meghan Westmoreland is a legislative performance auditor with the Colorado Office of the State Auditor.


Lessons Learned: The Auditor's Comments | Florida Office of the Auditor General (September 1992)

[Editor's Note:  This article is the second in an occasional series of articles expertly culled from past issues of the NLPES newsletter.  We trust you will find the advice herein equally as valuable as it was upon first publication.]

The problem:  What could have been done to reduce the possibility that the following might occur?

After issuing a performance audit report, an agency employee called and said that comments made to the auditor during the audit about organization operations and upper management had, in the employee’s opinion, resulted in the employee being demoted.  The auditor felt terrible and wondered if there was anything that could have been done during the audit to prevent the situation.

In all likelihood, the auditor did nothing more than his or her job, and the employee's demotion would have occurred anyway, for reasons unrelated to the audit.  However, we must recognize that on occasion, attempts are made to use auditors to carry out personal agendas.  One of a performance auditor’s responsibilities is to describe the efficiency and effectiveness of agency operations and management’s discharge of duties.  In doing this, it is possible that either the legislature or agency will take action that may result in demotions, dismissals, or other personnel movements.

The solution(s)?  Following are some suggestions to consider when interviewing agency employees who are critical of upper management:

  • Avoid using comments from lower level employees about the performance of upper management.  Recognize that such comments in a published report place those employees between the auditor and upper management, which can have serious consequences on the lower level employee.
  • Recognize that employees may not have access to all the facts and information used by upper management in making decisions. While management’s decisions on program operations may be different from those employees would have made, they may not be illegal or even wrong decisions.
  • When possible, auditors should verify information provided by agency employees so the report reflects the auditor’s words, not those of the employees.  Avoid using, “According to agency staff…”  Audit findings should involve the auditor and agency management, not agency management and agency employees.  Agency management decisions should carry considerable weight in the absence of overwhelming proof (not to be confused with verbal accusations) that actions were not in a program’s best interest.
  • In situations when it is appropriate to quote lower level employees (for example, survey results of an agency training program), be careful to quote exactly what employees said. There is a difference between reporting that employees said they did not receive “adequate” training and reporting that employees said that they did not receive “enough” training.
  • Comments obtained during interviews and in response to an auditor asking, for example, “Is there anything else you want me to know?” should be put in perspective and not be assumed to be true.  If information is material to the audit objectives, it should be brought to the attention of the auditor-in-charge and included in the audit as an additional objective.  If at all possible, information should be presented to other knowledgeable employees for comment.
  • Upper management should be made aware of potential audit findings during audit fieldwork. Findings based solely on employee comments or opinions are subject to upper management’s rebutting that they have no idea what the auditor is talking about.  In addition, lower level employees may say the auditor must have misunderstood what they meant. At that point, an entire finding can be lost and the auditor may be left with little else to report.

The upshot:  Auditor comments should be just that, “the auditor’s comments.”  Audit findings, when possible, should be based on auditors’ actions to verify comments made by agency employees, not solely on employees’ stated opinions.

Written by (unnamed) staff of the Florida Office of the Auditor General. This article first appeared in the September 1992 issue of the NLPES News.


A Short History of How Our IT Security Audits Evolved | Katrin Osterhaus (Kansas)

The Kansas Legislative Division of Post Audit employs 25 staff, including an administrative officer and an IT technician.  Working alone, one of our staff started performing IT audits on various state agencies in 2003.  Between 2003 and 2009 those audits often made the news and called attention to a subject matter that had been largely ignored. By June 2010, we had lost not only our one IT staff auditor; our Post Auditor, Deputy Post Auditor, and Financial Audit Manager had all retired as well.

In an effort to revitalize the IT audit function, our new Post Auditor obtained input and direction from our Legislative Post Audit Committee.  Again, we were scrambling to produce IT reports based on a three-year plan – except we had no audit team assigned to that work. We managed to steal an IT policy guru from our Department of Revenue, and together with a few seasoned performance auditors, they embarked on a new era of IT audits.

The team’s first report was released in July 2011.  The audit covered five agencies, and focused on their personnel-related security policies and procedures.  To no one’s surprise, we found several agencies did not conduct adequate background checks, and none consistently trained employees on security awareness or acceptable use of IT resources.  We also found that the entity in charge of coordinating the state’s IT standards (the Information Technology Executive Council, or ITEC) did not communicate those standards to agencies.

Once the team had begun, findings piled up quickly.  In December 2011, we released an audit showing that several agencies had significant vulnerabilities because they didn’t adequately patch their workstations.  In 2012 and 2013, the team worked on year-long reports released each December.  Each report studied 8 to 9 agencies and covered additional aspects of IT security, such as password control, patching processes for workstations and servers, IT security training, continuity of operations in the event of emergencies, anti-virus processes, security management process, and IT inventory.

In January 2014 we began a new three-year audit plan.  We started with a statewide audit that included information about the types, volume, and variety of sensitive data agencies have.  We also made changes to our IT audit process based on agency feedback and lessons learned.  For example, agencies indicated they were often unclear what our criteria were, and concerns were raised about our barrage of requests for information that trickled in over a prolonged period of time.  Lessons learned focused on the process itself:  for instance, our previous process was too time consuming.  We spent too much time writing up findings and synthesizing them across several layers of workpapers.  We also spent too much time creating our reports.  Our IT reports had been patterned after regular performance audits—with fully developed thesis statements, paragraphs, and our normal extensive review—despite the fact that their distribution was very limited and readers did not get to keep copies of the confidential reports.

We decided to revamp our process by expanding the IT security areas we reviewed from seven or eight areas to 20. We now evaluate roughly 100 different items, mostly specific requirements set out by ITEC plus a few best practices. We evaluate many of these requirements at an agency-wide level.  We also evaluate specific IT security controls for one application that holds sensitive information per agency. Our IT audit process now takes the following steps:

  • Phase I – Agencies complete a self-assessment regarding compliance with requirements and best practices. This allows them to understand what we will be auditing them against, and gives them a chance to self-evaluate.
  • Phase II – We hold an in-depth interview to learn about the agency’s IT function and ask follow up questions based on their self-assessment. 
  • Phase III – We conduct fieldwork onsite. We review policies and procedures, training files for employees, computer screenshots, and look at other processes or documentation to determine whether the agency is in compliance with ITEC requirements and best practices.  We scan workstations and servers to determine whether the agency adequately patches its machines to prevent known vulnerabilities. 
  • Phase IV – We document, discuss, and synthesize our findings, which occasionally requires us to follow up with agencies. We assign each problem a severity level, ranging from critical risks to technical findings. A critical risk is a vulnerability that creates an imminent threat of data loss or theft and should be addressed immediately. A technical finding is a weakness in an agency’s documentation or security process that is unlikely to lead to present or future vulnerabilities. We then communicate our preliminary findings to the agency in an exit interview.

Finally, we revamped the audit report itself.  Individual agency reports are still confidential, but now they largely consist of finding sheets we compile as part of our audit. Reports include a paragraph summarizing major findings and some background on each agency. This has saved the team a lot of writing time while still providing important information to the agency and our committee. Our newfound efficiencies have allowed us to double the number of agencies we audit and triple the scope of each audit. On average, our IT audits take about 25 days; however, we have several underway at once, spanning several months.

Katrin Osterhaus is a principal auditor with the Kansas Legislative Division of Post Audit. She can be reached at Katrin.Osterhaus@lpa.ks.gov.


PDS 2014: Raleigh Reflections | Dale Carlson (California)

"Earn this!"

I hope I'm not spoiling anything for anyone who happens to have not yet seen Steven Spielberg's 1998 classic Saving Private Ryan.  At a poignant moment in the film, Captain Miller (Tom Hanks' character) tells Private Ryan (Matt Damon's character), "Earn this!"

Am I really connecting this great film moment to NLPES' Professional Development Seminar (PDS)—you know, the one held in early October 2014, in Raleigh, N.C., where about 120 program evaluation and audit staff from more than half the states met for two-and-a-half days?  The short answer is yes; yes I am.  Please bear with me as I explain.

I've always understood Captain Miller's statement to mean, "Do something worthwhile to merit the effort and cost of what we did."  This same concept applies to the PDS. In this scenario, I play the Private Ryan role while my office plays the Captain Miller role.  In other words, what can I do to "earn this" in terms of the effort and costs my office incurred sending me to the Raleigh '14 PDS?

So what did I do to "earn this"?  What benefits did I earn based on my office's investment in Raleigh '14? ...benefits for my office? ...benefits for me?  I could probably offer many different answers.  Instead, I’m going to limit it to two.

The obvious first benefit is training. I attended nine training sessions covering a variety of topics.  From those, I earned about 14 hours of CPE to count toward my 80-hours-every-two-years requirement.  That's not too bad.  One block of training, and I knock out nearly 20 percent of my two-year training requirement.  Based on just this answer, however, I doubt that my office would be convinced that I've “earned this.”

A not-so-obvious (perhaps) second benefit—but in my mind, the better one—is "takeaways".  Stated another way, what messages or information did I bring back to my office to help it improve the way it does things and/or to help me improve the way I do my job?  And by improve, I mean the "faster, better, cheaper" mantra that underlies almost everything we do.

Here are just six of the many takeaways I gathered from Raleigh '14:

  • Washington’s Joint Legislative Audit & Review Committee (JLARC) issued its first online-only report in September 2014.  This online version is scalable, providing options to readers regarding the depth of detail provided.  You can find JLARC’s report here.
  • Hawaii’s Office of the Auditor has a very cool brochure—What to Expect During an Audit—that it distributes to auditees.
  • Legislative members are getting younger, especially in term-limited states, and they’re more tech-savvy than their predecessors.
  • Many, many states have the same worry about how to issue shorter reports that still provide meaningful information to the different types of stakeholders who read their reports.
  • Social media is well on its way to becoming a huge professional communications tool.
  • BLUF (a U.S. Army acronym) is short for bottom line up front.  In other words, state your conclusion first and then give the supporting details later.

Clearly, some of my takeaways came from the training sessions.  The sessions I attended covered topics like working with multi-generational workforces, audit surveys, and audit agency efficiency and effectiveness.

My takeaways also came from networking with my evaluation colleagues from the other states.  At one point or another, I had conversations with delegates from nearly all states attending Raleigh '14.  I talked with them at breakfast, lunch, dinner, on breaks between sessions, and even while strolling from Raleigh's capitol complex back to the hotel. I’m always amazed at the different ways states perform essentially the same task of evaluating public sector agencies and communicating the results of what they find.  And to tell you the truth, I’m not above plagiarizing some of their practices to help me or my office improve.

While my networking conversations sometimes started with the "how's the wife/kids"-type topics, they invariably drifted toward work issues.  Issues like current, hot-topic audit concerns (e.g., rail transportation of crude oil); different processes for writing reports (e.g., early visioning); the pending "silver tsunami"; and tribulations of recruiting and hiring new evaluators to our offices.

Shortly after returning from the conference, I emailed a summary list of my Raleigh '14 takeaways to my executive office (EO).  I'd like to think that I gave my EO at least a nugget or two worth considering.  Perhaps my EO had already heard them before; perhaps not.  But at least I put the ball in play. As for me, I’m already pondering how I’m going to translate the takeaways into action within my own sphere.

My challenge to you is simple:  If you attended Raleigh '14...earn this!  In other words, make Raleigh '14 worth the effort and costs your offices incurred to send you there.  Share your takeaways with others in your own offices. Consider the new ideas you learned, implementing those that seem to fit with you or your office.  And for those ideas that don’t seem to fit, well, perhaps it’s time to consider doing some things differently.

See you at Denver '15.

Dale Carlson is the 2014-2015 Communications Subcommittee chair.  He can be reached at dalec@auditor.ca.gov.

News & Snippets

 


Stop the Presses!  Audit Shops in the News

Legislative program evaluation shops across the country regularly garner media attention for their work, an indicator that stakeholders are paying attention and one that increases the likelihood that recommendations will be implemented and performance will improve.  The following list summarizes five audit reports that appeared in the media in September and October 2014.

TEXAS—OFFICE OF THE STATE AUDITOR
UNT should repay $75.6 million to state
Sept. 26, 2014 – NBC Dallas-Fort Worth

The university obtained excess funding by manipulating payroll expenditures in the Uniform Statewide Accounting System and by paying employees with state funds when they were not eligible for them (September 2014). Full report and other video articles.

CALIFORNIA—OFFICE OF THE STATE AUDITOR
California audit calls for better use of rape kits
Oct. 10, 2014 – NBC San Diego

Rather than follow formal policies, the agencies allow their investigators to use discretion in deciding whether to request a kit analysis based on the specific circumstances of each case (October 2014).  Full report and other video articles.

IOWA—OFFICE OF THE STATE AUDITOR
State audit finds credit card misused by former Monona County Auditor
Oct. 9, 2014 – RadioIowa.com

The $7,119.44 of improper charges includes purchases at convenience stores, retail vendors, Winnavegas Casino, and the Apple iTunes Store. The purchases also include 10 purchases of $100.00 debit cards from Wal-Mart (October 2014). Full report and other articles.

UTAH—OFFICE OF THE STATE AUDITOR
Audit finds oversight lacking in $600 million Utah business-incentive program
Oct. 15, 2014 – FoxBusiness.com

Though advised by a board of industry professionals, the Governor’s Office of Economic Development’s executive director has sole authority to authorize incentives with minimal oversight (October 2014). Full report and other video articles.

MINNESOTA—OFFICE OF THE LEGISLATIVE AUDITOR
Audit finds MNsure missteps with marketing work
Oct. 28, 2014 – The Washington Times

MNsure did not appropriately authorize $925,458 of additional marketing work or execute a contract amendment until after the contractor completed work (October 2014). Full report and other articles.

Share your coverage with us!  If you would like an article highlighted in our next newsletter, send a hyperlink to rhibbard@auditor.state.hi.us. 


Web Links

NLPES website—Learn more about NLPES and see what we do by spending a few moments touring our NLPES website.  You’ll find general information about NLPES, including by-laws, executive committee membership and subcommittees, state contacts, awards, and information on peer review.  We also have a training library and resources including past meeting minutes, newsletters, and more. Check out our website resources!

NLPES listserv—The NLPES listserv is an email discussion group for NLPES members. By sending a message to nlpes-l@lists.ncsl.org, you can reach all listserv subscribers simultaneously.  Listserv members:

  • Are able to query other states about evaluation work similar to their own current projects
  • Receive announcements about performance evaluation reports and job opportunities from other states, and
  • Are notified when the latest edition of this newsletter is available!

To join the listserv, send an email to Brenda Erickson, NCSL liaison to NLPES, with the subject “SUBSCRIBE to NLPES Listserv.”  Include your name, job title, audit agency/organization name, mailing address (including city, state, zip code), phone number and email address.  A “Welcome” message will be sent to you once you are successfully added to the listserv.

See the listserv link on the NLPES website for additional information on how to post messages and “netiquette” niceties, like not hitting “Reply All” when answering a query posted to the listserv.  You’ll be glad you joined!

Legislative careers website—Know a young professional thinking about pursuing a career with a state legislature? Point them to the opportunities posted on NCSL’s legislative careers website. Job seekers can explore the types of work legislative staffers perform, including performance evaluation, budgeting, fiscal analysis, legal and policy research and opinions, bill drafting, public relations, library services, building security, and information technology support.  Opportunities are posted by states offering positions under Legislative Jobs.  Launched by NCSL in June 2012, this is a great website.  According to NCSL, attracting young people to work as legislative staff will be increasingly important in the coming years.  And even though baby boomers make up about a third of the national workforce, nearly half of legislative staff responding to a survey were 50 years old or older.  Replacing those staffers will present challenges. Check out the welcome video, “A Day at the Capitol,” and learn more about the opportunities and rewards of working for state legislatures. Watch the videos under Career Paths to hear from staffers in different states describing their jobs and the satisfaction they gain from a public service career.

Online training library: NLPES training products matrix—For a variety of refresher and training materials, visit our NLPES online training library, where there is a wealth of resources on critical thinking, finding savings, interviews, quantitative methods, sampling, survey development, reviewing contracts, effective presentations, report writing, and various management topics.
 

Ask GAO Live—Have you seen this website?  We just discovered AskGAOLive, a 30-minute live chat in which GAO staff discuss a specific report and its research, and answer questions that are emailed or tweeted in. Sessions are recorded and archived on the website. You can also “follow” GAOLive to receive advance notice of chat sessions. Topics include veterans and higher education, prescription drug shortages, prison overcrowding, state and local fiscal outlook, and government contracting.
 

Training from the NLPES online library: rapid response research—As sessions start up across the country, many NLPES offices will receive urgent requests for research assistance.  In response, we’ll provide a range of products from verbal briefings to memoranda to published reports.  For a quick refresher on how to handle rapid response assignments, please visit our NLPES online training library.  The narrated PowerPoint on rapid response assistance offers a brief five minutes of good advice put together by Mississippi PEER.

The Lighter Side: The the Impotence of Proofreading | Taylor Mali*

Has this ever happened to you?
You work very horde on a paper for English clash
And then get a very glow raid (like a D or even a D=)
and all because you are the word’s liverwurst spoiler.
Proofreading your peppers is a matter of the the utmost impotence.

This is a problem that affects manly, manly students.
I myself was such a bed spiller once upon a term
that my English teacher in my sophomoric year,
Mrs. Myth, said I would never get into a good colleague.
And that’s all I wanted, just to get into a good colleague.
Not just anal community colleague,
because I wouldn’t be happy at anal community colleague.
I needed a place that would offer me intellectual simulation,
I really need to be challenged, challenged dentally.
I know this makes me sound like a stereo,
but I really wanted to go to an ivory legal collegue.
So I needed to improvement
or gone would be my dream of going to Harvard, Jail, or Prison
(in Prison, New Jersey).

*Excerpt only.  See the entire poem.


Upcoming Events

This is a quiet time of year for the newsletter as we all gear up for legislative sessions and focus on our day jobs.  However, the NLPES Executive Committee continues its work behind the scenes and will meet briefly in the spring.  We'll keep you apprised of news and issues via the NLPES listserv, and look for the Spring 2015 edition of The Working Paper in May 2015!

The 2015 NLPES Professional Development Seminar will be held in Denver in October. Details and exact dates will be shared when they are available.


The Working Paper is published three times a year by the National Legislative Program Evaluation Society, a staff section of the National Conference of State Legislatures. NLPES serves the professionals of state legislative agencies engaged in government program evaluation. The purposes of NLPES are to promote the art and science of legislative program evaluation; to enhance professionalism and training in legislative program evaluation; and to promote the exchange of ideas and information about legislative program evaluation.

The Working Paper is produced by the NLPES Communications Subcommittee.
Dale Carlson, 2014-2015 chair (CA)
Rachel Hibbard, newsletter editor (HI)


NCSL Liaison to NLPES
Brenda Erickson, (303) 856-1391
NCSL Denver Office • (303) 364-7700