NLPES Question of the Month

January-February 2006:

HOW DOES YOUR AGENCY FOLLOW UP ON THE RECOMMENDATIONS OF PREVIOUS REPORTS?  (For example, do you systematically determine whether an agency has implemented your recommendations?  If so, when does this occur, and do you convey this to legislators?  In what circumstances do you conduct new research to verify what changes have occurred?)



Rick Riggs, Kansas

Each spring, I contact the agencies we audited in the previous calendar year and ask for an update on what they've done to implement the audit recommendations.  I ask for documentation (e.g., copies of new policies), but we do no additional audit work.  By the rules of our Committee, the report resulting from this work has to be published by September 1 of each year.

 



Jennifer Jones, Texas Sunset Advisory Commission

The Texas Sunset Advisory Commission uses compliance reviews to determine how well agencies are implementing the recommendations in their Sunset legislation from the previous legislative session.  For these compliance reviews, staff prepare an implementation chart listing each provision of the Sunset bill. Agencies must complete these charts, explaining how each provision has been implemented, and submit supporting documentation. Staff analyze the agencies' responses to verify the information, which may require asking for additional information or meeting with agency staff.

Staff report initial compliance results to the Sunset Commission during one of its last meetings before the legislative session begins.  Staff publish final compliance results in the Report to the Legislature just after the start of the session following the one in which the Sunset bills passed.

 



Stephanie Hoffman, Washington

Good question: our office has just been thinking about how to improve our process for following up on recommendations.

For a number of years, we have done semiannual tallies of the status of our audit/study recommendations.  We maintain a database of all our report recommendations and ask analysts to periodically update the database with one of seven "status" categories: Implemented; Underlying Issue Addressed; Action in Progress; Partial Implementation; Not Implemented to Date; Situation Changed and Recommendation Is No Longer Valid; and Recent Reports with Completion Dates in the Future.

Analysts determine the appropriate recommendation status category in several ways.  Ideally, our staff will have been asked to do a formal follow-up report that allows us to review the status of the recommendations in depth.  If that is not the case, we rely on written progress reports from agencies that may have been required as part of our audits/studies.  Otherwise, we have to rely on phone conversations and email exchanges to determine whether or not an agency has implemented a recommendation.  When this happens, we make sure to note on our status reports that this follow-up information has not been verified using standard audit procedures.

Once analysts have completed this semiannual (or sometimes annual) update, we issue a report to our legislative committee that shows the percentage of recommendations that have or have not been implemented by the Legislature or state agencies.  We typically roll the seven status categories into four or five broad categories.  Our latest report shows the status of recommendations from the last two years, but in the past we have gone back further.

We are considering standardizing this process even more by sending regular letters to all audited agencies asking them for a status update, so that we have something in writing to work from when we prepare follow-up reports.

 



Barbara Rogers, Wyoming

In Wyoming, we began doing routine follow-ups about 15 years ago.  It was apparent back then that agencies were breathing a huge sigh of relief once they got through the meeting with the legislative committee and our report on them had been released.  There was a sense that they needed to do no more than just survive the initial ordeal.

Our committee, the Management Audit Committee, instituted a routine follow-up two years after the release of a report, at which time we ask the agency to provide a written account of the actions they've taken on each recommendation from the report and the results they are seeing.  Sometimes we give the Committee the agency's information as submitted, but usually we rewrite it to simplify and clarify it, and we add our own summary of the context, questions, and concerns.  In any case, we do not do independent work to verify the validity of what the agency claims.

The agency then meets with the Committee in executive session to discuss the information.  The thinking is that if the Committee is not persuaded by what they hear and read, they can assign staff to revisit the topic in a new study of the same issue, or perhaps ask us to look at a related but slightly different angle of the subject matter.

When a report is first considered and released, if an agency appears to be dragging its feet or, shall we say, is reluctant to work on the recommendations, the Committee may ask it to testify at an upcoming meeting about the progress it is making, or it may ask for a written update in a few months.  These options occur in advance of the two-year follow-up and do not take its place; they act as a nudge or reminder to the agency that there's more to come.

My read on the follow-up process is that the Committee finds these reviews very satisfying in the sense that there's a feeling of completion about them:  problems have been pointed out, recommendations have been made to fix those problems, and often the fixes have taken place.  This makes the program evaluation process more of a closed loop than many other legislative activities, which often involve open-ended, ongoing issues. The process also spurs agencies to actually do something other than just aim to live through the pain of the evaluation process.

Unlike some other states, our group has not attempted to put a price tag on the dollar savings that our recommendations could generate.  In part this is because our recommendations tend to be heavily policy-oriented, and any savings associated with implementing them ("The Legislature's and Governor's designated lead entity and collaborating partners need to adopt one comprehensive substance abuse plan.") are not easily determinable.  Also, in part, some of our recommendations provide the impetus to spend more money than has been spent in the past.  For example, we recently recommended that the Legislature consider raising the monthly stipend paid to foster parents (an amount that hasn't been increased in 25 years) in order to grow the ranks of foster parents, which in turn would provide alternatives that could help avoid costly and inappropriate institutional placements for children.

Other states sometimes tally up the percentage of recommendations that have been implemented and report that as a measure of their success.  We have not done that, and I hope to learn from the other responses to this month's question how that's done and whether it pays off for them.

 



Beth Ashcroft, Maine

Although OPEGA is just getting started (we’ve only released 3 reports so far), we have initiated a formal follow-up process that we feel will be critical to the value we add to the State.  It basically works like this:

Throughout the review, we discuss with agency management any findings/observations we have identified for which we recommend management action(s).  Our goal is to get management to commit to actions that we can include in OPEGA’s final report for the review.  For each action management agrees to take, we ask them to specify the individual who is responsible for taking that action as well as a date by which the action will be taken.

We log the specifics of each finding/observation, the related recommendations, and the management actions into a database we maintain.  We use the database to trigger follow-up based on the expected due date for each action.  Once the original due date has been reached, OPEGA sends a status update form to the person identified as the responsible manager.  The form reminds the manager of the finding/observation and the management action that was committed to.  It also has a bottom section that asks the manager to check off whether or not the action has been taken.  If the manager checks “Yes”, he or she is asked to provide supporting documentation showing that the action has been taken.  If the manager checks “No”, the manager is asked to report why the action has not been taken and to provide a new due date.  The manager is asked to sign and date the form and return it to OPEGA.

OPEGA uses the information on the form to update our findings database.  If we feel comfortable that action was indeed taken, we move the finding/observation status to “closed”.  If the action has not been taken, we log the manager’s explanation and establish a new due date, which will then trigger our next round of follow-up.

On a quarterly basis, OPEGA will provide our Government Oversight Committee a report on the status of all findings/observations that have been closed in that quarter or that are still open.  We expect that this report will also be provided to affected agencies, the Joint Standing Committees of jurisdiction for the affected agencies, and the Governor’s office.  We also expect that the Government Oversight Committee may call responsible managers in to address the Committee about long-standing open findings that do not seem to be getting resolved.

Depending on the nature of the finding/observation, OPEGA may decide to do a more thorough verification that an action management has reported taking was indeed implemented and is effective.  Also, whenever we do a future review in an agency, we will check our database to identify past findings and actions that should be revisited.

While most of the focus may be on management actions, OPEGA also plans to follow up on whether recommended legislative actions have been taken.  This may be a bit more difficult in terms of process, but our goal will be to keep these recommendations in the public eye until legislators decide whether or not to take any action on them.

 



James Barber, Mississippi

The Mississippi PEER Committee has a policy of conducting follow-ups of reports six months after the reports' official release.  PEER staff send a letter to the reviewed agency asking staff to describe specific actions taken by the agency to implement recommendations contained in the original report.  PEER staff present the agency's six-month follow-up response to the PEER Committee.  The Committee considers the agency's response and decides whether to accept the response as positive action on the part of the agency or to authorize additional work by PEER staff at the reviewed agency.  Typically, the PEER Committee accepts the agency response as presented and places the response in PEER files for distribution only upon request of a legislator or a citizen.  On a few occasions, the PEER Committee has conducted six-, twelve-, and eighteen-month follow-ups on some agencies because the Committee did not believe that the agency had taken positive steps to implement the Committee's recommendations.

 



Ashley Colvin, Virginia

By statute, JLARC is required to conduct a systematic follow-up of its work once each biennium.  The resulting product, known as the Report to the General Assembly, includes a summary of significant actions taken by executive agencies in response to reports and recommendations previously issued.  As part of this process, agencies are requested to file “status-of-action” reports on their efforts to address the Commission’s findings and recommendations.  Special follow-up studies are required in cases where the Commission has cited waste, extravagance, fraud, or misuse of public funds.

 



Gerald Schwandt, Michigan

It is the policy of the Office of the Auditor General that all prior audit findings and recommendations be followed up, in accordance with professional standards, at the time of the next audit of the same type (e.g., performance, financial, or financial-related) of the agency or program. When the audit frequency of an agency is expected to exceed four years, a follow-up of material audit findings and recommendations will be performed approximately six months after the date of compliance agreed to by the auditee.  Follow-up reviews of material findings and recommendations are published as self-standing public documents.  Follow-up reviews of other findings are reported in the subsequent report on the agency or program.

 



John Sylvia, West Virginia

West Virginia routinely conducts "updates" of previous audits to report on the agency's response to our recommendations.  Our sunset law was amended several years ago to mandate an update process.  We report to the Legislature whether an agency is in compliance, non-compliance, or planned compliance with each recommendation.  During an update we also have authority under the sunset law to do "further inquiry" if there is a need.  In other words, we can look at other areas in addition to the issues we are updating.

 



Sylvia Hensley, California

At the California Bureau of State Audits, we have a process for determining and tracking an auditee's progress in implementing our recommendations. Attached to the final audit report we send to the auditee is a letter requesting that the auditee report to us at 60 days, six months, and one year on its efforts to implement our recommendations.  We request that the auditee's response include a timetable for implementing our recommendations; the name of the person or persons who will be responsible for implementation; and the rules, memoranda, and other relevant materials that document the implementation of the recommendations or the steps taken to rectify those problems discussed in our report.

The project manager and the team lead are responsible for reviewing the response information the auditee provides.  This information is then used to determine the need for a follow-up review by the state auditor or, in some cases, the need for a committee hearing. The project manager and team lead are also responsible for evaluating and summarizing the auditee's progress in implementing our recommendations for the annual report we provide to the relevant legislative policy committees and budget subcommittees.  That report is issued at the beginning of both the legislative year and the budget cycle to facilitate legislative oversight of audited agencies.

 



Mike Paoni, Illinois

We follow up on all the recommendations made in our performance audits as part of our annual compliance examinations of each state agency.
 

Joel Alter, Minnesota

Our office has used sort of an ad hoc approach, rather than a rigid schedule, to follow up on the recommendations of past reports.  In 2005 (when the Legislature was approving biennial budgets for agencies), we put together one-page “impact updates” for legislators on selected past reports.  In these updates, we briefly summarized (1) problems our office identified in the original reports, (2) changes implemented since the reports were issued (either by the Legislature or the affected agencies), and (3) remaining issues that required legislative attention.  These updates were based on limited research by our staff (contacts with agencies, reviews of legislation, etc.), and we did not always have the time to verify the claims or data that agencies presented to us.  We distributed these impact updates to key committees that oversee the programs we had examined.  We also distributed a packet of these updates to our audit commission and to legislative committees that were considering our own budget.  Thus, these updates helped promote accountability and continued action regarding the programs we had examined, but they also helped demonstrate to legislators the impacts of our reports.

Over the years, our office has resisted the urge to keep “checklists” of recommendations we’ve made, or to compute the percentages of our recommendations that have been implemented.  We recognize that some recommendations in reports are more important than others, and we prefer to keep legislators focused on the “big picture” regarding the performance of an agency or program.

Occasionally, we ask agencies to provide us with written summaries of how they have responded to the recommendations in previous reports.  And, occasionally, we have initiated follow-up evaluations of programs we’ve previously reviewed.  But, if follow-ups would require significant time by our staff, we would probably seek approval of a “new” evaluation by our audit commission before proceeding.