NLPES Question of the Month

October-December 2005:

WHAT QUALITY CONTROL PROCEDURES DOES YOUR OFFICE USE TO ENSURE THE ACCURACY OF YOUR REPORTS?  (Who in your office reviews findings and drafts prior to release?  How do staff cross-reference statements in the report to the supporting evidence?  What changes have you made in the quality control process, and with what results?  Can the process be streamlined while still ensuring quality?)


John Sylvia, West Virginia

I was on the panel that discussed the issue of quality control during the 2005 NLPES fall conference.  It was a very interesting discussion with many states involved.  There appears to be some variance in quality control among states (some of which surprised me).  One state indicated that its referencing process takes up to three weeks, while in West Virginia it normally takes three days, and other states do not reference the report at all.  As I emphasized in my discussion, quality control does not occur at one point in the auditing process.  Our reports go through five stages of review: the research manager, the director, the legislative auditor, the referencing procedure, and the agency's advance review of the draft report.

Here are points I made during the discussion:

1)  The agency's advance review (Yellow Book standard) is very important in our quality control process.  (At least one state indicated that it does not seek the agency's review of the report.)

2)  Managers of an audit must show evidence of supervisory review of work papers (Yellow Book standard).

3)  An independent referencer verifies every statement of fact in the report by highlighting each statement with alternating colors and referencing it against the work papers.

4)  We do not have staff strictly devoted to referencing.  Every staff person at some point will reference a report.

5)  We do not reference the executive summary.

6)  We discourage separate background sections unless they are warranted; this may partially explain why referencing takes longer in other states than in West Virginia.

7)  The referencer issues a reference report that becomes part of the work papers.  Any discrepancies between the report and the work papers must be reconciled.
 



Phil Hopkins, Oregon

Your quality control process [in Arizona] seems very similar to ours.  We have independent referencers, and every date, amount, law, fact, and conclusion in the report must be referenced back to a working paper.  We feel this process is necessary, and we don't consider it too time consuming given the quality of the finished report.
An independent referencer could spend anywhere from a couple of hours to 20 or more, depending on the complexity of the report.  We use only our 16 senior auditors as independent referencers and make it a point to spread the work around evenly.  One possible advantage we have over some audit shops is electronic workpapers, which make the process a lot easier.  Nevertheless, attached are a couple of guidelines (not policy) we use for independent referencers.  The process is too complicated to explain fully by e-mail, but feel free to call me at 503/986-2304 for more details about our process.  (Editor’s note:  Phil attached the following items to his original e-mail:  a sample “Report Review Sheet,” a document called “Best Practices in Independent Referencing,” and a document called “Guidelines for Independent Referencers.”  We did not include these documents here, but you can obtain them by contacting Phil.)

Sharon Candon, Colorado  (sharron.candon@state.co.us)

While referencing is often a time-consuming and somewhat irksome process, the Arizona process sounds just like the process that GAO uses for quality control.  I worked at GAO until just recently, and every line of every report was traced back to an original source (either an agency-provided document, a record of observation and analysis, or an interview record) to ensure that the facts of the report are accurate.  In addition, for every numerical figure in the report, we "traced and verified" those numbers back to their sources, too.  At GAO, these two processes could be separated, so that the referencing part went a little faster.  An "inexperienced" auditor (who had at least 1 year of experience as an auditor) could do the tracing and verifying of the figures, so that when it came to the referencing part, the final figures were trustworthy.  The referencing part was done by an "experienced" auditor -- one with at least 1 year of experience and whose own work had been through at least one referencing, so that he/she was familiar with the standards.  About a year ago, GAO required all auditors to go through a "referencing refresher" course to be reminded of the required standards of evidence for factual information stated in GAO reports.

GAO was moving from a paper-based indexing system to an electronic version using the comment feature in Word. This allowed for editing of the report to continue while it was being indexed. However, it is then helpful to use "track changes" to ensure that referencing points are addressed satisfactorily.

Here in Colorado, the team leader and audit managers are responsible for ensuring the factual accuracy of the report, and the quality control process requires that an auditor outside the team trace all numerical figures to source documents or "to their comfort levels" to ensure the numerical accuracy of the reports.  This is basically similar to the GAO tracing and verifying process.
 



John Buyce, New York

I have attached a couple of sections from our audit manual, which I hope are helpful to you.  (Editor’s note:  These sections are not included here, but they can be obtained from the author.)  They describe our workpaper standards and our quality assurance process.  In short, our process can be summarized like this:

We don't require sentence-level referencing of the reports per se, but we do require that all substantive facts and conclusions be referenced back to the workpapers.  That means a single reference might suffice for an entire paragraph or two of discussion, if those facts and/or conclusions are all documented in the same place.

We also don't have an independent referencer verify each item.  We require a referenced report that has been reviewed and approved by the off-site audit supervisor.  Beyond that, we have a quality assurance staff consisting of two experienced in-charge auditors who work full-time on reviewing each report before it is issued in draft to the agency.  These auditors use a risk-based "triage" system to review each report and assess which areas, items, findings, or conclusions carry the greatest risk.  I think this is what Tom was referring to in his note.  This "risk" is assessed based on the extent to which the items affect the conclusions in the report, the complexity of the issues being discussed or the steps being performed, and our general past experience with the staff and/or team doing the work.

We also have different levels of review that we perform based on the type of work that is going out.  For example, full scope performance audits get the full triage review, while follow-up audits get a much lower level of scrutiny.  In the middle are our TAP audits, which are routine audits of financial aid payments based on standard statistical samples.  We have a tailored review for these repetitive assignments.  I've attached the standard checklists that we use in the QA process to basically accept or reject each type of report submitted for review.  If these basic elements are not present, we kick the report back to the team at that point.

Our review process is then to use our TeamMate program to create Coaching Notes directed to the audit supervisor to clarify or correct any items that the reviewers find unclear or inaccurate.  When all of those notes are addressed and cleared, the report is approved for release as a draft.  Teams are then responsible for notifying QA if there are "significant" changes between the draft and the final report versions.  If so, we will also review and sign off on those changes before the final is issued.

I'd be more than happy to discuss any of this with you on the phone if you like.  You can reach me at (518) 474-3271.
 



Jettie Sparks, Kentucky

Kentucky's Auditor of Public Accounts' Division of Performance Audit has the same quality control process.  We take the report draft and mark off the sections related to a specific workpaper.  If an entire paragraph came from the same workpaper, a line is drawn under the paragraph and the reference is provided out to the side.  In the beginning, the process took more time because we were zealous about requiring everything to be in a workpaper.  We have relaxed this some and allow "Auditor's Opinion" or "Auditor's Opinion based on ___ workpaper" to be used for sentences that are transitional or are meant to summarize our conclusions.  Before this, some reviewers wanted a workpaper reference for everything, and auditors were creating new workpapers just to document conclusions and recommendations.

Recently, we had our most experienced auditor review all of the "Auditor's Opinion" references to determine whether each was an allowable reference or whether more supporting references should be provided.  A less experienced auditor checked the sections referenced to actual workpapers.  We have also tried not to reference a draft until it is very close to final.  If the referencing is done on an early draft, we have to go back, determine where changes were made, and reference any new additions.
 



Jim Pellegrini, Montana

The process you are using looks a lot like our old referencing process.  Since we have gone to electronic working papers, using Excel and Word as our basic programs, the reviewer has hyperlinks that take them to the required working papers.  However, this is not the only change.

We are not referencing every line.  We are, in effect, using a risk-based system.  Not all recommendations and conclusions are independently traced back to supporting documentation.  Major areas are identified.  Background information is referenced generally rather than specifically.  The report reference does not have to go to a specific line in a working paper.  If the reference takes you to a summary, a conclusion, or an Audit Point Sheet (APS), that is fine; the summary, conclusion, or APS will have the necessary support in other working papers.  We decided that referencing should be sufficient to support the conclusion or finding, not to make things easier for a reviewer.

A person on the audit team (called the Project Oversight - not the independent reviewer) is responsible for ensuring that working papers are reviewed and referenced.  It is their responsibility to ensure that all findings and conclusions are supported.  The risk is placed on the team members, not on the independent reviewer.  The quality control system is embedded more in the team.  Report reviews (those terrible meetings where people ask you questions) also add to the QC process, which reduces the risk of unsupported findings.

We just had a contractor review three of our reports with the purpose of tracing objectives to conclusions and findings.  He started with the report and worked back to the support.  His recommendation was that we use an independent reviewer at the time the report goes to exit conference.  He also recommended we use an even more risk-based approach and not even select all reports for independent review - a concept that scares some auditors.  His point was that the internal review process is sufficient and the risk is low for some reports.  Then, on reports that have major issues, review the major issues, concentrating on a sufficient level of support.  He found that the referencing was made much easier by the hyperlinks.

I guess the bottom line was to increase our risk appetite and rely upon the QC process embedded in the team to ensure standards are met and sufficient support is available.

The following outlines the policies and procedures that guide our referencing.  You will note that we have taken a very general approach based on the Yellow Book and the expertise and experience of the audit team.  We do not even use the word risk, but rely on the term significant to guide the process.  The first two areas of guidance are presented in the Yellow Book under the standards for report quality and contents.

8.45:  Evidence included in audit reports should demonstrate the correctness and reasonableness of the matters reported.  Correct portrayal means describing accurately the scope and methodology and presenting findings and conclusions in a manner consistent with the scope of audit work.  The report also should not have errors in logic and reasoning.  One way to help ensure that the audit report meets these reporting standards is to use a quality control process such as referencing.  Referencing is a process in which an experienced auditor who is independent of the audit verifies that statements of fact, figures, and dates are correctly reported, that the findings are adequately supported by the documentation, and that the conclusions and recommendations flow logically from the support.  (Note:  Reading this guidance carefully, we began our approach of focusing on conclusions and recommendations and ensuring the support is there, even if we generally reference to the support.  The key is that it is factual and logical.)

8.13:  The audit report should provide selective background information to provide the context for the overall message and to help the reader understand the findings and significance of the issues discussed.  Appropriate background information may include information on how programs and operations work; the significance of programs and operations; a description of the audited entity's responsibility; an explanation of terms, organizational structure, and the statutory basis for the programs and operations.  (Note: Again, we used the guidance to focus on significance and concentrate on background material that will be used to help the reader understand the issues.  We usually have a general reference to the planning documents since they contain much of the information.)

From this guidance the following sections are included in our Audit Manual as they relate to performance audits.

• Results of each audit test must be taken to their logical end, i.e., a conclusion on the item tested. The conclusion must follow logically from the support. Referencing to various procedures used, the results of tests, and/or agency responses to questions can help support your conclusion.

• Each section of the final report should be referenced to the conclusions or other supporting working papers.  (Note: We have been general in our description.  We require that each section be referenced.  We are not implying that every line be referenced.)

• Ensure all plan/program steps are referenced or hyperlinked to appropriate working papers.  (Note: This does not refer to report referencing, but to ensuring that steps are completed.)

• The reference report file should contain (references) hyperlinks to supporting documentation for all significant findings, conclusions, and recommendations included in the report.  As with a referenced hard-copy report, it is the responsibility of the in-charge auditor to ensure all these links (references) function correctly prior to archiving the working papers.  (Note: With the use of electronic working papers we find it much easier to link (reference) the needed material to the report because the findings, etc., are linked to the working papers.  In fact, while writing the report the auditor can reference sections of the draft report to the working papers through the use of hyperlinks.  This facilitates easier referencing of the final document.)
 



Melissa Wenrich, Pennsylvania

The Pennsylvania Department of the Auditor General generally has the audit staff who worked on the engagement cross-reference the audit report contents to the audit documentation.  The audit supervisor and/or manager assigned to the audit then verify that the audit report is supported by adequate documentation.  We do not necessarily reference every line of the report, but will often reference paragraphs or an entire finding to supporting documentation (such as a summary memo).  We feel that our review process adequately ensures the level of evidence necessary to support the accuracy of the report content.
 



Jane Thesing, South Carolina

The South Carolina Legislative Audit Council quality control process has several components:

• The audit manager or supervisor works with the auditor on a draft finding until it is substantially complete.
• Although not required by policy, often within teams, drafts are circulated among all the auditors for comments (and to share knowledge of the developing report).
• Once the pieces of the report are put together to make a complete draft, this draft is reviewed by the other audit managers for content and readability, and by the LAC legal counsel for correctness of any legal citations or legal analysis.
• Although not required by policy, usually the audit team meets jointly to review the comments of managers and legal counsel and put together a revised draft.
• Concurrently with the managers’ review, auditors have their work reviewed for accuracy in a process called indexing and referencing.  Each fact in the report draft must be indexed to supporting evidence in the audit workpapers, and this indexing is reviewed (referenced) by another auditor not on the audit team, who makes points where the evidence needs clarification or factual statements are in error.  (In a separate process called supervisory review, all audit workpapers have already been reviewed by the auditor’s supervisor.)
• When all the findings have been referenced, the draft is then reviewed by the LAC director, who may suggest further edits.
• The draft approved by the director is sent to our governing board (the council) for their review.
• The same draft is sent to the audited agency for preliminary comments.
• The staff may make revisions based on the agency comments and prepare responses to the agency comments; these responses are also sent to the council.
• The council has a meeting to review and approve the report for publication.
• The final draft is resubmitted to the agency, which can prepare final comments that are published with the report.

We have not changed this process significantly in many years.  While it is time-consuming, we have observed that the reports keep getting better until they are published.  We devote a great deal of care to complying with audit standards and publishing reports that are factually correct and well written.
 



Rick Riggs, Kansas

In Kansas, we don't do a separate pre-publication referencing process.  Instead, as workpapers are completed, the audit supervisor reviews them to ensure they meet standards and that they document evidence that is sufficient, relevant, competent, etc.  The manager then reviews the workpapers, ensuring that the supervisor has done (and has provided evidence of having done) the appropriate review.  While the draft is being polished and formatted, the audit team cross-references the numbers and facts to the workpapers.  The supervisor then reviews and checks the cross-referencing.

After the report is made public, the workpapers are reviewed by the designated quality-control person, who fills out a copy of the NSAA peer review workpaper checklist.  The results of this review are used for training purposes, and as management information.

We've used this system for many years; it provides an acceptable level of quality assurance while still allowing us to meet the needs of our legislators more quickly.
 



Zeny Nace, Guam

Suggestions for streamlining the indexing and referencing process:

1. Aim for more concise reporting, thereby limiting the number of pages to index or cross-reference.
• Survey/fact sheet – limit to one page
• Planning/initiation memo – limit to one page
• Sampling plan – only needed when transaction testing (very limited samples) is necessary to support tentative findings generated during the completion of control risk assessments, internal control checklists, and questionnaires
• Survey report – limit to a maximum of 2 pages, including tentative findings and a conclusion on why it is a “go” or “no go” audit.

2. Limit the generation of report drafts to cut down on the review process.
• Quality Assurance Review Report (QAR) – original indexing by the preparer
• Clean Draft Report – use a clean QAR report to submit to the auditee for their responses.  (Note: an interim report may be generated for discussion with the auditee but discarded after finalizing any agreed-upon changes to the report.)

3. Cross-indexing between work papers and the report can be done not sentence by sentence but simply by using “R” to indicate that an item has been referenced to the report.  This eliminates redundancy with the cross-referencing already done within the final report draft.

4. Limit report contents to prescribed standards in the Yellow Book. This means that work papers need to be concise but still meet the standards for planning, objective, scope, methodology, testing criteria, and conclusion as appropriate.

5. Ensure that excerpted sources are documented only to the extent of the pages necessary to support the work paper.  This will avoid unnecessary reading by the reviewer.
 



Sylvia Hensley, California

At the California Bureau of State Audits, we employ a variety of procedures throughout the audit to ensure the accuracy of our reports.  For example, workpapers are subject to two levels of review: a detailed review by the lead auditor and a high-level review by the project manager.  In addition, executive management (the State Auditor and/or Chief Deputy State Auditor, and the Deputy State Auditor) meets with the audit team periodically during scoping and fieldwork to ensure, among other things, that the procedures and methodologies the team employs are adequate and that the team's interpretation of the audit results, as well as its proposed recommendations, are reasonable.  Executive management also reviews the report draft at various stages, including at a minimum the first draft, the draft that goes to the auditee for review and comment, and the final draft before issuance to the public, to ensure the facts are logically presented and the conclusions supported.  Additionally, the audit team briefs the auditees and seeks their input at various points in the audit process to ensure the team has all the facts and fully understands the issues, including the auditee's point of view.

Although all of the above processes are important to ensuring accuracy, the method we most associate with report accuracy is what we refer to as "risk review." Risk review takes place before the audit report is given to the auditee for review and comment. Once the draft report is ready, the team members index each sentence in the report to the supporting workpapers, drawing a red line at the end of the relevant report passage and recording the supporting workpaper's number in the margin next to the passage. Generally, it is the project manager's responsibility to make sure that the referenced workpapers provide adequate support. (Depending upon the size and complexity of the report, a project manager or supervisor who was not assigned to the audit may assist with risk review.)

If the project manager is not satisfied that the referenced workpaper provides adequate support for the statement in the report, he or she writes a note explaining why the referenced workpaper is not sufficient.  The audit team addresses each of these notes, which generally involves providing additional workpaper references, doing additional work, or changing the report.  Once satisfied that the team has addressed his/her concern, the project manager clears the note.  Once all notes have been cleared, risk review is complete.

Over the years we have made few changes to the risk review process.  In particular, the audit team has always indexed the report to the supporting workpapers, and the "risk reviewer" has checked the referenced workpapers to ensure they provide adequate support.  However, we have changed who acts as the risk reviewer.  In the past, we would assign a supervisor who was not involved with the audit to be the risk reviewer.  The perceived advantage was that someone independent of the audit team would be better able to spot errors or flaws in logic.  However, there were a number of disadvantages to this approach, including the time it took the risk reviewer to get up to speed on the issues and the adversarial, occasionally dysfunctional relationship that sometimes developed between the team and the risk reviewer.  We believe our current practice of having the project manager act as risk reviewer strikes a good balance.  The project manager is far enough removed from day-to-day supervision to be objective, yet close enough to have a good understanding of the audit issues.  We believe this change has saved time without compromising accuracy.
 


Keenan Konopaski, Washington

Though we do a series of supervisory reviews, one additional technique we use to review findings in Washington is what we call a whiteboard session.  Ideally, the audit analysts use this as a pre-writing brainstorming session.  The project team is expected to give an informal presentation on how they would describe their findings and their thoughts on recommendations.  All the other analysts in the office who are not on the project are expected to attend these sessions; they critique what the project team thinks it has found and question what type of evidence and information the draft findings are based on.
 



Craig Murray, Michigan

We do not require every single line in a report to be referenced, only statements presenting a key concept, a dollar amount or other numerical figure, or another important piece of evidence.  The audit supervisor has the primary responsibility for cross-referencing the report to the supporting working papers and ensuring that the working papers support the conclusions and findings of the report.  Our quality assurance unit will verify all information back to the working papers for findings that we report as material weaknesses.  For findings that are not material weaknesses, our QA unit won't tie back everything in the finding; rather, we allow them some discretion to verify only the items they consider key concepts for that particular finding.  Our QA unit is given 5 work days to read the report and verify the adequacy of the working papers.
 



Judy Randall, Minnesota

The Minnesota Office of the Legislative Auditor has several procedures in place to ensure the accuracy of our reports.  The first step occurs while we are conducting our research.  As we obtain (or create) documents through our research process, we label them and put them in a work paper index.  Through this process, every document receives a letter-number combination that uniquely identifies it.  This indexing system is critical when we verify and source the information in our final reports.  Also during the research phase, team members routinely verify each other's work, particularly with respect to manipulating and analyzing large data sets.

Near the end of the research phase, the project team (generally composed of two or three evaluators) drafts a “statement of findings.”  This document is a concise compilation of our major findings and supporting data.  The statement of findings is reviewed by the Legislative Auditor and an independent reviewer—an evaluator who is working on another project and provides a “fresh look” at the findings.  These same people review the draft of the final report.

Before a draft of the report is sent to the relevant agency to review (or at the very least, before the report is sent to the printer), the project team “sources” the report.  This involves identifying the exact work paper (and oftentimes the specific cell in a worksheet) from which an evaluator obtained the data.  (This is where the work paper index mentioned above comes into play.)  All tables, figures, numbers, and key findings identified in the report are sourced.  Some project teams source every sentence in a report, while others focus on the more critical pieces of information.  Finally, the sourcing for tables, figures, and key findings is verified by another member of the project team.

Over the past few years, we have more clearly labeled our work papers and source references.  Previously, we did not identify every work paper document individually.  Instead, we would label a folder that contained several work papers (e.g., all of the interviews with an agency).  Now, with each document containing its own index code, our verification process is more thorough and complete.
 

Editor’s note:  If you’d like to hear more comments on how agencies conduct internal reviews of their findings, this was one of the topics explored in a survey of NLPES member agencies in 2002.  Please see the following site:  http://www.ncsl.org/programs/nlpes/research/survey/internalreview02.htm