NLPES Question of the Month

August-October 2003

WHAT'S YOUR OPINION ABOUT REQUIRING JOB APPLICANTS TO COMPLETE A WRITING EXAM OR WRITING EXERCISE? (Does your office do this? If so, please describe your practices and what your experience has been. If not, please indicate why you haven't used writing exercises.)

Rick Riggs, Kansas

We've used a writing exercise for many years. Each candidate who is invited in for an interview is first given a one-hour exercise consisting of 3 parts. All 3 parts are based on an actual audit we did a few years ago, but the information has been heavily fictionalized.

Part 1 is an "interview summary" with the executive director of the state's cosmetology regulatory agency. At the end, the candidate is asked a series of critical-thinking questions like, "Did any of the Executive Director's comments raise red flags in your mind? Did anything she said suggest possible lines for further investigation?"

Part 2 is a table of inspection data dealing with inspection regions, number of hair salons in each region, how many inspections were completed, etc. The candidate is asked to write a couple of paragraphs explaining what the table shows, demonstrating analysis and writing ability.

Part 3 is a series of spreadsheets dealing with timeliness of license renewals and inspections. The candidate is asked to answer a series of questions that demonstrate spreadsheet proficiency and ability to analyze data.

The exercise is timed and cut off at 60 minutes. The interviewers are given a printout of the exercises and an answer sheet before going into the interview.

I'm not sure how we'd measure a candidate's writing ability if we didn't do this type of exercise. Asking candidates for a writing sample has the obvious drawback that we don't know its provenance. 


John Sylvia, West Virginia

West Virginia requires applicants who are called in for a second interview to write a response to an audit question. The hypothetical situation asks the applicant to identify potential causes and effects and to list recommendations.


Barbara Rogers, Wyoming

We want to get multiple views of the writing and analytical abilities of our applicants. First, we ask them to submit a short sample of analytical writing with their resume, and the variety we receive is enormous: from half a page to a full thesis; from descriptive and procedural to highly abstract. Recognizing that these samples may have been carefully edited, or even ghosted, by another party, we require semi-finalists and finalists to do more writing during the interview process.

Before the first interview, applicants are given time to write one to three paragraphs about the challenges of research. We want to see them demonstrate an ability to analyze a general problem by breaking it down into sub-components, and then possibly propose some reasonable ways to overcome the challenges they've identified. We also look for the obvious: clear expression of ideas and good grammar.

Before the second interview, they spend an hour on two written exercises, both based on language and concepts from our recent reports. The first presents a page and a half of summary information and asks the applicant to discuss five questions related to that material. This exercise calls for deductive reasoning as well as an ability to see through a (small) smokescreen. The second exercise is a table of data; applicants must make inferences from the data and also state how they would improve the table title and headings to more accurately describe the contents. This exercise tests their ability to see relationships and patterns within a group of figures, and also asks them to assess what may be unclear or misleading about how those numbers are presented.

We're willing to put this amount of effort into preparing and reviewing the exercises because we've learned from the school of hard knocks that a successful face-to-face interview and a couple of good references do not necessarily add up to a good evaluator. All the charm, good grades, past job successes, and fine intentions in the world cannot substitute for a good old-fashioned ability to think critically and express oneself clearly. Our process is no guarantee of a good fit, but I think it is helping us screen out the more obvious misfits. I'm eager to read the ideas from other states because they'll help us continue to fine-tune what we've already got going.


Maria Chun, Hawaii

We have found the writing exercise to be extremely helpful. We have applicants complete a written assessment of a case study on-site. We provide them with a computer and a printer, and they have one hour to complete the case study. We used to allow applicants a week to complete a case study, but found that we were unable to get a true sense of the person's writing skills. Quite often, people clearly received help from someone else, sometimes extensively so.


Ethel Detch, Tennessee

Our office requires the submission of a written, analytical document, such as a research paper, as part of the application, but we don't require a writing exam. We have considered developing one, but haven't so far.


Byron Brown, Florida

OPPAGA has long required applicants to complete a work exercise as part of the application process. We find the work exercise provides a necessary piece of information that generally confirms our opinions of a candidate or, in some cases, causes us to think about a candidate differently than our initial impressions suggested.

After our initial screening determines whether an applicant has the background and experience that might be a match with our needs, we invite the applicant into our offices for a 40-minute presentation about OPPAGA and to complete a 90-minute work exercise. We believe this combination is critical; the presentation gives applicants a chance to get to know us and what we are about, and helps get them thinking about the kind of work we do.

The work exercise is a 4-page case study, adapted from a real-life example of a program we reviewed. We use a topic that is neither technical nor controversial, thus avoiding areas where an applicant may have prior knowledge, preconceived notions, or strong convictions.

We set the applicant up in a vacant office with a laptop computer and ask that they prepare the response on the computer in Microsoft Word. There are five questions related to the case study that test different dimensions of the applicant's thinking and skills. For example, there are brainstorming and methodology questions. There is a question asking the applicant to interpret some simple data in a table, and another asking the applicant to construct a simple table (they can construct the table in Word, Excel, or by hand). Finally, there is an open-ended question designed to give them an opportunity to briefly show their writing skill (grammar and logic). The overall process of completing the work exercise also gives us a good idea of whether they can organize their work to get it done within a limited time, and whether they follow directions.

The writing exercise is mainly valuable as a second opinion on the impressions we form while reviewing a person's application package or interviewing the person. A person with minimal training and experience who does a mediocre job on the exercise would be eliminated from further consideration. However, if an applicant performs much better on the work exercise than we would have expected based on their experience and training, we will look very closely at that applicant. When a person with relevant training and experience performs worse than we would have expected, it does not eliminate the applicant from consideration, but again causes us to look closely.

We use the work exercise as an alternative to any kind of writing sample. We found no value in writing samples, since we have no control over the conditions under which they are completed. For a period of time, we tried a group exercise format, in which we would invite 5 or 6 applicants to come in at the same time and discuss a case study together. We abandoned this idea after we realized how much variance could be created by the particular mix of individuals in each group.


Sandy Ronayne, Colorado

The Colorado Office of the State Auditor has used written exams for applicants for many years, though its exam practices have changed. Many years ago, the Office used long multiple-choice exams as a screening tool, and oral exams were also structured. Several years ago, the Office changed to requiring some applicants to complete a written, take-home case study after their initial interviews. Recently, the Office has developed two sets of exams (one for performance audit positions and one for financial audit positions) for applicants to take before they are interviewed. The exams are given in the Office; applicants have one hour to complete the questions. If the applicant receives a passing grade on the exam, he or she is invited in for an interview. Interview questions are also becoming more structured and, we hope, more consistent among the interviewers.

By using the standard written exams, we hope to be able to better identify applicants who have basic analytical and writing skills and will be successful in performing audit work. 


Frank Luera, California Bureau of State Audits

Our writing assessment is among the fundamental components of our overall hiring process. Because our reports are the bureau's primary products, we look for candidates who possess strong writing skills; it therefore makes sense that we test for these skills.

Currently, we assess only candidates who are successful in an initial oral interview. We issue these candidates a password so that they can complete the writing assessment online. They are given two hours to complete the assessment, starting from the time they open the instructions. We also allow candidates to complete the assessment by fax or at our office if they prefer.

The writing assessment instructs candidates to analyze a particular audit situation/scenario. To provide the candidates with an understanding of the assessment's objectives, we give them an overview of the characteristics of a strong essay. These characteristics are the basis for the scoring criteria.

After the candidates submit their assessments, we have professional editors score them. The editors are part of a pool we maintain to edit our reports. Because they are familiar with our writing style, the editors can assess the strengths and weaknesses of a candidate's writing skills. The management team then uses the editors' evaluations in considering which candidates to invite for a final interview. Although there are other considerations, the writing assessment is an important factor in the final hiring decision.


Karen Latta, Texas Sunset Commission

At the Texas Sunset Commission, we require all applicants that we interview to complete a writing exercise. We have found the exercise to be very useful in our selection process. After the interview, we put the applicant in front of a computer with the writing exercise and give him or her one hour to complete it. The exercise, which we developed several years ago, is intended to evaluate an applicant's ability to make a recommendation and support it based on raw information about a specific topic and to complete a task in a short amount of time. The exercise also tests an applicant's raw, unedited writing ability. We find the exercise more useful than requiring writing samples from applicants because writing samples are often heavily edited by other people and may not represent a person's true ability. 


Perry Simpson, South Carolina

We do have applicants complete a writing exercise, one we obtained from the Arizona Auditor General's office. We allow applicants an hour to review data and develop conclusions and recommendations based on the data. We set up a computer and an office for the applicants to use. The exercise includes a reviewer's evaluation form, which is completed by our audit managers. We have found the exercise useful in getting an idea of an applicant's analytical skills and ability to organize information. We would be happy to provide a copy of the exercise should anybody need one.


Sharon Robinson, Louisiana

Yes, we do administer a writing exercise following a one-hour interview. Each candidate for a performance audit position is given an audit scenario and six questions that relate to that scenario. They are allowed one hour to handwrite their responses. We use the writing exercises to gauge the following:

  • Basic writing skills (grammar, sentence structure, punctuation, etc.)
  • Logic/reasoning skills
  • Ability to organize thoughts and data in a meaningful way on paper
  • Comprehension of basic performance issues

We do not assign a grade to the exercises; we simply make note of the above items and consider them along with our impressions gained during the interview. 


Dennis Wilson, Texas State Auditor's Office

The Texas State Auditor's Office (SAO) has used a Writing Sample assessment in the auditor hiring process for approximately 4 years. We have found the inclusion of this assessment to provide valuable information in our hiring decisions.

The SAO assessment asks candidates to write a summary analysis of the SAO interview/selection process. Candidates are rated based on:

  • Analysis - Did the candidate take observations from the interview process and consider the interconnections and relations between these observations? Were multiple concepts combined to form original thoughts? Did the candidate try to explain the logic behind the process rather than simply describing its individual steps?
  • Organization - Was the writing sample easy to follow? Did it have a clear introduction, body, and conclusion? Were ideas expressed in a clear, easy-to-follow format?
  • Accuracy - Did the candidate accurately recall names, spelling of names, duration of exercises, office numbers, information from the job preview, interview questions, etc., that were part of the interview process?
  • Sentence Structure, Grammar, and Word Use - Did the candidate structure sentences according to standard conventions of grammar and punctuation? Based on the context of the writing sample, did the candidate make appropriate word choices?

Candidates are given 30 minutes to prepare a writing sample that is no longer than two double-spaced, typed pages. They may refer to any notes or materials acquired during the interview process. 


Tricia Bishop, Virginia

Because writing is a core and essential job function for research analysts, the Joint Legislative Audit and Review Commission requires all applicants to complete a writing exercise, which is administered on-site. During the short (approximately 45-minute) exercise, applicants are asked to respond to a general question. In reviewing responses to the question, JLARC looks for proper grammar, logic, organization, and appropriate mechanics.


Kim Hildebrand, Arizona

Our office is in the process of reviewing and revising our recruiting process, but here is what we currently do. We require the applicant to provide no more than one page each describing their experience in (a) research and analysis, (b) teamwork, and (c) writing. This gives us some sense of their writing skills. We also ask them to submit a writing sample of no more than 10 pages, preferably something that takes a complicated topic and presents it in an easy-to-understand manner. However, we are considering eliminating the 10-page writing sample since we really don't know whether the applicant is the one who actually wrote it.

We also have two case exercises that applicants are required to complete at home and send back to us. One case provides limited information on a specific state agency and asks the applicant to describe whom they would talk to, what records they would review, and what analysis they would conduct to evaluate the agency's performance. The second is our data analysis exercise. In this scenario, the applicant is given some background information on a topic of interest to the legislature, along with 12 separate pieces of information that have been gathered in relation to the topic. The applicant is asked to analyze the information and describe what messages it presents.

Currently, we are reviewing our entire recruiting process: how many interviews we hold, how many exercises we want to require of an applicant, when these should be given in the process, and whether we should bring them in-house rather than allowing the applicant to do them at home. We are leaning toward having the applicant do the exercise in-house, but cutting the exercises down from two to one. We also don't currently incorporate the exercises into our interview process, but we are considering that option as well; for example, we might ask candidates why they would go about something the way they described, or how they specifically arrived at the "message" they identified.


Joel Alter, Minnesota

We have not used a writing exercise as part of our process for screening job applicants. We have typically asked people we've interviewed to leave us with a sample of something they've written, and we recognize that there are limitations to using these samples as a measure of applicants' writing skills.

I think we would be open to considering the types of writing exercises used by some other NLPES offices, but perhaps the logistics have seemed a bit complicated: adding 30 to 60 minutes to the time required for the interview(s), finding workspaces for the applicants to complete the exercise, managing overlapping schedules (some applicants being interviewed while others are doing the writing exercise), etc.

And, in general, we've been pleased with the people we've hired through our present process. Our process has consisted primarily of interviews of applicants by groups of our staff, and typically there are multiple interviews of an applicant before we make an offer. Our hope has been that the applicant's analytical skills and personality will emerge in the interviews, even if some of the technical skills (including writing) are not demonstrated first-hand.