NLPES Newsletter

Summer 2013


 

Contents

Chair's Corner

Features

News & Blurbs

CHAIR'S CORNER

 

Chair's Corner
Karl Spock (Texas)


This Chair’s Corner will be my last. My year as chair ends at our NLPES Professional Development Seminar in Austin this Sept. 22-25, and Lisa Kieffer of Georgia’s Performance Audit Division takes over. (Suggesting to Lisa that she move into a leadership role on the NLPES executive committee has probably been my best idea as chair.)
 
These columns have the rare but beneficial effect of causing me to reflect, especially this last one, which brings up memories of beginnings and endings. I started doing program evaluation work way back in 1976 for the Texas Legislative Budget Board. Until shortly before then, program policy evaluation—at least in Texas and probably in many states—had not existed in an organized, ongoing fashion. Most legislative oversight occurred through budgeting or financial auditing. Now, in the later stages of my career, program policy evaluation has evolved into an essential aspect of legislative oversight in various organizational settings across the nation, as one look at the NLPES publication, Ensuring the Public Trust, indicates.
 
NLPES has matured along with program policy evaluation in its various forms. Today, our organization embraces program policy evaluation in all its organizational settings, and provides training, peer reviews, and networking to promote quality evaluation supporting the legislative institution. We have all of you and my predecessors on the NLPES executive committee to thank for arriving at this spot. The trip has not always been easy, as I can attest. We get wrapped up in our own states’ busy and demanding schedules, and we have to clear a space for supporting each other outside our required work. Ultimately, the effort is worthwhile. We learn from our shared experiences in our training sessions, and perhaps just as importantly, we also learn that we are not alone in wrestling with vexing problems facing our public institutions.
 
So where does program policy evaluation go from here, as state budgets hopefully improve and legislative environments change? The evolution of our profession and its future will be the subject of one of our panels at the Professional Development Seminar this September. Come join us to gain a perspective on that topic as well as various others, as reflected in our preliminary program. You will also get the opportunity to enjoy some great Texas barbeque and hospitality as you network with your peers across the country.
 
I have been privileged to serve as chair of NLPES this year, and to work with the exceptional membership you elected to the executive committee. Thanks to all of you for the opportunity. I hope to see many of you in Austin for the NLPES Professional Development Seminar. Until then!

 
Karl Spock
Chair, NLPES Executive Committee
July 30, 2013
 



FEATURES


Report Radar
Angus Maciver (Montana)

During these dog days of summer, many public servants will take a vacation, go to the beach, and relax a little, but not the intrepid auditors and evaluators of the National Legislative Program Evaluation Society! We relentlessly release audit and evaluation reports despite seasonal distractions! Our Report Radar summer road trip begins with a look at reports addressing transportation issues. Next up is regulation of the oil and gas industry. We will then turn our attention to some more regular, but no less interesting, issues, including taxes, education, and Medicaid. Let the road trip begin!
 
Transportation—Like Report Radar, you too have probably pondered who keeps track of all those roadside signs you see while driving the highways and byways. Now we have an answer courtesy of our colleagues in Colorado, who released a report in May looking at the state’s outdoor advertising program. This report is a great read and tells you everything you ever wanted to know about the regulation of roadside advertising and directional signage. You can’t take a road trip to Hawaii, but you can fly there. If you do, you will probably want to pack a copy of the Hawaii State Auditor’s report on contracting issues at the agency managing operations for the state’s airports. If you are staying in the middle part of the country this summer, you may find yourself driving through the state of Indiana, in which case you will benefit from reading an excellent evaluation report released in July by the Office of Fiscal and Management Analysis addressing management and financing of the state’s transportation infrastructure.
 
Oil and gas—Of course, none of these road trips would be possible without oil. As the country experiences a mini-boom in the production of oil and gas, attention is increasingly turning to the state regulators responsible for industry oversight. Three of our member offices have recently reported on regulation of the oil and gas industry. In April, our people in Louisiana released an audit report on the State Mineral and Energy Board, with specific emphasis on the state’s process for awarding leases for oil and gas exploration and mineral royalty rates. Our next offering is from the Texas State Auditor’s Office and addresses the Texas Railroad Commission, which actually doesn’t have much to do with railroads, but is the regulator of the state’s oil and gas industry. The report addresses the processes used to ensure oil and gas wells are plugged and decommissioned after production has ended. You may also be interested in a July report from Alabama, which looks at the state’s Oil and Gas Board. This is a sunset-type review and provides some good general information on the board’s operations and regulatory activities.
 
Taxes—Taxes are a perennial favorite with Report Radar, and we have three excellent reports to recommend. We begin in Idaho, where June saw the release of a report addressing comparisons between business tax policies. This report is presented in the form of a guide for legislators, designed to help them make determinations about how tax changes could affect business activity. The report is accompanied by a nifty web-based tool for evaluating links between tax proposals and related policy goals. If taxpayer compliance issues are keeping you up at night, you should have a look at a March report from the Michigan Office of the Auditor General, which addresses the state’s tax audit and recovery efforts. If property taxation is more your thing, then you should probably read a report issued in July by the Louisiana Legislative Auditor, which discusses state oversight of local property tax assessments.
 
Education—Moving on to education issues, we begin in South Carolina, with a report released in June addressing early childhood education efforts through a program called First Steps to School Readiness. As many states begin focusing more resources on pre-K education, we can expect to see more reports addressing similar issues. Looking more at the business end of education, we draw attention to a couple of nice products from Kentucky and Kansas. The Kentucky Office of Education Accountability released a report in December 2012 addressing the security of educational data and information systems, and in July the Kansas Legislative Division of Post Audit reported on the results of a survey of school districts on efforts to reduce costs and improve efficiency. We also commend for your attention a pair of reports looking at issues relating to special education services: a March report on special education services in Minnesota provides an excellent overview of the subject matter; and a July report from New Jersey has some interesting stuff on Medicaid reimbursement support for special education services.
 
Medicaid—Staying with Medicaid-related issues, we round out this exciting edition of Report Radar. Minnesota had a look at dental services provided through its Medicaid program in March and reported on payment methods and policies. Our friends at the North Carolina Program Evaluation Division also released a Medicaid report in March, which addressed the issue of creating a separate Medicaid department within state government. Finally, those fine people in Montana released a report in June reviewing the state’s efforts to identify and address prescription drug fraud within the Medicaid program.
 

So, maybe you will allow yourself a short vacation this summer? Here’s an idea: obtain one of these fine audit or evaluation reports and take it along with you. Reading it to the kids in the car is almost guaranteed to reduce conflict and boredom, and that cool beverage will taste so much better with a shot of accountability mixed in. Happy summer, auditors and evaluators!
 


State Profile—Legislative Audit Council
Marcia Lindsay (South Carolina)


Staff
 

We have 16 performance auditors and 3 administrative staff. Our admin staffers have worked at the LAC an average of 32 years – yes, they were all child prodigies!  One administrative staff person noted that when she started, she typed our reports on an IBM Selectric typewriter, whereas she now sends them electronically and posts them to the web.
 
Our audit staff is loaded with MPAs and JDs, with a few political science, art, business, public policy, history, accounting (CPA), Latin, and economics degrees thrown in for good measure. We like MPAs and JDs so much, we even hired one auditor who has both degrees!
 
Like our experienced admin staff, we have several auditors who have gone the distance with us – four auditors have worked with us an average of 25.5 years, the longest being our director and one of our audit managers at 29 years. The two were hired on the exact same day!
 
Half our staff was born right here in South Carolina, and we have pretty much drawn the rest from elsewhere in the Southeast, with a nod to Kansas and Indiana. Other answers to “In what state were you born?” included “A state of confusion” and “A state of despair.”  We wonder if they ever moved?

 
The worst places we’ve had to audit
 
We polled our staff and found the following:
 
  • USS Yorktown on a hot day. Most of the ship has no air conditioning. Not a pleasant place to be in Charleston in the summer. However, it was also one of the best places I’ve done fieldwork because, despite the heat, it was a history buff’s dream!
  • At one university, they put us in a windowless room in the basement of the administration building, with loud pipes running overhead, boiler equipment, and ceilings that looked like they were covered in asbestos.
  • Speaking of asbestos, at one agency, they kindly set up a table for us in the wing closed for asbestos removal!
 
The worst things ever said to an auditor by an auditee
 
  • “We can’t talk to you unless you submit all your questions in advance to the director of the agency.”
  • “You don’t have the authority to audit me…I don’t have to show you nut’in.”
  • “You have an agenda and are not independent.”
  • During an entry conference:  “It’s nice to meet you, when are you leaving?”
Why we love our jobs!

 
LAC auditors cited everything from enjoyable co-workers to flexibility to never knowing what program they will be auditing next. In one year, we audited our state’s floating naval museum, Medicaid managed care rates, forestry management, and employment and workforce. This year alone we dealt with voting machines; probation, parole, and pardon; First Steps to school readiness; and art!  As one auditor put it, “Our days are unpredictable, but that is part of the fun!”

 


 

Staff Spotlight
Matt Etzel (Kansas)

 
Since I joined the Legislative Division of Post Audit in Kansas, the question my family and friends have asked most often is, “How did you come to be an auditor?”  Their confusion over my newly discovered career likely stemmed from the fact that I never considered auditing as a profession. After graduating from the University of Kansas with a B.A. in sociology, I planned to spend a year studying for the graduate entrance exam to pursue a career as a sociologist. To pay for this year of studying, I realized I would need a job, and began searching for what I considered would be a temporary break from academia. What ensued was the experience that led me to become a performance auditor for the state of Kansas.
 
I stumbled across a job posting for a secretarial position at something called the Legislative Division of Post Audit (LPA). The duties seemed in line with some of my previous student jobs at the university, and the pay would support me during my year away from school. I applied, not knowing exactly what I was getting myself into. I was selected for an interview and sent a package containing a few audits and a brochure titled, “What We Do and How We Do It.”  That evening, I read through the material, trying to grasp what the division actually did. After learning some of the basics about the office, I thought it sounded like an interesting experience that could lend a little perspective to life outside of academia. So, fresh from my undergraduate graduation ceremony, I accepted the position of Office Secretary.
 
My administrative duties started in July of that year. After a short time as the division’s secretary, I began to realize that the component pieces of performance auditing were not all that different from many of the elements I enjoyed in sociology. Much of the work seemed driven by familiar qualitative and quantitative methodologies, and the focus on research and thesis-driven writing made me feel right at home. I realized performance auditing might actually be a career I would be interested in pursuing. At the time, however, performance auditing for the state required a master’s degree—something I didn’t have.
 
The following spring, the division lowered the education requirements to a bachelor’s degree to widen the field of suitable candidates. I thought, “Aha—I may actually have a shot at this.”  Sure enough, an auditor position opened up, and I was asked if I was interested in filling it. So, about a year after I began my “temporary” stint with LPA, I started my career as a performance auditor. I always knew that the secretarial position would be temporary; I just thought I would leave it to become a sociologist. Instead, I became an auditor. Some things you just can’t plan for.
 
Since becoming an auditor I have participated in several audits, each different and each keeping me on my toes. In the last two years I’ve examined the operations of the Kansas Neurological Institute, evaluated the appropriateness of mandatory nursing hours in nursing facilities, and worked to identify state surplus property. That may be the aspect I enjoy most:  each project requires me to take another step away from my comfort zone and develop new skills and knowledge so I can understand, evaluate, and report on programs I may have had little experience with before. I can’t say that three years ago I would have guessed performance auditing would become a career option for me, but I am certainly glad it did. I have been given a chance to work in an environment that fulfills my intellectual and professional ambitions—even if I did, well, fall into it.

 
Matt Etzel is a Performance Auditor with the Kansas Legislative Division of Post Audit.
 


Research and Methodology Profile:  Verifying Student Eligibility for the Free-Lunch Program
Katrin Osterhaus (Kansas)

 
In school year 2005-06, Kansas school districts identified about 135,000 students as eligible for free lunches and received almost $111 million in at-risk funds. Because state funding is tied to the free lunch count, we were asked to evaluate whether the free-lunch count accurately reflected the number of students eligible for the program.
 
To answer legislators’ concerns, we reviewed records for a random sample of 500 students determined eligible for free lunches during the 2005-06 school year. This large sample allowed us to project our findings to the state’s total population of free-lunch students with a 95 percent confidence level and precision level of +/- 5 percent. The size of the sample itself was ambitious for us, but it was the method we applied to the sample that made this audit truly unique.  
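For readers who like to check the math, the short Python sketch below shows the standard margin-of-error arithmetic for a sample proportion under simple random sampling. The worst-case proportion of 0.5, the finite population correction, and the function name are our own illustrative assumptions, not LPA's code.

```python
import math

def margin_of_error(n, population, p=0.5, z=1.96):
    """Approximate 95 percent margin of error for a sample proportion,
    with a finite population correction for the free-lunch population."""
    se = math.sqrt(p * (1 - p) / n)                        # standard error of the proportion
    fpc = math.sqrt((population - n) / (population - 1))   # finite population correction
    return z * se * fpc

# About 135,000 students were counted as free-lunch eligible in 2005-06.
print(round(margin_of_error(500, 135_000) * 100, 1))  # roughly 4.4 percent, within the +/- 5 percent cited above
```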
 
Students can qualify for free lunches by either meeting categorical qualifications (foster care, food stamps, homeless status, etc.) or income qualifications. Our sample contained 203 students who were ‘categorically eligible,’ 296 who were ‘income eligible,’ and one who fell into the sample but whose school district did not have an application on file.
 
We needed to design our work to encompass all possibilities that could lead to an ineligible student qualifying for free lunches. We also needed to ensure we didn’t create false positives—that is, asserting ineligibility when the student actually qualified. Complicating matters was the fact that federal law prevents school districts from requesting additional documentation of the income claimed on the free school lunch application. As a result, we created a decision tree to confirm categorical or income eligibility and used data from multiple state agencies to do our work.
 
For categorically eligible students, we confirmed their food stamp or foster care status with records at the Department of Social and Rehabilitation Services. For students who became categorically eligible due to a homeless, runaway, or migrant status, we checked independent school district and Department of Education records to confirm eligibility.   
 
For income-eligible students, we reviewed applicable tax returns at the Department of Revenue for each household member on the application. When the sum of members’ income exceeded the income threshold for free-lunch eligibility, we applied a secondary income test using Kansas Department of Labor (KDOL) records. We determined students to be ineligible only if household members’ KDOL total annual wages were higher than the eligibility threshold. Because free-lunch applications are made in September, we also allowed records to count as eligible when household members’ third quarter earnings were below the quarterly income limit.
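To make the income-side logic concrete, here is a minimal Python sketch of the decision rules just described. The function and argument names are hypothetical; the actual audit applied these checks through its decision tree and agency records, not code.

```python
def income_eligible(tax_return_income, kdol_annual_wages, kdol_q3_wages,
                    annual_limit, quarterly_limit):
    """Illustrative version of the income-eligibility checks described above.
    All inputs are household totals; limits come from the free-lunch guidelines."""
    # Primary test: total income reported on household members' tax returns.
    if tax_return_income <= annual_limit:
        return True
    # Secondary test: call the student ineligible only if KDOL annual wages
    # also exceed the eligibility threshold.
    if kdol_annual_wages <= annual_limit:
        return True
    # Applications are filed in September, so third-quarter earnings below
    # the quarterly limit also count as eligible.
    if kdol_q3_wages <= quarterly_limit:
        return True
    return False
```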
 
Our approach of cross-checking and triangulating income and program status across a number of departments helped uncover a large non-compliance rate in this program. We found that 85 of the 500 students in our random sample—17 percent—were not eligible for the National School Lunch Program. Specifically, four of the 16 students who claimed categorical eligibility through foster care did not actually have that status, and 80 of the 296 income-eligible students were ineligible, primarily because household members under-reported their income. Some household members’ income was left off the application, while others’ was understated. For example, one application recorded household income at $17,680, but we confirmed a combined household income of $57,057 when the household income eligibility limit was $20,917. The errors we identified cost the state an estimated $19 million.
 
Finally, just when we thought our fieldwork had ended, legislators at the presentation of our results questioned whether our random sample of 500 free-lunch records accurately represented the entire population of 135,000 free-lunch students. Over our lunch, we tested the sample’s reasonableness by comparing certain demographics (e.g., age, gender) in our sample to the entire free-lunch population. Although this was not part of the original audit work, we were able to report to legislators that afternoon that the proportions in our sample matched those in the free-lunch population within 1 percent.
 
Katrin Osterhaus is a Principal Auditor at the Kansas Legislative Division of Post Audit (LPA). Report Link

 

Research and Methodology Profile:  Vignettes, Anyone?
Joel Alter (Minnesota)

 
This article describes a research method used in a report titled “Child Protection Screening,” issued by the Minnesota Office of the Legislative Auditor in 2012. Staff on this project included Carrie Meyerhoff (project manager), KJ Starr, and Matt Schroeder. The report was one of two that received the NLPES award for Research Methodology in 2013.
 
One of the most fundamental and difficult responsibilities of government is ensuring the safety of children. Each day, child protection agencies receive calls alleging possible abuse and neglect. Although statutes and state guidance provide a framework for public agencies to “screen in” or “screen out” these calls (and later, for screened-in cases, to make determinations of whether child protection intervention is needed), there is room for interpretation.
 
Minnesota’s child protection system is administered by 84 county and 2 tribal agencies, with oversight by the state Department of Human Services (DHS). In 2011, the Minnesota Legislative Audit Commission asked us to evaluate child protection screening, partly because of concerns about variation in screening decisions.
 
To assess variation in screening practices, we analyzed screening data that child protection agencies had reported to DHS. But we found much variation in how agencies documented maltreatment allegations; in some cases, allegations were not documented at all. Among counties that had more extensive data, we conducted a regression analysis to explore factors that might explain screening variation. In the end, we concluded that the amount of missing data—even in data-rich counties—precluded us from reaching definitive conclusions.
 
Faced with significant limitations in existing data, we employed several alternative approaches. The most interesting of these was a “vignette survey,” an approach we had not previously used. We wrote a series of ten fictitious cases, each describing a referral to a child protection agency, based on dozens of actual cases we read.

For each vignette, we asked the child protection screening agencies around the state to indicate whether they would “screen in” or “screen out” the described case.  (If a case is screened in, it needs to be assessed or investigated to determine whether child protection intervention is needed.)  We also asked whether the agency would refer the families in “screened out” cases to voluntary services (not provided by the child protection agency) to address issues underlying the referral. All but one of Minnesota’s 86 screening agencies responded to the survey, indicating how they would handle the situations described in the vignettes.
 
Some vignettes were very frank. For example, one featured a mother who reportedly called her daughter a “fat bitch” and “loser” regularly; another described a father who threatened to “whup [his son] so bad you’ll wish you were dead” and shot the family dog in front of the son. The vignettes included other details for the agencies to consider, but each was intentionally brief. The idea was to see whether agencies would make different screening decisions based on identical information.
 
For some vignettes, agencies showed considerable agreement in their decisions. For instance, 82 percent said they would screen in a case where a new mother tested positive for marijuana shortly after giving birth and had told her doctor about her marijuana use during pregnancy. But in some cases, agencies were very split. In one vignette, 47 percent of agencies said they would screen the case in, while 53 percent said they would screen it out. We solicited written explanations from the agencies regarding their decisions, enabling us to better understand the factors the screeners considered. 
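As a simple illustration of how survey responses like these can be tallied, here is a short Python sketch that computes the share of agencies screening in each vignette. The data and agency names are made up for the example; this is not OLA's code or data.

```python
from collections import defaultdict

# Hypothetical responses: (agency, vignette number, decision) tuples,
# where the decision is "in" (screen in) or "out" (screen out).
responses = [
    ("Agency A", 1, "in"), ("Agency B", 1, "in"), ("Agency C", 1, "out"),
    ("Agency A", 2, "out"), ("Agency B", 2, "in"), ("Agency C", 2, "out"),
]

tallies = defaultdict(lambda: {"in": 0, "out": 0})
for agency, vignette, decision in responses:
    tallies[vignette][decision] += 1

for vignette, counts in sorted(tallies.items()):
    total = counts["in"] + counts["out"]
    pct_in = 100 * counts["in"] / total
    print(f"Vignette {vignette}: {pct_in:.0f}% of agencies would screen in")
```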
 
We supplemented the information obtained from vignettes with other sources of information.  We visited 11 screening agencies, often observing screening decisions as they were made.  We also conducted statewide surveys of mandated reporters, child protection screeners, and county and tribal human services directors.  In addition, we interviewed a variety of state and local officials, and analyzed administrative data, with all its limitations.
 
Interestingly, after we completed our vignette survey, the vignettes became a springboard for discussions among child protection agencies about screening practices.  For example, the vignettes were discussed at regional meetings of county child protection officials, and the human services director of one county told our staff he intended to use the vignettes as a learning tool for his elected county board.
 
The evaluation concluded that child protection staff generally make screening decisions in a reasonable and deliberative manner, but it confirmed variation in agencies’ decisions.  Comments we received from agencies about the vignettes helped us identify reasons for the variations and parts of statutes or state guidelines needing additional clarity. 
 
Use of the vignette survey was just one component of a multi-pronged evaluation.  However, using a set of vignettes with identical descriptions enabled us to assess screening practices with a degree of control that would not have been possible with administrative data alone.
 
Joel Alter is a program evaluation coordinator for the Minnesota Office of the Legislative Auditor (OLA).  Information on the vignettes is contained in an appendix to the report.
 



NEWS & BLURBS

 

NCSL Summit & NLPES Executive Committee News
Rachel Hibbard (Hawai‘i)

 
Thousands of legislators and legislative staff from around the country gathered in steamy Atlanta from Aug. 12-15 for this year’s Legislative Summit.  Hundreds of sessions were held over four days, including plenary sessions featuring Sandra Day O’Connor, Roger Ferguson, Beth Ann Bovino, and David Gergen. Legislative staff were engaged and motivated by Phillip Boyle in a rousing and thought-provoking afternoon session, and honored at a revival of the annual Legislative Staff Achievement Luncheon hosted by outgoing NCSL staff chair Patsy Spaw. Social highlights were the annual Walk for Wellness, the NLPES Dutch treat dinner (held at Italian restaurant Azio Downtown), the Bipartisan Bike Ride, and the “Whale of an Evening” reception at the Georgia Aquarium (one of the largest in the world). Southern food and hospitality permeated the conference, and we all came away fatter and happier than we arrived! Thank you, Georgia!
 
The NLPES executive committee also met during the summit to go over its work since the last meeting in April and prepare for a change of officers at its next meeting, which will be at the NLPES Professional Development Seminar in Austin in September.
 
Presentation clips and materials from the summit are available on NCSL’s website.

 

 


2013 NLPES Awards—And the Winners Are…
Wayne Kidd (Utah)

 
To recognize exceptional performance among our offices, NLPES offers awards in four categories:
 

  • Excellence in Evaluation Award—awarded to the office determined to have contributed the most to the field of program evaluation during the period Jan. 1, 2009 to Dec. 31, 2012. And the winner is… (long pause, drum roll)… the Colorado Office of the State Auditor (pictured).
  • Excellence in Research Methods Award—awarded to offices that demonstrated outstanding use of evaluation research methods in a report released during 2012. And the winners are… (opening two envelopes)… the Minnesota Office of the Legislative Auditor and the Washington Joint Legislative Audit and Review Committee.
  • Outstanding Achievement Award—awarded to an individual who has made outstanding contributions to the field of legislative program evaluation. And the winner is… (drum roll)… Marion M. Higa, State Auditor (retired), Hawai‘i Office of the Auditor.
  • Certificates of Impact—awarded to offices that released a report in 2011 or 2012 with documented public policy impacts. And the winners are… (deep breath!)… Arizona, Arkansas, California, Colorado, Connecticut, Georgia, Hawai‘i, Idaho, Illinois, Kansas, Louisiana, Maine, Michigan, Minnesota, Mississippi, Montana, New Mexico, North Carolina, Tennessee, Texas (State Auditor), Texas (Sunset Commission), Utah, Washington, and Wisconsin.

 Congratulations to all the winners!

 


 
Staff Happenings
 
Shan Hays retired from the Arizona Office of the Auditor General in June after 25 years of outstanding service.  Shan began her employment with the Auditor General’s Office as an entry level auditor in 1988.  She was a manager in the Performance Audit Division for 17 years.  Her invaluable contributions and masterful effect on audits and on the people she worked with will be greatly missed! Shan looks forward to traveling and plans to try her hand at writing a book. 
 
Please let us know if you have staff happenings to share!  Email Rachel Hibbard at rhibbard@auditor.state.hi.us

 


Graphic of "Check it out" verbiage
Online training from the NLPES library: Survey Refresher—With session adjourned, many of us have begun fieldwork for new projects. Often, surveys are the best way to collect information that is not readily available from other sources. Visit the NLPES online training library and click on the Survey Refresher PowerPoint for a quick reminder about how to write, format, administer, and interpret surveys. Check it out!
 
NLPES website—Learn more about NLPES and see what we do—spend a few moments touring our NLPES website. You’ll find general information about NLPES, including by-laws, executive committee membership and subcommittees, state contacts, awards, and information on peer review. We also have a training library and document resources, including past meetings, minutes, awards, newsletters, and more.  Come on by and visit for a while!
 
NLPES listserv—Join our NLPES listserv and:

  • Get our NLPES Newsletter delivered directly to your email box

  • Query other states about evaluation work similar to what you’ve just been assigned

  • Receive email announcements about performance evaluation reports from other states!

 
Simply send an email to nlpes-l-request@ncsl.org with the subject line blank and the word “SUBSCRIBE” in the body.  You will receive a welcome message if your subscription is successful (be patient; the response may take a little while). See the listserv link on the NLPES website for additional information, such as how to post messages to the listserv and “netiquette” niceties, like not hitting “Reply All” when answering a query posted to the listserv. You’ll be glad you joined!
 
Legislative careers website—Know a young professional thinking about pursuing a career with a state legislature? Point them to the opportunities posted on NCSL’s legislative careers website. Job seekers can explore the various types of work legislative staffers perform, including performance evaluation, budgeting, fiscal analysis, legal and policy research and opinions, bill drafting, public relations, library services, building security, and information technology support. Opportunities are posted by states offering positions under Legislative Jobs.
 
Launched by NCSL in June 2012, this is a great website.  According to NCSL, attracting young people to work as legislative staff will be increasingly important in the coming years.  And even though baby boomers make up about a third of the national workforce, nearly half of legislative staff responding to a survey were 50 years old or older. Replacing those staffers will present challenges. 
 
Check out the site’s welcome video, “A Day at the Capitol,” and learn more about the opportunities and rewards of working for state legislatures. Watch the videos under Career Paths to hear from staffers in different states describing their jobs and the satisfaction they gain from a public service career. 

 


 
graphic of "upcoming events"Upcoming Events
NLPES Professional Development Seminar—The PDS will be held in Austin, Tex. at the DoubleTree Suites by Hilton Hotel—Austin from Sept. 23–25.  See the NCSL website for details. 
 
NLPES hosts the only national training event designed exclusively for state legislative staff who work in program evaluation. This year’s seminar will offer professional development opportunities for program evaluators of all skill levels. Don’t miss this opportunity to meet your colleagues and hear about the innovative research and analytical work program evaluation offices around the country are doing. We hope to see you in the Lone Star State!
 
Look for the Fall 2013 edition of the NLPES Newsletter in November!

 


 

NLPES-NEWS is published three times a year by the National Legislative Program Evaluation Society, a staff section of the National Conference of State Legislatures.  NLPES serves the professionals of state legislative agencies engaged in government program evaluation.  The purposes of NLPES are to promote the art and science of legislative program evaluation; to enhance professionalism and training in legislative program evaluation; and to promote the exchange of ideas and information about legislative program evaluation.
 
2012-2013 NLPES Communications Subcommittee:
Dale Carlson (CA)
Charles Sallee (NM)
Rachel Hibbard, newsletter editor (HI)
 
NCSL Liaison to NLPES:
Brenda Erickson, (303) 856-1391
National Conference of State Legislatures • 7700 East First Place • Denver, Colorado 80230 • (303) 364-7700