Chair's Corner
All hands! | Jon Courtney (New Mexico)
What a difference a few months and a pandemic make. Some of you are back at work, but I’m sure many of you are reading this while still working from your home office. I am working from my makeshift home office this morning (I have been using an old piano as my desk), while my daughter is in the next room talking with her fifth-grade class over Zoom. Life is different; there is just no getting around it. Many people are understandably in a somber mood, and the typical office camaraderie that might lift our spirits is hard to find. But this is not a time to be downhearted; rather, I see it as a call to action for our field.
What are you seeing from your desk, wherever it might be today? Government operations and schooling have been disrupted, revenues are falling, budgets are being cut, and conferences, including our own PDS, are being postponed or canceled. Right now, your job is more important than ever. We all need to ensure that programs are still working, that services are still being delivered, and that our resources, now more precious than ever, are not subject to fraud, waste or abuse. We also need to share good ideas and insights as we face the new challenges that COVID brings to government operations. This might look different in each of our states. Let me tell you a bit about how we are responding to the pandemic in New Mexico and at the Executive Committee.
In New Mexico, we have pivoted much of our work plan to shorter, rapid-response evaluations on topics often related to COVID. We are calling these reports “Policy Spotlights.” These projects require less fieldwork while still allowing us to provide policymakers with timely, useful information on topics including learning loss and returning to school, safely reopening the economy, job losses and resources for recovery, and broadband availability, among others. We also identified a number of issues with emergency procurements at the onset of the pandemic and have worked with multiple agencies (including our executive agencies and elected state auditor) to issue an alert on emergency procurements and to form a COVID Accountability Workgroup where risks, best practices and lessons learned can be shared. We have also been sharing these experiences through the listserv and a recent NLPES information-sharing call.
Speaking of NLPES…
In our NLPES Executive Committee, we have been working to identify and provide information on professional development opportunities in the absence of our PDS this year. Keep an eye out for continued emails about these opportunities; there are several good trainings scheduled over the next few months. We recently held a successful information-sharing Zoom call on COVID-related matters, where a number of topics were discussed, including the challenges our offices are facing due to the pandemic. We are also working to participate in NCSL’s Staff Hub, about which you should see more information soon. Additionally, we plan to hold several more Zoom calls and webinars in the coming months. Finally, we are in the early stages of putting together a work group for methodologists. Look for more information on that soon.
As my year as NLPES chair comes to a close, I am filled with hope and appreciation. We are living in challenging times, but I am seeing our member offices rise to the challenge. If ever there was a time when accountability, research and data were needed to guide our policymaking and decision-making, that time is now. Our research, our presentations and our reports can help bring some certainty to a world where so much seems uncertain. As for me, I am spending the rest of my day preparing for a committee meeting in beautiful Red River, N.M., later this week, where our members will continue to address the needs of our state as best they can, using the most up-to-date, objective information our staff can provide. I wish each of you the best of luck in doing the same.
Jon Courtney is the 2019–2020 NLPES Executive Committee chair. He can be reached by email or at (505) 986-4539.
Research and Methodology
Excellence in Research Methods | Casey Radostitz and Joshua Karas (Washington)
Two offices received the Excellence in Research Methods Award in 2020—Washington Joint Legislative Audit and Review Committee and New Mexico Legislative Finance Committee. This is the first of a two-part review of the winning reports and the methodologies applied by each office.
The Legislature directed Washington JLARC staff to evaluate the Washington State Opportunity Scholarship (WSOS), which provides up to $22,500 over five years to low- and middle-income students pursuing bachelor's degrees in the high-demand science, technology, engineering and mathematics (STEM) and health care fields. To do this, we sought to quantitatively compare the outcomes of Opportunity Scholars with other students who did not receive the scholarship. We considered several approaches for developing a comparison group. For example, we could have compared Opportunity Scholars to all other college students, or all students receiving financial aid. However, the scholarship targets low- and middle-income students pursuing specific majors, and literature indicates that outcomes may differ based on these income and degree factors.
Ultimately, comparing Opportunity Scholars to other students who met the six WSOS eligibility requirements but did not receive scholarships provided the most robust evaluation. Identifying students who met the first five requirements was straightforward. The data we collected had all necessary information about residency, high school graduation, family income, GPA, and financial aid applications. Identifying students who met the sixth WSOS eligibility requirement—majoring in STEM or health care—was more challenging, as shown in the table.
| Option | Challenge |
| --- | --- |
| Use recorded major | Incomplete or outdated data; many students have not yet declared a major |
| Use number of STEM or health care classes taken in first year | General education requirements include introductory STEM classes that are closely associated with non-STEM degrees |
| Use class data from STEM graduates to predict majors of current underclassmen | Requires modeling and machine learning of classes for over 110,000 individuals (13.7 million records) |
We opted for the machine learning approach because it would most accurately allow us to identify students who met the eligibility requirements but did not receive this scholarship.
Identifying graduates and grouping classes
We started with existing data that includes information about Washington’s college students and wrote R scripts to do the following:
- Flag graduates as STEM or not STEM based on WSOS’s list of approved majors.
- Merge the STEM flag into student class-level data. We cleaned and standardized class title names across institutions, and then calculated the number of STEM and non-STEM graduates who had taken each class during their first year.
* As an example: 93% of graduates who took electromagnetism in their first year graduated with a STEM degree. However, only 29% of graduates who took general psychology in their first year earned a STEM degree.
* This means that electromagnetism would likely be a good indicator that a student is pursuing a STEM degree, whereas general psychology would be a poor indicator.
- Group classes with k-means clustering, an unsupervised machine learning algorithm, based on the percentage of graduates who took each class and earned a STEM or health care degree. We determined that four groups were sufficient to minimize the total within-group sum of squares. Each group consists of classes with a similar proportion of STEM graduates. (A rough sketch of this step follows the list.)
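To make the clustering step concrete, here is a minimal sketch of what such an R script might look like. It assumes a data frame of first-year class records; the names (first_year_classes, class_title, stem_flag) are hypothetical stand-ins, not JLARC’s actual code.

```r
# Hypothetical sketch of the clustering step -- not JLARC's actual script.
library(dplyr)

# first_year_classes: one row per graduate-class pair, with stem_flag
# (1 = earned a STEM or health care degree) merged in from the graduate data.
class_summary <- first_year_classes %>%
  group_by(class_title) %>%
  summarize(pct_stem = mean(stem_flag), .groups = "drop")

# Compare the total within-group sum of squares for candidate group counts.
set.seed(42)  # k-means uses random starting centers
wss <- sapply(1:8, function(k) {
  kmeans(class_summary$pct_stem, centers = k, nstart = 25)$tot.withinss
})

# Four groups proved sufficient; assign each class to a group.
clusters <- kmeans(class_summary$pct_stem, centers = 4, nstart = 25)
class_summary$group <- clusters$cluster
```

Each class ends up in one of four groups defined purely by the share of its first-year takers who went on to STEM or health care degrees, so a class like electromagnetism lands in a high-STEM group and general psychology in a low-STEM group.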
Building the model
We then used scripts to count the number of classes each graduate took from each group and developed a generalized linear model (GLM) with a logit link function.
The STEM flag was modeled as a function of the number of classes in each of the groups. The parameter estimates from the model show that classes in groups 1 and 2 are negatively associated with students pursuing a STEM degree and groups 3 and 4 are positively associated.
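As an illustration of that model, here is a minimal sketch in R, assuming a graduate-level data frame; grad_counts and its column names are hypothetical, not the variable names used in the report.

```r
# Hypothetical sketch of the GLM fit -- variable names are illustrative.
# grad_counts: one row per graduate, with stem_flag (0/1) and counts of
# first-year classes taken from each of the four k-means groups.
model <- glm(stem_flag ~ n_group1 + n_group2 + n_group3 + n_group4,
             data = grad_counts,
             family = binomial(link = "logit"))

summary(model)  # expect negative estimates for groups 1-2, positive for 3-4

# Predicted probability that each student is pursuing a STEM degree:
grad_counts$p_stem <- predict(model, type = "response")
```

The logit link keeps each prediction between 0 and 1, so the model output can be read as the probability that a student is pursuing a STEM or health care degree.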
Testing the model
Next, we needed to develop a threshold to classify students as STEM or non-STEM: predicted probabilities above the threshold would be flagged as STEM, and values below would be flagged as non-STEM. The threshold affects the model’s accuracy, sensitivity and specificity:
- Accuracy represents what proportion of graduates were correctly identified as STEM or non-STEM.
- Sensitivity represents what proportion of non-STEM graduates were correctly identified as non-STEM.
- Specificity represents what proportion of STEM graduates were correctly identified as STEM.
We used the receiver operating characteristic (ROC) curve and Youden’s index to determine an appropriate threshold. Staff then used cross-validation to determine how many graduates in the testing set were correctly identified. The threshold correctly identified 82% of all graduates in the testing set, including 82% of non-STEM graduates and 81% of STEM graduates. The model was also tested on Opportunity Scholars who have graduated: 88% of Opportunity Scholar graduates were correctly identified overall, 95% of Opportunity Scholar STEM graduates were correctly identified as STEM, and 51% of Opportunity Scholar non-STEM graduates were correctly identified as non-STEM.
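One common way to implement that threshold search in R is with the pROC package; the sketch below is illustrative only, with hypothetical object names (model, test_set, stem_flag), and is not drawn from the office’s actual scripts.

```r
# Hypothetical sketch of threshold selection on a held-out testing set.
library(pROC)

p_test <- predict(model, newdata = test_set, type = "response")
roc_obj <- roc(test_set$stem_flag, p_test)

# Youden's index picks the threshold maximizing sensitivity + specificity - 1.
best <- coords(roc_obj, x = "best", best.method = "youden")
threshold <- best$threshold

# Classify the held-out graduates and check overall accuracy.
pred <- as.integer(p_test >= threshold)
mean(pred == test_set$stem_flag)  # proportion correctly identified
```

Picking the threshold this way balances the two error types rather than simply maximizing overall accuracy, which matters when the STEM and non-STEM groups are different sizes.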
Applying the model
This model was used to identify students who met the sixth eligibility requirement (pursuing a STEM degree) and place them into the peer group. With this peer group, we compared the affordability, retention, graduation and employment outcomes of Opportunity Scholars to those of similar students. Our analysis showed that Opportunity Scholars paid less out of pocket, borrowed less and were more likely to return to school than their peers. This suggested the scholarship was meeting the Legislature’s goals.
To read the full report, please see: http://leg.wa.gov/jlarc/reports/2019/OppScholarship/f_ii/default.html
Casey Radostitz and Joshua Karas are research analysts with Washington’s Joint Legislative Audit and Review Committee (JLARC).
Report Spotlight: Focus on an Award-Winning Certificate of Impact Report
Editor’s note: If your office would like to highlight its award-winning report in the Working Paper, please let eric.thomas@leg.wa.gov know.
California State Auditor Finds Significant K-12 Funding Is Not Benefiting Students as Intended to Close Achievement Gaps | Jordan Wright (California)
With nearly 6 million K-12 students in its public schools, California provides almost $99 billion each year to local school districts. In 2013, the California Legislature made a historic shift in the way it funds K-12 education by implementing the Local Control Funding Formula (LCFF) to provide more local control over the spending of state funding and to improve educational outcomes for certain groups. LCFF provides districts with base funds as well as supplemental funds calculated according to the proportions of students they serve who are English learners, youth in foster care, and students from households with low incomes (intended student groups). However, certain student groups continue to have poorer educational outcomes than students overall (achievement gaps), and California consistently ranks below the national average on reading and mathematics scores. Because one of the goals of LCFF was to address achievement gaps among intended student groups, the Legislature asked the state auditor to review three large school districts’ LCFF funding and measurement of educational success. The team determined, however, that to adequately answer the Legislature’s questions about LCFF, it would also need to review the State Board of Education, the Department of Education and three county offices of education.
In reviewing the complex laws and regulations governing LCFF, the audit found that the state’s approach to LCFF has not ensured that the funding benefits students as intended to close achievement gaps. Specifically, the team found that oversight responsibilities fall almost entirely on local entities, leaving the state without adequate information to assess the impact that the billions of dollars in funding has on the educational outcomes of the intended student groups. The team found two main problems with LCFF:
- The state does not explicitly require districts to spend all the supplemental funds on the intended student groups or track how they spend those funds.
- Districts can treat unspent supplemental funds in a given year as base funds in the following year and use those funds for general purposes, even if those purposes do not directly serve the intended student groups.
The team also found that the state deferred full implementation of the supplemental funding formula. As a result, since fiscal year 2013-14 the three districts reviewed identified more than $320 million as base funds rather than as supplemental funds to be spent on the intended student groups, which likely delayed improvements to educational outcomes.
Finally, the team found that districts’ local control and accountability plans (LCAP)—a three-year spending plan that describes the district’s annual goals, services and expenditures of LCFF funds—were hundreds of pages long but did not always include clear information regarding their use of the supplemental funds. This lack of clarity left policymakers and stakeholders with inadequate data for assessing the impact of those funds on the educational outcomes of the intended student groups.
The audit made recommendations to the Legislature and the State Board of Education focused on increasing transparency and ensuring that intended student groups receive the maximum benefits from the funding.
The team recommended the Legislature do the following:
- Require districts to specify the amounts of budgeted and estimated actual expenditures of the supplemental funds for each service that involves those funds.
- Require districts to identify any unspent supplemental funds annually and specify that those unspent funds at year-end must be used to increase and improve services for intended student groups.
- Require the California Department of Education (CDE) to direct districts to track and report the total amount of supplemental funds they receive and spend each year, and to track the types of services on which they spend those funds to provide additional data for the state and other stakeholders and align spending information with the student outcomes.
To the state board, the team recommended the following:
- Change the LCAP template to require districts to include analyses of the effectiveness of individual services.
- Revise the instructions for the LCAP template to include key information about how districts can successfully demonstrate that they directed spending for services toward intended student groups, and instruct districts to provide sufficiently detailed descriptions of services in their LCAPs.
The audit report prompted quick corrective action. Within 60 days of issuing the report, the state board had addressed the recommendations. Additionally, the team briefed two legislators on the audit results, along with several legislative staff and education lobbyists. Within a few months, legislative members introduced two bills that would address the recommendations. These bills are under review by the State Assembly.
The audit report also generated significant public interest. The State Auditor received over 14,200 online requests for the report through April 30, 2020. The audit also received considerable exposure from the press, including news articles from policy outlets CalMatters and EdSource, and spurred editorials in the Los Angeles Times and San Jose Mercury News. Finally, the team participated in numerous briefings and forums with various stakeholders from nearly 15 interest groups. The president of one nonprofit educational organization referred to the audit as a “wakeup call that identifies serious control deficiencies that lawmakers need to address immediately.”
Jordan Wright is a senior audit evaluator for the California State Auditor’s Office and was audit team leader.
Office Spotlight
Virginia Joint Legislative Audit and Review Commission (JLARC) | Erik Beecroft (Virginia)
Over its nearly 50-year history, JLARC’s research has had a significant impact on the efficiency and effectiveness of Virginia’s government. In the past four years, for example, JLARC evaluations have led to overhauling the state’s foster care system, retooling Virginia’s economic development authority, and designing a regulatory and licensing structure for the introduction of casinos and sports betting in the state. JLARC studies have transformed the pension system for state employees and teachers in order to preserve it and, several decades ago, established the formula still used today to allocate state funding to Virginia’s 130 local school divisions. From 2016 to 2018, 77% of more than 300 JLARC recommendations were fully or partially implemented by the General Assembly and state agencies, and those recommendations saved Virginia an estimated $104 million. (Estimates for 2019 are not yet available.) Since JLARC’s inception, recommendations implemented as a result of the agency’s research have saved the state’s taxpayers well over $1 billion. JLARC’s ability to effect improvements reflects the influence of its legislative members, who include the senior leadership of the House and Senate—the House speaker, the chairs of the House and Senate finance committees and other experienced legislators.
The cornerstone of JLARC’s long-standing success is its people: currently 27 research staff (all with graduate degrees), a publications editor, a graphics editor and two administrative staff. A typical JLARC program evaluation is a yearlong, full-time effort by a team of two to four analysts and a project leader. Study teams typically conduct more than 100 interviews with stakeholders, in addition to conducting site visits, several surveys, literature reviews and extensive analysis of detailed quantitative data. JLARC studies involve intensive collaboration by the team, which leads to a quality that is greater than the sum of the individual efforts. Project leaders, who have from six to more than a dozen years of experience as JLARC analysts, guide the scope and content of a study and enable analysts to produce their best work.
Study teams are supported by a rigorous quality control process that borders on the obsessive. Associate directors, each with at least 20 years of research experience, typically oversee three studies per year, working closely with team members and guiding each study. JLARC’s project review team, which includes the director, the three associate directors, the methodologist, the publications editor and the graphics editor, examines five deliverables for each study: the scoping document shortly after the study begins, the detailed research work plan, the preliminary findings, the draft report and the draft presentation to the legislative members. This rigorous review ensures that the analysis approach is sound, the findings are unbiased and based on strong evidence, the report is clear, and the recommendations are well supported and feasible.
JLARC’s staff director, Hal Greer, started as an analyst at JLARC a quarter-century ago and is the longest-serving staff member and a driving force for the agency. He leads by listening and valuing the viewpoints of every staff member and by sharing his passion for making government work better for its citizens. Although COVID-19 has created new challenges for JLARC’s work processes, Greer continues to emphasize that the agency’s effectiveness depends on maintaining its core principles of integrity, rigor, objectivity and nonpartisanship.
Erik Beecroft is the methodologist at the Virginia Joint Legislative Audit and Review Commission.
Work in the Time of COVID-19
It goes without saying that COVID-19 has forced many of us to change how we work as auditors and program evaluators. This month, we reached out to NLPES offices to learn how staff maintain social interaction and conduct public meetings under COVID-19 restrictions.
While most work-related meetings and interviews can be conducted virtually through Microsoft Teams or Zoom, maintaining the camaraderie we shared with colleagues in physical offices presents a challenge—how do you pull off informal social interaction in a virtual environment? Responses from offices around the country included the following approaches to maintaining the office social environment:
- A mental health Zoom channel to share family pictures, funny videos and other items of personal interest.
- Daily emails from the agency director with protocols for entering the physical office, weather forecasts, entertainment ideas and a comic strip.
- Virtual coffee chats, happy hours, trivia and bingo.
- Virtual wellness sessions, including stretching and yoga.
- Sharing recipes, books and movie recommendations, and photos of home offices and pets.
- Continued acknowledgment of staff life events, such as birthdays, work anniversaries or addition of a new family member.
Our offices have also dealt with the challenge of holding public meetings with work-from-home restrictions in place. Many offices have held meetings virtually or adopted new policies for in-person meetings. Survey responses indicate a variety of approaches and lessons learned:
- Meetings held remotely through Zoom require additional preparation but generally have gone better than expected.
- Holding multiple meeting rehearsals helps troubleshoot technology issues.
- Responding to questions and taking public testimony through a videoconference requires addressing phone and video access and audio/visual issues.
- In-person meetings require masks, social distancing and other safety precautions, such as locating meetings in a larger room to accommodate space between participants.
- Hybrid meetings allow participation by phone, videoconference and in person.
If you are interested in receiving a copy of the survey responses, please email Eric Thomas.
News & Snippets
Staff Happenings
A Founding Father Retires…
What is a “founding father”? According to the dictionary, a founding father is a person who starts or helps start a movement or institution. What pops into your mind when you hear those words? My guess is images of John Adams, Ben Franklin, Alexander Hamilton, Thomas Jefferson, James Madison, George Washington or someone like that.
For folks in the legislative program evaluation world, however, another image may come to mind. John Turcotte is a living legend in the field of legislative program evaluation. Turcotte was a founding member of the National Legislative Program Evaluation Society. He served on the first NLPES Executive Committee and in 1978, became its fourth chair.
During a career that has spanned more than four decades, Turcotte has directed three legislative program evaluation offices—Mississippi PEER (18 years), Florida OPPAGA (7 years) and North Carolina PED (13 years). In addition, he played a key role in the creation of OPEGA in Maine. He has overseen the production of nearly 1,000 reports. He has participated in four NCSL-NLPES peer reviews. Turcotte served as the staff chair of the National Conference of State Legislatures (NCSL) in 1994. In 2004, he received the NLPES Outstanding Achievement Award.
NLPES is not the only organization to recognize Turcotte’s work. He was selected as a Henry Toll Fellow by the Council of State Governments. He received the Herman Glazier Award as Public Administrator of the Year from the Mississippi Chapter of the American Society for Public Administration (ASPA) and was named a Public Administrator of the Year by the Florida Chapter of ASPA.
And if all that is not enough, there are Turcotte’s “other” careers:
- High school math and science teacher.
- Political science instructor at Hinds Community College.
- Adjunct professor of public administration at Millsaps College.
- CEO of Turcotte Public Administration Consulting and Training, LLC (TPACT).
Plus, believe it or not, Turcotte also has a life outside of work. He spends his time with his wife, Terry, and whenever possible, visits with their grown daughters. He enjoys tinkering with electronic gadgets and investing.
Who knows what he may take up or what things he may accomplish during the next phase in his life!
Happy Retirement, John!
And More Retirements to Announce…
Greg Hager, Kentucky
Greg Hager retired from legislative service on July 31, 2020. Hager worked for the Kentucky Legislative Research Commission for 20 years. He served as the committee staff administrator to the Program Review and Investigations Committee for 18 years.
During Hager’s final committee meeting before his retirement, Senator Danny Carroll, the co-chair of the Program Review and Investigations Committee, presented a Distinguished Service Award to him.
Professionally, Hager took his work very seriously and never stopped trying to improve Kentucky’s research process and product. For more than a decade, he served as Kentucky’s state contact to NLPES, illustrating his willingness to help other legislative offices, too.
Personally, he has a wickedly intelligent sense of humor. He loves to garden and hopes to become a master gardener.
Here’s to many new adventures, Greg!
Joe Murray, Montana
Joe Murray retired at the end of August 2020.
Murray joined what was originally the Office of the Legislative Auditor in April 1988. Back then, the performance audit team was still in the early stages of its development, and Murray was part of a core group of staff that firmly established the performance audit function as a key part of the Legislature’s accountability mission. Mirroring national trends, the performance audit function has grown in prominence over the years, largely thanks to the dedication and diligence of people like Murray.
After 32 years of service, Murray’s retirement marks the end of an era for the office, but he leaves behind a legacy of commitment to the values of our profession that will endure. His unfailing good humor, capacity for leading people, calmness in the face of adversity, and keen sense of what legislators want and need to know will be much missed.
Happy retirement, Joe! May it be long and filled with fun.
Moving on Up…and Other Changes!
Well, you’ve just learned about a few departures. Now, it’s time to talk about other changes that are happening for NLPES members.
Kiernan McGorty is taking over the reins of the North Carolina Program Evaluation Division. She was named PED’s acting director in September 2020, following John Turcotte’s retirement. McGorty has worked for PED since its inception in 2007. Until her promotion, she had been managing principal evaluator and legal counsel for PED. McGorty received her Ph.D. in cognitive psychology and her J.D. from the University of Nebraska at Lincoln, and she has a bachelor’s degree in psychology from Davidson College.
Gerald Hoppman is the new committee staff administrator for Kentucky’s Program Review and Investigations Committee. Hoppman has 27 years of audit experience with federal and state agencies, including the legislative branch of government. He has a master’s in public administration and bachelor’s degrees in political science and administration of justice.
And let’s welcome two new legislative audit/evaluation offices. In 2019, the Maryland General Assembly enacted legislation establishing the Office of Program Evaluation and Government Accountability in the Department of Legislative Services; Michael Powell is the office’s director.
The Oklahoma Legislature created its Legislative Office of Fiscal Transparency in 2019, too. LOFT is led by Executive Director Mike Jackson and Deputy Director Regina Birchum.
Congratulations to Kiernan and Gerry, and welcome to Michael, Mike and Regina!
2020 NLPES Election Results
The COVID pandemic did not stop the 2020 NLPES election. You cast your votes, and the results are in. Kristen Rottinghaus and Eric Thomas were reelected. The new members are Jennifer Sebren and Darin Underwood.
Please welcome your 2020-2021 NLPES Executive Committee!
- Erik Beecroft, Virginia
- Patricia Berger, Pennsylvania
- Jon Courtney, New Mexico
- Emily Johnson, Texas
- Mary Jo Koschay, Michigan
- Kiernan McGorty, North Carolina
- Paul Navarro, California
- Kristen Rottinghaus, Kansas
- Jennifer Sebren, Mississippi
- Shunti Taylor, Georgia
- Eric Thomas, Washington
- Darin Underwood, Utah
Their term begins in October. Kiernan McGorty will become the new NLPES Chair, automatically moving up from vice chair. The other officers will be elected at the committee’s first meeting.
And the Recipients Are...
In 2020, NLPES recognized the outstanding work of 30 legislative audit offices or staff. Please give a round of applause to this year’s recipients.
- Outstanding Achievement Award—Retired Utah Legislative Auditor General John Schaff
- Excellence in Evaluation Award—Virginia Joint Legislative Audit and Review Commission
- Excellence in Research Methods Awards—New Mexico Legislative Finance Committee and Washington Joint Legislative Audit and Review Committee
- Certificates of Impact—Arizona Office of the Auditor General, Performance Audit Division; California State Auditor’s Office; Colorado Office of the State Auditor; Georgia Department of Audits and Accounts, Performance Audit Division; Hawaii Office of the Auditor; Idaho Office of Performance Evaluations; Illinois Office of the Auditor General; Kansas Legislative Division of Post Audit; Louisiana Legislative Auditor; Maine Office of Program Evaluation and Government Accountability; Michigan Office of the Auditor General; Mississippi Joint Legislative Committee on Performance Evaluation and Expenditure Review; New Mexico Legislative Finance Committee; Office of the New York State Comptroller; North Carolina Program Evaluation Division; Oregon Office of the Secretary of State, Audits Division; Pennsylvania Legislative Budget and Finance Committee; South Carolina House Legislative Oversight Committee; South Carolina Legislative Audit Council; South Dakota Legislative Research Council; Tennessee Comptroller of the Treasury, Office of Research and Education Accountability; Utah Office of the Legislative Auditor General; Virginia Joint Legislative Audit and Review Commission; Washington Joint Legislative Audit and Review Committee; West Virginia Post Audit Division; and Wisconsin Legislative Audit Bureau.
And let’s not forget to thank this year’s Awards Subcommittee members and award judges—Kiernan McGorty (N.C.), Mary Jo Koschay (Mich.), Sean Hamel (N.C.), Gina Brown (La.), Kate Shiroff (Colo.), Erik Beecroft (Va.), Matt Etzel (Kan.), Edward Seyler (La.), Maria Garnett (Va.) and Adora Thayer (N.C.). Without your help, this terrific annual event couldn’t occur. Thank you!
A New Look for NLPES
NLPES has not had a new logo in a decade ... or two ... or maybe even longer. So, the NLPES Executive Committee decided it was time to refresh NLPES's image. But that takes a lot of consideration. For example:
- Should the logo contain the society's full name or just its acronym?
- Should the year of NLPES's creation be included?
- Should there be a tagline?
- What colors should be used?
- And the list of questions went on ... and on ...
Staff from the California State Auditor's Office came to the rescue, designing numerous logo prototypes. Several polls of the Executive Committee were held to winnow down the selection. At long last, the Executive Committee took a final vote. And thus, NLPES got a new look!
Websites, Professional Development and Other Resources
Website staff photos—Take a look at the NLPES homepage for the new photo ribbon featuring some of your colleagues expressing why NLPES is important to them.
Report Library—The Report Library is up and running. You should have received a listserv notice about this new resource, as well as a survey asking for recommendations for improvements. If you have not already, please visit this great new resource.
NLPES listserv—The NLPES listserv is an email discussion group for NLPES members. Authorized listserv users can query other states about evaluation work, receive announcements about performance evaluation reports and job opportunities from other states, and are notified when the latest edition of this newsletter is available! To join the listserv, send an email to Brenda Erickson, NCSL liaison to NLPES, with the subject “SUBSCRIBE to NLPES Listserv.” Include your name, job title, legislative audit agency/organization name, mailing address (including city, state, zip code), phone number and email address. A “Welcome” message will be sent to you once you are successfully added to the listserv. See the NLPES listserv website for additional information on how to post messages and “netiquette” niceties, like not hitting “Reply All” when answering a query posted to the listserv.