The Working Paper is the official newsletter of the National Legislative Program Evaluation Society. NLPES serves the professionals of state legislative agencies engaged in government program evaluation. The purposes of NLPES are to promote the art and science of legislative program evaluation; to enhance professionalism and training in legislative program evaluation; and to promote the exchange of ideas and information about legislative program evaluation.
Time flies—everyone says that. It seems particularly true when you’re enjoying something, or when you’re really busy. Maybe that explains why this year has sped by so quickly!
I hope you have seen evidence of how busy your Executive Committee has been and that you have benefited from all their work. For instance, we have an updated and more user-friendly website and new additions to our training portal. We reached out to member offices in an effort to ensure all are aware of the resources NLPES can provide. And we completed another awards season, allowing us to highlight the accomplishments of our member offices and their ever-increasing impact on state government operations.
We are now coming up on one of the most important events of our year—the 2014 Professional Development Seminar. Our friends in the North Carolina General Assembly’s Program Evaluation Division are hard at work planning a stellar seminar. The conference will focus on building skills and will include hands-on training sessions covering Excel Tips and Tricks, as well as sessions on using statistical software and legal considerations for audits and evaluations. Plan to join us in Raleigh, North Carolina from October 6-8! Registration is still open, but the cutoff date for hotel reservations was September 8, 2014.
As I conclude my term as chair, I want to tell you all what a pleasure it has been to work with you and to witness first-hand your commitment to developing and advancing this profession. The energy and commitment your Executive Committee members bring to their tasks is inspiring. In addition to being leaders in their own offices and the work that requires, they are willing to take on additional projects directed at identifying ways to serve our NLPES membership more effectively. I am equally impressed with our member states—whether taking on the job of hosting a PDS or contributing thoughtful and insightful responses to questions posed to our listserv, they are genuinely interested in facilitating improvements to and supporting NLPES and our profession.
I have had the great pleasure to work with Wayne Kidd from Utah’s Office of the Legislative Auditor General for a number of years now. Wayne will take over as chair at our October PDS and I am excited about the leadership Wayne brings to the group, as well as his vision for how NLPES will continue to grow and push boundaries. Thanks to all of you for the opportunity to serve you as the NLPES chair this past year!
Lisa Kieffer is the NLPES Executive Committee Chair for 2013–2014. She can be reached at Kiefferl@audits.ga.gov.
Welcome to another edition of your favorite round-up of the best in audit and evaluation reports! As usual, our dedicated membership has not let seasonal amusements distract them from raining audit/evaluation excellence down on state governments across the nation. The public policy issues that are the subject of this installment include child welfare services, IT security, correctional institutions, and public health. This Report Radar also includes the ever-popular ‘miscellaneous’ section, but we are re-naming this the ‘very interesting’ section in recognition of the nature of a few reports we would like to highlight. Of course, all these reports are interesting…so, read on!
Child welfare services

State programs aimed at protecting children from abuse and neglect have long been an area of interest for many of our legislatures. There has been a noticeable increase in activity relating to child protective services programs over recent months, and we highlight four reports addressing these issues. April saw the release of reports from California and Louisiana, both addressing issues relating to initial intake procedures and how child welfare agencies respond to situations where children may be at risk. Alternative approaches to addressing child welfare issues can be found in two other reports from member offices released in June: the Texas Sunset Commission released a report on the state’s Department of Family and Protective Services that provides a great overview of the types of systemic management challenges these kinds of agencies typically face. And the Colorado Office of the State Auditor released a report focusing on the role of the state’s Child Protection Ombudsman Program and the effectiveness of oversight activities performed by a non-profit service provider contracted to deliver these services.
IT security

It seems like every week brings another news story about a big data breach potentially impacting large numbers of unsuspecting citizens. State government is not immune from these threats, so if you want to heighten your awareness of IT security issues, we recommend two recent reports: our people in Kansas released a report in July that looks at the types and volumes of sensitive data stored by state agencies and the resources available to protect such data from unauthorized access. The report superbly summarizes information on the types of sensitive data handled by state agencies. You might also want to locate a copy of the Washington State Auditor’s Office’s April report, which looked at the issue of removing confidential data from surplus IT equipment sold by the state. This is a topic we have encountered before in audit and evaluation reports, but it is also a vulnerability that continues to threaten IT security in some states and might worsen as portable devices become more ubiquitous and disposable.
Correctional institutions

High incarceration rates and ever-increasing budgets are causing more of our offices to focus on issues regarding correctional institutions. Two recent reports from member offices continue to highlight this trend: Georgia released a report in May addressing high turnover rates among juvenile correctional officers. The audit identified compensation, agency management, work conditions, facility leadership, and recruiting practices as the primary causes for turnover issues. Another came from California, which released a report in June addressing sterilization of female inmates. This report documents several lapses in the informed consent procedures the state’s correctional institutions were supposed to follow prior to performing these medical procedures.
Public health

Report Radar found four excellent reports to share with you under the broad banner of public health issues, the first two of which feature issues that have been much in the news lately: childhood immunizations and prescription drug abuse. The Montana Legislative Audit Division released a report in April addressing the state’s efforts to promote immunization against communicable diseases in childhood. The report includes information on statutory immunization requirements, monitoring and reporting by child care facilities and schools, and use of the state’s immunization registry system. In April, North Carolina also released a report addressing efforts to monitor and prevent abuse of prescription drugs. With more states mandating use of prescription drug registries or other tracking systems, and more attention being paid to high overdose rates and increasing drug costs for Medicaid and other programs, many of our offices may be looking at these issues in the future. This report is a good starting point. A couple of other public health reports caught our attention for focusing on an issue that has been a recurring theme for member offices in recent years: emergency medical services. A June report from Utah and a July report from Indiana address the effectiveness of state programs relating to emergency medical services and trauma care. Both are good reads and worthy of your attention.
Very interesting (formerly ‘Miscellaneous’)

Last, but not least, the much anticipated ‘very interesting’ section. Report Radar always likes to identify some of the more intriguing questions our membership answers for legislatures, and some of the innovative products we release and emerging issues we cover. Do you want to know how to best evaluate whether concrete or asphalt is a better pavement surface option in road rehabilitation? Find out in a March report from our friends in Minnesota. Is your legislature asking about the effectiveness of efforts to re-integrate returning veterans? Read the July report from Tennessee addressing the state’s Helping Heroes educational grant program for returning veterans. Does somebody want to know more about how federally-funded programs impact the citizens of your state? Get a copy of a June report from Virginia, which answers this question and, in the bargain, features some killer charts/graphics. Finally, are you getting questions about regulating the use of drones or unmanned aerial vehicles? Then you are in the same boat as our people in Connecticut, who released a scoping paper in June addressing an upcoming review of this very subject. For answers to the drone question and much, much more, look out for the next edition of Report Radar.
Angus Maciver is the Deputy Legislative Auditor for the Montana Legislative Audit Division. Pester him at email@example.com.
The Legislative Finance Committee (LFC) was established as a fiscal and management arm of the New Mexico Legislature in 1957. Since its inception, LFC's role in the state budget process has grown as the complexity and size of the budget has increased. New Mexico is now one of just five states in which a legislative agency—i.e., the LFC—prepares a comprehensive budget recommendation along with the executive branch.
The LFC’s Program Evaluation Unit was established in 1991. Its mission is to help New Mexicans get the most from their tax dollars by reviewing the costs, efficiency and effectiveness of state services. LFC program evaluations cover the breadth of state government responsibilities, from public schools to information technology. The unit is headed by a deputy director of the LFC, and has a program evaluation manager and 12 evaluators. Program evaluators typically have graduate degrees in public administration, public policy, education, law, social science, business administration, social work, or public health.
In the past year, the LFC issued 17 reviews and 25 progress reports on previous reviews. A recent review assessing the impact of child care and prekindergarten on future student achievement was featured in the Fall/Winter 2013 issue of this newsletter’s Research and Methodology section. Other evaluations covered the impact of prison programming on recidivism, the management of public employee health benefit plans, waste within several capital outlay projects, high school senior preparedness for college, and the effectiveness of child protective services to prevent serious cases of neglect and abuse. The evaluation unit is also responsible for auditing the state’s General Appropriation Act each year.
For the past several years the LFC has also been working with the Pew-MacArthur Results First Initiative, a project of The Pew Charitable Trusts and the John D. and Catherine T. MacArthur Foundation, to implement a cost-benefit analysis tool that provides policymakers with new information estimating the long-term costs and benefits of investments in public programs. Outcomes from the Results First approach have helped identify poorly performing programs and helped target $49.6 million for evidence-based programming in early education, child welfare, and criminal justice.
In addition to recommending a state budget and overseeing program evaluations, the LFC also prepares legislation addressing financial and management issues of state government. These recommendations are often driven by the LFC’s reports.
Although the LFC has existed for less than a century, Santa Fe is the oldest capital in the United States. At 7,000 feet above sea level, it is also the highest in elevation. Santa Fe's history as a capital city dates to 1610, when conquistador Don Pedro de Peralta established it as the capital for the Spanish “Kingdom of New Mexico.” The Palace of the Governors, built in 1610, served as Spain's seat of government. It now houses the state's history museum and is the oldest public building in the country.
Jon Courtney is a Program Evaluator with New Mexico’s Legislative Finance Committee. He has a PhD in experimental psychology and plays in a rock band. He can be reached at firstname.lastname@example.org.
[Editor's Note: This article is the first in an occasional series of articles expertly culled from past issues of the newsletter. Thanks to James Barber from Mississippi for identifying this "oldie but goodie." We trust you will find the advice herein just as valuable as it was upon first publication.]
Over the years, Wisconsin has had the opportunity to look into a number of different concerns related to our state university system’s operations. Although we’ve learned some specific lessons from each experience, some more general guidelines—which get passed on from audit team to audit team—seem to have developed over the years. The following are a few of the pointers that have become part of our organization’s oral history. While they may not all be applicable to your situation, they may help you avoid some real headaches (or is that heartaches?) in completing any type of higher education audit.
Assign a Ph.D. to the audit team. Just kidding. However, you do need to accept the fact that it is unlikely the majority of any audit team will have the credentials necessary to meet the resume test of some academics. (You know the line: “Why do you think you’re qualified to be looking into this issue?”) This means that, from time to time, it may be difficult to capture the attention of some key players. Be patient. After a while, most everyone will figure out you really should be taken seriously.
Be aware of mission creep. Most higher education institutions have broad—very broad—missions by design. Because of this, it is very easy to justify most activities. After all, who is to say whether a given activity “promotes service to the public”? Yet, a broad mission statement is not a license to steal (so to speak). Don’t let that broad umbrella of the mission (including all those references to academic freedom) keep you from asking the hard questions.
Respect the implications of shared governance. Remember that although any given institution will have an executive head, decisions are often made through a system of shared governance. For example, at each of the campuses within the University of Wisconsin system, faculty, academic staff and students all have a role—defined by statute—in shaping policies and procedures. This is in addition to the authority vested in the Board of Regents, the system president, and each institution’s chancellor. This power structure can affect your work in several ways because—unlike the structure in most executive branch agencies—it is not conducive to developing a single response to your findings. In other words, don’t think it’s safe to get in the water just because the campus administration has decided to embrace your report. The faculty may be swimming around out there, waiting to take a bite out of you and your “questionable methodology.” Which leads me to the following:
Expect a challenge. No matter what you do, recognize that higher education institutions are usually held in high esteem (at least on an intellectual basis) within the community. After all, look at all the really smart people who work there. So, be prepared for the underpinnings of your analysis to be questioned (and I mean really questioned). In addition, it is highly likely that the news media and general citizenry will be accepting of whatever the institution says about your report, never mind that no actual support has been provided. Often, the word of a well-known faculty member or administrator can be enough. (Hmm, I think this is related to my first point.) Again, be patient. If you wait out the initial firestorm of rhetoric following a report’s release, your findings (as based on fact) will usually prevail.
Remember, you are not alone. When (notice I did not say if) the going gets tough, remember there is a network of support within NLPES that can help you think through a problem, perhaps dispense some wisdom, and provide moral support. Many of us have been there before, so you should tap into NLPES’ resources. After all, there’s no reason to enroll in the school of hard knocks just because you are taking a look at issues related to higher education.
Jennifer Noyes formerly was a staffer with the Wisconsin Legislative Audit Bureau. This article first appeared in the October 1994 issue of the NLPES News.
Every legislative program evaluation shop tries to ensure its work is responsive to the interests of legislators. For the last 25 years, the Minnesota Office of the Legislative Auditor has, among other things, employed a survey of legislators to help us identify interest in and narrow possible evaluation topics.
Each spring, the Legislative Audit Commission develops our annual work program. We assist by soliciting and collecting evaluation ideas from legislators, legislative staff, our staff, and the public. The first cut of our list frequently contains more than 100 topics, which we present to the commission. From this list, the commission selects about 12 to 16 “semi-finalists.” Our final work program typically has only a handful of topics; but to get there, we do several things—generally over the course of about two weeks—to help the commission make its final decisions. First, we develop a “background paper” on each of the semi-finalist topics, describing the entity to be evaluated, identifying possible issues, and discussing whether the topic meets various criteria (such as amount of state resources at stake, timeliness of an evaluation, and feasibility of a review).
Second, and the focus of this article, we survey all legislators. The Audit Commission usually tries to select topics in which there is a significant amount of legislative interest. Every year, we send each legislator a list of semi-finalist topics with a brief indication of the possible focus of each evaluation (generally, two or three research questions). We also give legislators a paper “ballot” asking them to vote on their topic preferences. Legislators can respond anonymously if they wish.
Right in the middle of session! This survey of all 201 legislators occurs at a time of peak activity during the legislative session. Members are meeting long hours, passing budget and policy bills, and in constant demand from lobbyists. Surely legislators don’t take time to respond to a survey on evaluation topics, do they? In the early years, our response rate ranged from 30 to 50 percent—okay, but not stellar. Over time, however, the survey has become an accepted part of the Legislature’s routine, and our recent response rates have been quite good—74 percent in 2014, 71 percent in 2013, and 62 percent in 2012.
How do we get them to respond? During the survey period, legislators make announcements from the House and Senate floors reminding members to return their surveys. There is also a fair amount of behind-the-scenes discussion, and partisan staffers frequently collect and return completed surveys from their respective members. At times, interest groups lobby legislators about which topics to express interest in.
Survey results influence the LAC. The final decision on which topics to authorize belongs to the Legislative Audit Commission. However, some survey results have created interesting dynamics. First, it is difficult for the commission to ignore topics that are the highest vote-getters. In 2014, the topic with the most votes (by a wide margin) was Minnesota’s state-run health exchange for the federal Affordable Care Act. Not surprisingly, the commission authorized an evaluation of this topic. However, the commission does occasionally reject topics that fared well on the survey if there are competing topics in the same broad area (such as K-12 education), or in cases where the timing is not right for a study—for example, due to data availability or studies being conducted by other organizations.
Conversely, topics that garner interest from only a few individual legislators are often rejected by the commission. For example, in 2013 some legislators advocated having our office examine whether arenas funded partly with state money have concerts that charge exorbitant prices to consumers. It was an odd topic and finished at the bottom of the legislative survey—and the commission rejected it. On the other hand, some topics that do not show widespread support on the survey have been approved, particularly if an audit commission member is a strong proponent.
But we influence the survey. Surveys also have the potential to work against smaller, more obscure topics that nevertheless would make excellent evaluations. To counter this, in 2013 we divided the survey into two parts: one for “larger” and one for “smaller” evaluation topics. We asked legislators to vote for some topics in each category. This helped ensure the commission seriously considered less-visible topics that may otherwise have fared poorly on the survey. For example, we ended up doing a short but interesting evaluation of highway noise barriers.
And we add our two cents’ worth. We also counterbalance survey results by ranking all the semi-finalist topics. We rate topics from “most promising” to “least promising,” based on a variety of criteria. This gives us an opportunity to highlight topics that were not top vote-getters but which, in our opinion, should be seriously considered for an evaluation.
A legislative survey like this may not work well in all states. Even in Minnesota, it took some time for it to become an accepted part of the legislative process. But for many years now, it has been a valuable tool for soliciting input from legislators during our topic selection process, and legislators are usually grateful for an opportunity to share their opinions.
Joel Alter is a program evaluation coordinator with Minnesota’s Office of the Legislative Auditor. He can be reached at email@example.com.
Forty years ago, NCSL held its first "Annual Meeting" in Philadelphia. This annual NCSL event provides an opportunity for legislators and legislative staff from around the country to gather to address policy issues, share ideas, and network with colleagues.
This year’s annual meeting—now known as the Legislative Summit—was held at the convention center in downtown Minneapolis, Minnesota. In keeping with its location, the theme for the Legislative Summit was "1,000 Ideas from the Land of 10,000 Lakes."
As usual, it was a whirlwind weeklong event. A few of the many highlights included:
Resources from the Legislative Summit are available on NCSL’s website as are video clips from the Summit.
On the lighter side, we were also treated to a reception at the Mill City Museum, a testament to Minnesota’s flour milling industry; a social event at Nicollet Island (between Minneapolis and St. Paul), which featured championship log rollers and lumberjacks, craft beers, and live music; and a reception for Legislative Staff Management Institute alumni to network and reminisce.
And on the truly lighter side (well, maybe the fattier side?), a few of us were able to sneak a trip to the Minnesota State Fair, which we were told is the largest in the nation after Texas’. Intrepid Minnesota host Joel Alter expertly guided us through the so-called Dairy Barn, at which we witnessed a Princess Kay of the Milky Way butter sculpture in progress, to the Horticulture Barn—featuring 795 lb. pumpkins as well as “seed art”—to the Food Barn, wherein we partook of embarrassingly tasty fried cheese curds, and on through various livestock displays such as the Poultry Barn (which actually featured rabbits and sheep), Horse Barn, Cow Barn, and Swine Barn (which featured an Oink Booth, not to be missed). Was it an experience? Oh, yah, you betcha!
NLPES EC News
Your NLPES executive committee also met during the Summit to finalize details for the upcoming Professional Development Seminar in Raleigh, NC, in October and to hear reports on activities by the Awards, Communications, and Professional Development subcommittees since the committee last met in April 2014. The meeting was Lisa Kieffer’s last as chair; October’s meeting will mark the beginning of Wayne Kidd’s term.
Every May the NLPES Awards Committee offers awards in four categories: Excellence in Evaluation; Excellence in Research Methods; Certificates of Impact; and Outstanding Achievement. This year’s winners have been notified and will be officially presented with their awards at the Professional Development Seminar in Raleigh, NC in October.
Congratulations to the following offices:
Details about the winning offices and reports are posted on the NLPES Awards website.
Marcia Lindsay is the 2013–2014 chair of the Awards Subcommittee. She can be contacted at firstname.lastname@example.org.
Congratulations to Jennifer Jones, Deputy Director of the Texas Sunset Advisory Commission, who has won a 2014 NCSL Legislative Staff Achievement Award. The award is given to one or two recipients by the NCSL Standing Committees to recognize staff excellence in supporting the work of a state legislature and strengthening the legislative institution. Jennifer has worked for the Sunset Commission for 21 years, has been active on various NCSL committees and task forces, and currently serves as staff co-chair of the NCSL Natural Resources & Infrastructure Standing Committee. Congratulations, Jennifer!
Let us know if you have staff news to share! Email email@example.com
Legislative program evaluation shops across the country regularly garner media attention for their work, a sign that stakeholders are paying attention and one that increases the likelihood that recommendations will be implemented and performance will improve. Below are recent program evaluations that attracted media attention.
OKLAHOMA – STATE AUDITOR AND INSPECTOR’S OFFICE
State audit finds numerous problems with Rogers County finances
April 2, 2014 – News On 6
Officials failed to document millions in FEMA funds, and vendors altered invoices at the commissioner’s request (June 2014). Full report and other video articles
LOUISIANA – LEGISLATIVE AUDITOR’S OFFICE
Louisiana nursing homes rank poorly, report states
June 15, 2014 – The New Orleans Advocate
The state’s nursing homes rank among the worst in the nation in key quality measures, with high rates of bed sores and use of restraints among the 25,000 elderly and disabled people who live in them (June 2014). Full report and other video articles
MICHIGAN – OFFICE OF THE AUDITOR GENERAL
State misspent in-home care funding
June 18, 2014 – WEMU 89.1
Michigan improperly paid $160 million in a 29-month period for services provided to vulnerable, low-income adults in a program designed to keep them out of more expensive, long-term care. Among those on the payroll were convicted criminals (June 2014). Full report
CALIFORNIA – OFFICE OF THE STATE AUDITOR
California state audit says female inmates were sterilized illegally
June 20, 2014 – Time.com
Some inmates were sterilized unlawfully, and safeguards designed to limit occurrences of the procedure failed (June 2014). Full report and other audio articles
MINNESOTA – OFFICE OF THE LEGISLATIVE AUDITOR
State audit report critical of Running Aces harness track and racing commission
July 8, 2014 – Star Tribune
The park fell $436,865 short in paying purse contributions to horsemen racing at the Columbus harness track from 2008 through 2012 (July 2014). Full report and other video articles
Share your coverage with us! If you would like an article highlighted in our next newsletter, send a hyperlink to firstname.lastname@example.org
NLPES is currently updating Ensuring the Public Trust, an invaluable overview publication of legislative evaluation offices across the country.
Which shop is the oldest? How many are using social media? How much time do offices spend on a typical review? How can you contact a sister office in another state? Browse the EPT publication to find out.
The final report will be provided at the NLPES professional development seminar in Raleigh, NC, on October 6-8, 2014, and will also be available online. Look for it this fall!
Current and past editions of EPT are available.
NLPES website—Learn more about NLPES and see what we do by spending a few moments touring our NLPES website. You’ll find general information about the NLPES, including by-laws, executive committee membership and subcommittees, state contacts, awards, and information on peer review. We also have a training library and resources including past meeting minutes, newsletters, and more. Check out our website resources!
NLPES listserv—The NLPES listserv is an email discussion group for NLPES members. By sending a message to email@example.com, you can reach all listserv subscribers simultaneously. Listserv members:
To join the listserv, send an email to Brenda Erickson, NCSL liaison to NLPES, with the subject “SUBSCRIBE to NLPES Listserv.” Include your name, job title, audit agency/organization name, mailing address (including city, state, zip code), phone number and email address. A “Welcome” message will be sent to you once you are successfully added to the listserv.
See the listserv link on the NLPES website for additional information on how to post messages and “netiquette” niceties, like not hitting “Reply All” when answering a query posted to the listserv. You’ll be glad you joined!
Legislative careers website—Know a young professional thinking about pursuing a career with a state legislature? Point them to the opportunities posted on NCSL’s legislative careers website. Job seekers can explore the types of work legislative staffers perform, including performance evaluation, budgeting, fiscal analysis, legal and policy research and opinions, bill drafting, public relations, library services, building security, and information technology support. States post open positions under Legislative Jobs. Launched by NCSL in June 2012, this is a great website. According to NCSL, attracting young people to work as legislative staff will be increasingly important in the coming years. Even though baby boomers make up about a third of the national workforce, nearly half of legislative staff responding to a survey were 50 years old or older. Replacing those staffers will present challenges. Check out the welcome video, “A Day at the Capitol,” and learn more about the opportunities and rewards of working for state legislatures. Watch the videos under Career Paths to hear from staffers in different states describing their jobs and the satisfaction they gain from a public service career.
Online training library—NLPES Training Products Matrix
For a variety of refresher and training materials, visit our NLPES online training library, where there is a wealth of resources on critical thinking, finding savings, interviews, quantitative methods, samples, survey development, reviewing contracts, effective presentations, report writing, and various management topics.
Ask GAO Live
Have you seen this website? We just discovered AskGAOLive, a series of 30-minute live web chats in which GAO staff discuss a specific report and its research and answer questions emailed or tweeted in. Sessions are recorded and archived on the website. You can also “follow” GAOLive to receive advance notice of chat sessions. Topics include veterans and higher education, prescription drug shortages, prison overcrowding, state and local fiscal outlook, and government contracting.
Where does your state rank in this highly unscientific, totally subjective survey by Thrillist.com editors Kevin Alexander and Matt Lynch? Do you even want to know? We here in Hawai‘i are sipping cocktails with pieces of pineapples and little umbrellas in them while arguing amongst ourselves about the merits of Spam musubi...
2014 NLPES Professional Development Seminar—will be held on October 6–8, 2014 at the Sheraton Raleigh Hotel in Raleigh, North Carolina in conjunction with NLSSA. For more information, go to the 2014 NLPES PDS website.
The Working Paper is published three times a year by the National Legislative Program Evaluation Society, a staff section of the National Conference of State Legislatures. NLPES serves the professionals of state legislative agencies engaged in government program evaluation. The purposes of NLPES are to promote the art and science of legislative program evaluation; to enhance professionalism and training in legislative program evaluation; and to promote the exchange of ideas and information about legislative program evaluation.
Visit the NLPES website
2013–2014 NLPES Communications Subcommittee:
Dale Carlson (CA)
Charles Sallee (NM)
Rachel Hibbard, newsletter editor (HI)
NCSL Liaison to NLPES:
Brenda Erickson, (303) 856-1391
NCSL Denver Office • (303) 364-7700