Winter (January) 2021
The Working Paper is the official newsletter of the National Legislative Program Evaluation Society. NLPES serves the professionals of state legislative agencies engaged in government program evaluation. The purposes of NLPES are to promote the art and science of legislative program evaluation; to enhance professionalism and training in legislative program evaluation; and to promote the exchange of ideas and information about legislative program evaluation.
Hang in there!
Y’all, what a year. This time last year I was scrambling to sign up my three school-age children for summer camps. Yes, in my neck of the woods, you have to sign kids up for summer camp in February because it’s cutthroat. Like a lot of you, I’m a planner. I like to anticipate and control what I can. It’s made me a good project manager at work and at home. I knew I was slated to become NLPES chair in the fall of 2020. The Executive Committee has a great pipeline for preparing folks to be chair so I was as ready as I could be. What I did not know was that I would be named director of our division at the exact same time I became chair. Or that my children would only be in school for one of the last 10 months with no sense of when they get to go back. I’d like to think COVID has made me more flexible and adaptable. I’ve printed out a new motto: Progress Over Perfection. And I’m trying hard to remember it when I feel overwhelmed. But I’ve also got my 2021 summer camp schedule going already because there’s something very reassuring about an Excel spreadsheet :)
To help you plan your professional year, here’s what the Executive Committee is working on for you:
Also, know that the Executive Committee is cheering on the good work of our member offices! Our research and reports can help to bring some certainty to a world where so much seems uncertain. Hang in there!
Kiernan McGorty is the 2020–2021 NLPES Executive Committee Chair. She can be reached by email or (919) 301-1393.
Excellence in Research Methods | Sarah Dinces, Ryan Tolman and Jon Courtney (New Mexico)
Two offices received the Excellence in Research Methods Award in 2020: the Washington Joint Legislative Audit and Review Committee and the New Mexico Legislative Finance Committee. This is the second of a two-part review of the winning reports and the methodologies applied by each.
The Report Objectives
New Mexico’s Legislative Finance Committee (LFC) asked staff to evaluate whether continued incentive-based spending on subsidized quality childcare programs resulted in better outcomes for child participants. Improving school readiness is of particular interest to the state because New Mexico consistently ranks near the bottom in measures of child well-being and educational attainment. Previous LFC research has found that one-quarter of children entering kindergarten cannot read one letter and 80 percent of students start behind on the first day of school. In 2019, the state implemented a new quality rating and incentive system intended to improve several childcare outcomes, including school readiness. Of the $139 million spent on childcare assistance in New Mexico in FY19, $64.5 million was dedicated to quality incentives, with higher rates paid to more highly rated providers as defined by a number of different inputs (e.g., higher-quality providers have lower teacher-to-student ratios).
New Mexico childcare has one of the most ambitious missions of any state program. The program started primarily as an income support program that would also promote the health and safety of children. Over the years, this mission expanded to include providing social-emotional supports and culturally and linguistically appropriate information to young children and generally preparing them for kindergarten. Since 1997, New Mexico has paid different subsidy rates to providers based on their quality rating. Providers with higher quality ratings receive higher subsidies, with the corresponding inputs presumably leading to better outcomes.
New Mexico is currently operating its third-generation quality rating improvement system for childcare. The effect of the state’s second-generation quality rating system was previously examined by LFC, with staff finding minimal impact of quality ratings on student readiness. In other words, the state was providing financial incentives to “higher quality” childcare providers. Yet the children in these high-quality childcare programs were no more likely to be prepared for kindergarten than children who attended lower-rated childcare programs.
In order to see if the third-generation quality rating system was yielding any better results, LFC staff in 2019 examined the impact of this latest rating system on school readiness, child health outcomes, and family well-being. The objectives of the study were to determine if best practices were followed in state-funded high-quality childcare, and to ascertain how participation in the state childcare subsidy program impacts educational achievement, health outcomes, and family income. How program quality impacts these outcomes was also an objective of the study.
To address LFC’s request, staff considered several methodologies, including a review of early childhood best practices and a number of inferential statistical methods. To complete the selected inferential statistics, staff completed an unprecedented merge of data collected from multiple state agencies, creating a new unique identifier to link participants in childcare assistance with public school performance data through the third grade, Medicaid data, and tax data. To examine educational outcomes, LFC staff developed a cohort of three- and four-year-old children who participated in childcare between 2015 and 2018 and followed these students over six years, tracking participation in childcare assistance, participation in other programming (e.g., prekindergarten), and performance in kindergarten through third grade.
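The cross-agency linkage described above can be sketched in miniature. The field names, records, and hashing scheme below are illustrative assumptions, not the actual LFC design; the point is deriving one stable identifier from shared fields so records held by separate agencies can be joined:

```python
import hashlib

# Hypothetical sketch of cross-agency record linkage. Names, dates, and the
# SHA-256 keying scheme are illustrative only, not the actual LFC method.

def link_id(name: str, dob: str) -> str:
    """Derive a stable, de-identified key from fields both agencies hold."""
    return hashlib.sha256(f"{name.lower()}|{dob}".encode()).hexdigest()[:16]

# Toy stand-ins for records from two agencies
childcare = [{"name": "Ana Lopez", "dob": "2014-05-02", "provider_rating": 4}]
school = [{"name": "Ana Lopez", "dob": "2014-05-02", "grade3_reading": 210}]

# Index one dataset by the derived key, then join the other against it
by_id = {link_id(r["name"], r["dob"]): r for r in childcare}
linked = [
    {**by_id[k], **r}
    for r in school
    if (k := link_id(r["name"], r["dob"])) in by_id
]
print(linked[0]["provider_rating"], linked[0]["grade3_reading"])  # 4 210
```

In practice such a key would be built from more fields and validated for collisions, but the same index-then-join pattern scales to the multi-agency merge the article describes.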
LFC Collected Data from Multiple Agencies to Determine the Effect of Childcare Assistance
LFC staff addressed a number of concerns in analyzing these data. Potential selection bias was a primary concern since this was not a randomized controlled study. To compensate, staff used propensity score analysis to minimize the impact of selection bias. The comparison group was matched on factors including school district, low-income status, English language learner status, and disability status. Another concern was the potential for a fade-out effect, meaning that by the time a student reaches third grade, the school or school district might account for more educational progress, or lack thereof, than participation in an early childhood program. To address this issue, staff chose to conduct hierarchical linear modeling (HLM) because it reflects the multilevel nature of the data: students are nested within classrooms, which are nested within schools, which are nested within school districts. Failing to consider the outcome data in each of these contexts, and the interactions among them, could have limited the understanding of contributing factors such as individual demographic characteristics, classroom organization, or school district policies and could have led to erroneous conclusions.
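As an illustration of the matching step, here is a minimal, self-contained sketch of propensity score matching on synthetic data. The covariates, simple logistic model, and nearest-neighbor rule are simplifications for exposition, not the LFC team’s actual specification (which also involved HLM):

```python
import numpy as np

# Illustrative propensity score matching on synthetic data. The real LFC
# analysis matched on district, income, ELL, and disability status.
rng = np.random.default_rng(0)
n = 200

# Synthetic covariates standing in for the matching factors
low_income = rng.integers(0, 2, n)
ell = rng.integers(0, 2, n)
disability = rng.integers(0, 2, n)
X = np.column_stack([low_income, ell, disability])

# Treatment (childcare assistance) is more likely for low-income families,
# which is exactly the selection bias matching tries to offset
treated = (rng.random(n) < 0.3 + 0.3 * low_income).astype(int)

# Step 1: estimate propensity scores P(treated | X) with a small logistic
# regression fit by gradient ascent (kept dependency-free on purpose)
Xb = np.column_stack([np.ones(n), X])
w = np.zeros(Xb.shape[1])
for _ in range(500):
    p = 1 / (1 + np.exp(-Xb @ w))
    w += 0.1 * Xb.T @ (treated - p) / n
scores = 1 / (1 + np.exp(-Xb @ w))

# Step 2: pair each treated unit with the comparison unit whose propensity
# score is closest (nearest-neighbor matching)
treated_idx = np.where(treated == 1)[0]
control_idx = np.where(treated == 0)[0]
matches = {
    t: control_idx[np.argmin(np.abs(scores[control_idx] - scores[t]))]
    for t in treated_idx
}
print(len(matches), "treated units matched")
```

Outcomes are then compared across the matched pairs rather than the raw groups, so differences in kindergarten readiness are less likely to reflect who selected into the program.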
Analysis showed that participation in childcare assistance increased family income and improved child health outcomes; however, there was little evidence that childcare improved school readiness. Furthermore, the quality rating of a childcare program did not affect any measured outcomes. This calls into question the value of the $64.5 million annual investment in differential reimbursement rates for quality in the childcare program.
Notably, important elements of a quality rating system proven to affect outcomes, such as teacher-child interactions, are not being tracked. In contrast, staff found that participation in New Mexico prekindergarten led to significantly higher scores in third-grade reading and math achievement. In subsequent longitudinal analysis using a similar methodology, staff also found New Mexico prekindergarten to be associated with increased high school graduation. These findings, consistent across both the HLM and propensity score matching analyses, prompted a recommendation that the state childcare assistance agency incorporate quantitative assessments of teacher-child interactions and classroom environment into the state’s quality rating system using validated measures and study how to include outcomes for childcare assistance as a metric of quality. Staff also recommended that the state continue to expand its prekindergarten program, as it is improving school readiness.
Sarah Dinces and Ryan Tolman are program evaluators at the New Mexico Legislative Finance Committee. Jon Courtney is deputy director at the New Mexico Legislative Finance Committee. Read the report.
Editor’s note: If your office would like to highlight its award-winning report in the Working Paper, please let Eric Thomas know.
Southwest Idaho Treatment Center | Ryan Langrill (Idaho)
The Southwest Idaho Treatment Center (SWITC) is Idaho's only state-run institution serving people with intellectual disabilities and behavioral issues. Its goal is to help stabilize clients in crisis and prepare them to return to the community. In March 2018, the Joint Legislative Oversight Committee directed the Office of Performance Evaluations (OPE) to evaluate SWITC. I was named the team lead and was the sole evaluator on the project (though I had help from two consultants and an intern).
At the start of the evaluation, SWITC had a troubled recent history. In August 2017 alone, two staff were found to have been physically and psychologically abusing clients; staff found a client six hours after he had died of an apparent suicide, even though protocol required checks every half hour; and another client left campus three times and was arrested after reportedly hitting a woman in the head with a rock thrown through her house window. After a second consecutive failed inspection, SWITC was at risk of losing certification and therefore Medicaid funding.
I am an economist by training and my instinct is to find or develop quantitative measures of performance. However, it was quickly clear that little quantitative data existed about SWITC's problems and I would need to take a qualitative approach. The best way I could think to do so was to embed myself within the organization.
The organization charged with the protection and advocacy of people with intellectual disabilities gave me their observation rubric, but most of the work was in earning people’s trust and understanding their position. I used several strategies to connect with staff.
I held office hours about three days a week and invited anyone to drop in. SWITC administration helped by giving me an unused office called the “fishbowl.” It had a large window facing into a hallway that funneled most people out of the administration building. The office hours allowed staff to drop in during their breaks and to communicate their concerns as they happened. It also let me interact organically with clients. One SWITC client visited me each week as part of a routine to check in and collect a new pen; another came in and took my lunch right in front of me.
The staff were very cooperative and committed to their work. One staffer celebrated their 45th work anniversary while I was there. To connect with staff, I went through direct care staff training, which included being physically restrained as part of their crisis intervention training. Each day I visited the four residential units, in two separate buildings, so people would know who I was. I spent full days with staff in each of the two residential buildings. I complemented informal observations and conversations with some formal interviews, with the assistance of an outside consultant, and with reviews of comparable training protocols from other institutions.
The campus is a reflection of the organization: four inhabited buildings stand among twice the number of abandoned buildings, including an old hospital which had been used only for active shooter drills over the past decade. While the campus used to be home to almost 1,000 people, by 2018 it had been reduced to only about 20. The reduction in size and transition away from the medicalization of treatment came as Idaho transitioned to a focus on person-centered and community-based treatment.
One of our consultants had toured the current residential buildings just after they were built in 2005; even back then, staff were frustrated because no one involved in operations had any input into the buildings’ design. I observed similar frustration from staff who felt that administration, siloed in its own building, made decisions without consulting them. To stakeholders, the living areas felt more like hospital waiting rooms than people’s homes. The buildings did not further SWITC’s goal of helping clients prepare for community living and put clients and staff at risk through poor lines of sight. The buildings were used not because they were clinically appropriate, but because they were available.
SWITC’s therapist described to me the trauma-informed treatment approach he had been seeking to implement during the evaluation. Common symptoms of trauma include despair and a loss of hope, reliving the traumatic event, closing oneself off from other people, and the erosion of one's identity. After that conversation, I realized these symptoms mirrored SWITC as an organization. The organization had been traumatized by the downsizing of two-thirds of the staff over the past decade as well as the terrible events of the previous year.
Findings and Recommendations
In the end, the report had some of the most negative findings we have written in my time at OPE. Writing the report was hard. The community had attributed SWITC's problems to the moral failings of the administration, the staff, or the clients. The organization was already traumatized. While we had to be critical, we didn't want to simply add to the trauma. Here were our main findings.
We made two core recommendations. First, that the Legislature direct the department to develop a long-term vision for serving people with developmental disabilities. Second, that SWITC develop and implement a formal strategic plan and quality improvement process.
We delivered the report to the director of the Department of Health and Welfare for a formal response three days after the director had been hired. In his response, he wrote that “these recommendations are absolutely on target” and committed to their implementation. The commitment wasn’t just from the top; everyone from the staff on up knew things weren’t working and wanted them to improve.
The department made our first recommendation part of their formal strategic plan. A group composed of legislators, state and local officials, advocates, and the parent of a SWITC client worked with the department to develop the vision. They plan to present it to the Legislature in 2021. The plan consists of turning SWITC into a two-tier treatment model: an acute care unit for when people are in crisis, and homelike apartments for clients to better transition into the community. It also consists of investing in community resources, including developing a license for a residential treatment model focused on serving people with autism.
Though we have not yet done a follow-up review, SWITC reports many internal changes that appear promising. They hired a recruiter, which improved the hiring of direct care staff. They doubled clinical staff and created a team of high-quality direct care staff dedicated to managing crises and training other staff while working. SWITC dedicated staff to quality control and created a position entirely dedicated to investigating allegations of abuse and neglect. They also created a committee, which includes direct care staff, to suggest safety improvements. I look forward to OPE conducting a full follow-up review to see how things have changed for clients and for staff at SWITC.
Ryan Langrill, Ph.D., is a principal evaluator with the Idaho Office of Performance Evaluations.
Lessons from the Life and Death of Alice Carter | Kathy Patterson (District of Columbia)
On Dec. 17, 2019, just shy of her 36th birthday, Alice Carter, a transgender woman with mental illness and a substance use disorder, drank herself to death on a freezing District of Columbia sidewalk. She was found unconscious, treated by paramedics, and taken to the hospital. She died there the next day, never having regained consciousness.
The next month, on Jan. 10, 2020, The Washington Post published a short guest opinion piece describing Alice’s turbulent life. The author proposed that policymakers analyze “the effectiveness of our community response to people with mental illness” to find lessons from “how we failed her.”
At that moment the Council for Court Excellence (CCE), a 40-year-old civic organization working to improve the District’s justice system, was beginning to draft a report for my office, based on the first-ever analysis of District agency data on substance use disorder services (SUDS) for individuals in the justice system. It was the third partnership between CCE and the Office of the D.C. Auditor (ODCA) and would recommend a more comprehensive set of collaborations across D.C. and federal agencies to provide more consistent support for a particularly vulnerable population.
That population clearly included Alice Carter. She suffered from sexual assault, physical abuse, and homelessness and had multiple interactions with the District’s criminal justice and social services systems over more than a decade and interventions with dozens of professionals – including social workers, medical care providers, and attorneys.
ODCA is a legislative audit shop created when the District of Columbia achieved a measure of home rule in the 1970s. I’m a former newspaper reporter and former three-term member of the D.C. Council, and was named D.C. Auditor at the end of 2014. In addition to the agency’s broad-ranging authority, I was given a welcome mandate by the D.C. Council to develop a fresh approach to the audit and analysis work of the office: to basically try new ways of informing elected officials and help achieve a stronger, more effective government. We’ve contracted for public opinion surveys, done “secret shopper” investigations, and have made use of the extensive subject-matter expertise found in the nation’s capital.
As an ex-journalist, one important question I bring to all our projects and teams prior to publication is this: So what? Why is this important? What are we really saying? How can what we put on paper have a positive impact on the residents and taxpayers we serve? The report we were about to publish on SUDS was a huge contribution to the public’s understanding of a difficult subject. The CCE team created and analyzed a person-level data set that matched data across five health and justice agencies: the Department of Behavioral Health, the Department of Corrections, the Department of Health Care Finance, the Office of the Chief Medical Examiner, and the Metropolitan Police Department. The analysis followed the contacts of justice-involved adults through the stages of assessment and treatment in the community, arrest, incarceration, and release back into the community. It was the first time such a dataset had been assembled and used in the District of Columbia to analyze the interrelationship of SUDS, justice-system involvement, and deaths.
According to federal government estimates, more than one in 10 D.C. adults have a SUD; in 2017 alone, the number of lethal overdoses approached 300. Individuals struggling with SUDs frequently become caught up in a revolving door of arrest, judicial proceedings, incarceration, release, and re-arrest. We found that while continuous care is rare, it can be successful for individuals and that continuity needs to be a consistent goal going forward.
To help tell the story presented by the comprehensive data and policy review, we proposed a second partnership with Street Sense Media (SSM), a nonprofit organization working to end homelessness in the D.C. area. Its executive director, Brian Carome, was the author of the guest piece in The Washington Post about Alice Carter. We contracted with SSM to research and detail the story of Alice’s life and death. Because Carome and his colleagues had known and worked with Alice and had been in touch with her mother, they obtained the family’s permission to review records. The combination of their personal knowledge and ODCA’s authority to access public records, plus willing collaboration from most of the District agencies, including the D.C. Superior Court, allowed the SSM team to put together a searing and compelling story. The report put Alice’s very human face on incarceration policies and practices impacting persons with mental illness and SUDS in Washington, D.C.
In August 2020 we published both reports: Lessons from the Life and Death of Alice Carter and the CCE report, Everything is Scattered: The Intersection of Substance Use Disorders and Incarceration in the District.
We got considerable press attention primarily based on the case study of Alice Carter’s life. Both partners—Street Sense and CCE—continue to press for the reforms outlined in the two reports. For my office, the companion reports were an example of making good use of community resources including the nonprofit closely engaged in the lives of the District’s homeless population and committed to improving the lives of those on the street. We continue our partnership with CCE with a new initiative this year that will produce public forums on a range of justice issues that we will turn into reports and podcasts. With this particular partnership, I am able to tell the District’s legislators—my bosses—that I have gotten great value from taxpayer money by contracting with an organization that brings together representatives in the legal, business, and social services world who give their own time to produce research and policy recommendations.
Meanwhile, as we follow up on our report recommendations, we will continue to honor Alice Carter by seeking to help others who struggle as she did.
Kathy Patterson is the District of Columbia auditor. The phone number for the Office of the D.C. Auditor is (202) 727-3600.
An Interview with Mike Powell
Prior to becoming director of the newly created Office of Program Evaluation and Government Accountability (OPEGA), Michael Powell had a career in state and local government, including time with Baltimore’s CitiStat office, which used performance information to manage and assess public sector agencies. Subsequently, he worked as a consultant, much of that work helping nonprofit clients develop performance management systems.
OPEGA, created by the Maryland Legislature in 2019, is intended to provide the Legislature with a more robust accountability function. The existing Office of Legislative Audits is largely focused on financial audits; the new office will allow the state to assess the outcomes of government spending. Powell will report to a 20-member Joint Audit and Evaluation Committee, which will direct the office’s workplan and hear completed reports.
The office is staffed by Powell and two employees, one of whom was hired in April in the midst of the pandemic. He’d like the office to grow to around 10 staff in the future. The office plans to release its first evaluation in September: a review of criminal activity among individuals supervised by parole and probation.
He looks forward to building a network of peers through NLPES. He appreciates resources such as the newly-created library of reports and help through listserv emails.
Mike Powell is the director of the Maryland Office of Program Evaluation and Government Accountability.
Linda Triplett has bid a final adieu to Mississippi’s Joint Legislative Committee on Performance Evaluation and Expenditure Review. On Nov. 30, 2020, she retired after 41 years with PEER.
Not only was Linda a valuable employee to PEER, but she was very active in NLPES. She served as a member of the NLPES Executive Committee for six years and was the NLPES chair during 2018. Linda was elected to the NCSL Executive Committee in 2018 and served on it until she retired.
Linda was instrumental in working with the Pew-MacArthur Results First Initiative to advocate for evidence-based programs and to improve agencies’ performance accountability. She was a member of the Evidence Informed Policymaking Work Group that assisted NCSL in the establishment of the Center for Evidence-Based Policy.
It’s now time for Linda to sit back, relax and enjoy. And she deserves it!
Social Media Is Coming to NLPES!
In an effort to provide more ways for staff to share information and engage with each other, the Executive Committee is establishing a LinkedIn account for NLPES member offices. We’ll provide more information through the listserv as we get closer to launching, so be on the lookout.
They Are Just Around the Corner
What is just around the corner, you ask? Well, spring is for one … hooray! But what else? It’s almost time for 2021 NLPES awards and Executive Committee elections. Expect to receive more about these in the near future via the NLPES listserv.
Now available: Who We Are and What We Do: A National Survey of State Legislative Program Evaluation/Performance Audit Programs—What’s the most common internal performance measure for evaluation shops? How many offices tweet? What percentage of staff has fewer than 10 years of experience? How can you contact a sister office in another state? Find out in Who We Are and What We Do, formerly Ensuring the Public Trust, which provides descriptive information about the legislative offices that conduct performance audits and program evaluations.
Report Library Update—The NLPES Audit Report Library is a topical listing of recent reports that may be of interest in your work. The library pilot went live in the summer, and based on the positive response it received, we are working to expand the library from the reports released by the agencies represented on the NLPES Executive Committee to reports from all member agencies. To facilitate the expansion and the ongoing posting of reports, the Executive Committee has asked member offices’ directors to appoint Report Library Liaisons to support these efforts. The library will include a topical listing of all reports released by member agencies beginning in 2018. Because of some data limitations, it is not a database; reports must be accessed through each agency’s website, which is listed with the report name in the library. The Executive Committee is also reviewing the suggestions for improvements received after the library went live. Ideally, we hope to work toward a more user-friendly, high-tech version of the library in the future. In the meantime, as the library continues to grow and you use it more, please send any suggestions for improvements to Emily Johnson. The goal is to make the library useful to our members, and it will continue to evolve based on their needs.
NLPES listserv—The NLPES listserv is an email discussion group for NLPES members. Authorized listserv users can query other states about evaluation work, receive announcements about performance evaluation reports and job opportunities from other states, and are notified when the latest edition of this newsletter is available! To join the listserv, send an email to Brenda Erickson, NCSL liaison to NLPES, with the subject “SUBSCRIBE to NLPES Listserv.” Include your name, job title, legislative audit agency/organization name, mailing address (including city, state, zip code), phone number and email address. A “Welcome” message will be sent to you once you are successfully added to the listserv. See the listserv link on the NLPES website for additional information on how to post messages and “netiquette” niceties, like not hitting “Reply All” when answering a query posted to the listserv.
Website staff photos—Take a look at the NLPES website homepage for the new photo ribbon featuring some of your colleagues expressing why NLPES is important to them.
The Working Paper is published two times a year by the National Legislative Program Evaluation Society, a professional staff association of the National Conference of State Legislatures.
The Working Paper is produced by the NLPES Communications Subcommittee:
Emily Johnson (Texas), 2020–2021 chair
Eric Thomas (Wash.), newsletter editor
Patricia Berger (Pa.), member
Darin Underwood (Utah), member
NCSL Liaison to NLPES:
NCSL Denver Office