Data Plays a Critical Role in Regulatory Innovation
Many states are rethinking how legal services are delivered and regulated in an effort to close the justice gap and connect more people with the help they need when they experience a legal issue. Whether implementing a regulatory sandbox, eliminating Rule of Professional Conduct 5.4 to allow alternative business structures to participate in the legal market, or waiving local unauthorized practice of law rules to create an allied legal professional or community-based advocate program, many states are taking meaningful steps toward regulatory innovation.
The two most commonly cited goals for doing so are 1) to increase access to affordable legal help for people who currently don’t receive it, and 2) to foster innovation. The typical process that states engage in when considering how to re-regulate the delivery of legal services includes creating a working group, proposing a framework for regulation, developing a plan for implementation, and then officially launching the program. One critical consideration, often missing from the process or treated as an afterthought, is creating a data and evaluation plan.
Rigorous data collection and evaluation is critical to understanding whether a state is meeting its stated goals. Without it, proponents and opponents alike can only speculate about the effectiveness of implemented regulatory reforms. When states are having conversations around their stated goals for regulatory reform, they should also discuss how they will measure the success of the program.
The Data We Have So Far
Thankfully, some states have thought about data collection to some degree at the outset of proposing and launching their respective programs. The stated objective of Utah’s regulatory sandbox is “to ensure consumers have access to a well-developed, high-quality, innovative, affordable, and competitive market for legal services.” This regulatory sandbox model was built largely on the model IAALS developed and published in 2019. At the outset, the Utah Supreme Court recognized the importance of data collection and program evaluation, and has tasked IAALS with serving as an independent third-party evaluator for the sandbox.
The Utah sandbox just turned three. In connection with this milestone, IAALS is conducting an interim evaluation of the sandbox, which we anticipate publishing toward the end of 2023. This is part of a broader, longer-term evaluation effort that we expect to publish toward the end of the sandbox’s pilot program, which is set to expire in 2027. The Utah Office of Legal Services Innovation also publishes monthly activity reports.
Utah also has a Licensed Paralegal Practitioner (LPP) program. In the fall of 2021, Ashton Ruff, Anna Carpenter, and Alyx Mark released Utah’s Licensed Paralegal Practitioner Program: Preliminary Findings and Feedback from Utah’s First LPPs.
Arizona has also created evaluation plans and shared data, including:
- Board of Nonlawyer Legal Service Providers’ 2022 Annual Report on the Status of the Legal Paraprofessional Program, released in April 2023
- Annual Report of the Committee on Alternative Business Structures to the Arizona Supreme Court, released in April 2023
- Board of Nonlawyer Legal Service Providers’ 2021 Annual Report of the Status of the Legal Paraprofessional Program, released in April 2022
- Annual Report of the Board of Nonlawyer Legal Service Providers to the Arizona Supreme Court, released in April 2021
Legal Innovation After Reform: Evidence from Regulatory Change, released by David Freeman Engstrom, Lucy Ricca, Graham Ambrose, and Maddie Walsh in September 2022, provides a more in-depth analysis of the data collected from entities in the Utah sandbox and alternative business structures in Arizona.
Minnesota launched a Legal Paraprofessional Pilot Project in September 2020. The data collected and shared so far:
- Standing Committee for the Legal Paraprofessional Pilot Project Interim Report and Recommendations to The Minnesota Supreme Court, released March 2023
- Standing Committee for Legal Paraprofessional Pilot Project Interim Report and Recommendations to the Minnesota Supreme Court, released December 2021
Washington launched a Limited License Legal Technician Program in 2015, the first program of its kind in the U.S. The program was controversially shuttered and stopped taking new applicants in 2020, though active LLLTs and people who had started down the licensure pathway were permitted to continue practicing and pursuing licensure. The data collected and shared so far:
- The Surprising Success of Washington’s Limited Licensed Legal Technician Program, released in April 2021
- “Preliminary Evaluation of the Washington State LLLT Program,” released in March 2017
What the Data Tells Us
The regulatory innovation movement is still nascent, and we therefore only have a limited amount of data upon which to draw conclusions. But at a high level, here’s what we know so far:
We know that new entities and professionals who have been permitted to practice law through regulatory reform efforts have delivered tens of thousands of legal services to people experiencing legal issues. And we know that the rate of complaints and disciplinary actions for entities and professionals operating under a regulatory change is lower than the rate for lawyers operating in the traditional legal market. We know that innovative delivery models leveraging technology and other legal professionals are being developed with plans to scale, and we know that state supreme courts and other leaders continue to be inspired by this progress and launch new efforts each year. If states continue to develop and execute data collection and evaluation plans, we’ll have additional data, and additional conclusions we can draw, as time goes on.
Tips for Creating a Data Collection and Evaluation Plan
While we are grateful for the data we have thus far, it’s not enough to answer some of the most common and critical questions:
- Client protection: Do consumers of legal services delivered through (insert regulatory innovation model here) receive outcomes that are similar to or better than similarly situated legal consumers of traditional legal services?
- Client protection: Do consumers of legal services delivered through (insert regulatory innovation model here) receive better outcomes than similarly situated self-represented litigants?
- Consumer satisfaction: Are consumers of legal services delivered through (insert regulatory innovation model here) satisfied with their experience and outcome, and how does their satisfaction compare with the satisfaction of similarly situated consumers of traditional legal services and self-represented litigants?
It’s important to note here that even if states engaged in regulatory reform efforts had this data with respect to the new regulatory models, we’d only be halfway toward our goal. No state currently collects data on legal outcomes and consumer satisfaction for traditional legal services.
For states that have yet to pursue regulatory reform, or that have pursued it but have not yet given thought to data collection, here are a few high-level tips for getting started:
- Include evaluation considerations in even the earliest discussions about developing, proposing, or implementing any reform. When data collection is an afterthought, the quality and quantity of the data suffers, as do the conclusions we can empirically draw.
- Determine what questions your evaluation data needs to answer—these are the questions that are critical for understanding whether a given reform is achieving the desired outcomes and impacts.
- Work backwards from there to identify the data you’ll need to collect to answer those questions, as well as the anticipated source for each element of data.
- Define the data collection strategies and processes that you’ll need to implement to collect the requisite data from each identified data source; this includes identifying who will be responsible for data collection.
- Determine whether the data will be shared publicly.
- Budget for data collection and evaluation from the start, and ensure that sufficient resources are allocated to the evaluation effort.
- Keep in mind that evaluation is a marathon rather than a sprint: in any new program, enough time must elapse for an appropriate amount of data to be collected from which conclusions about outcomes and impacts can be drawn.
Empirical evaluation is key to understanding whether regulatory reforms are meeting their stated goals—indeed, it is the only way we can know with any degree of certainty whether our innovations are working. As states continue to consider, develop, and launch new regulatory efforts, it’s important to discuss a data and evaluation plan at the outset. But it’s not too late for existing efforts that don’t currently have a plan in place to get started! There is still tremendous value and insight to be gained from collecting data a few years in. The more data we can collect and analyze as a movement, the faster we can iterate and achieve our regulatory innovation goals.