Survey Reveals How Judges Would Improve the Ways They Are Evaluated

Kelsey Montague
Associate Director of Marketing and Public Relations
July 2, 2024

Results will inform new recommendations for state-level judicial performance evaluation designed to benefit both judges and the public they serve. 

IAALS, the Institute for the Advancement of the American Legal System at the University of Denver, today released the results of an eight-state survey of 658 judges designed to understand their perspective on judicial performance evaluation (JPE) processes that assess their job performance. Part of IAALS’ JPE 2.0 project, National Perspectives on Judicial Performance Evaluation shows that while judges generally believe JPE programs are important and feedback is welcome, there are many opportunities for improvement when it comes to the fairness and reliability of the process.

“These critical insights will inform new national recommendations, coming later this year, for improving JPE to better serve both judges and the public who appear before them in courtrooms and who often vote for them in elections,” said Danielle Kalil, IAALS’ Director of Civil Justice and the Judiciary.

Across the study states (Alaska, Colorado, Hawaii, Idaho, Massachusetts, New Mexico, Utah, and Virginia), over two-thirds (68.1%) of judges reported that they were satisfied with the JPE process in their state. Majorities of judges agreed that JPE is a fair assessment of their strengths and weaknesses (59.1%), increases judicial accountability to the public (63.6%), and helps their professional development (72.1%). However, judges tended not to believe that JPE increases judicial independence (26.6%) or helps the public understand the work that judges do (33.5%).

Overall, judges found most tools used to assess their performance to be helpful, including surveys of jurors (94.5% of judges found helpful), surveys of court staff (90.6%), reports from courtroom observers (86.6%), review of written orders and opinions (84.2%), and surveys of attorneys (83.9%). They also viewed the final evaluation report positively, finding it to be a useful format (79.8%), easy to understand (88.5%), and an accurate assessment of their performance (72.0%).

While the numbers paint a picture of a judiciary largely satisfied with the way it is evaluated, judges’ comments and survey responses also point to many serious, legitimate concerns and areas to explore for improvement, including:

  • Fear and stress surrounding the process
  • Reliability of survey data
  • Gender and racial bias, as well as political agendas, impacting results
  • Inability to provide responses or context to critical comments
  • Lack of public awareness about JPE
  • Negative effects on judicial independence
  • Perception of JPE as a popularity or personality contest
  • Lack of recourse for judges who feel a result is unfair
  • Lack of accountability for evaluators
  • How commissions interpret evaluation results

“To date, research on how best to evaluate judges has not included this much-needed perspective of judges across states, which is critical for well-rounded and useful improvements to state evaluation programs,” said the Honorable Adam J. Espinosa, Denver District Court Judge. “If judges do not have confidence in the evaluation process, they are unlikely to modify their behavior in response to evaluation feedback, which undermines the process and public trust in the judiciary. These judges’ perspectives on JPE present a vision of a fairer, more effective, and more useful evaluation process.”

Later in 2024, IAALS will release new, comprehensive recommendations for improving and modernizing judicial performance evaluation nationwide. In addition to this survey of judges, IAALS has convened a task force to guide this work and has tapped nearly 100 diverse, national experts and stakeholders to assess and reinvigorate JPE from all perspectives and for all involved—the judges, the program administrators, the attorneys, and the public.

“The JPE 2.0 project represents a reimagining of the judicial performance evaluation process for the 2020s and beyond,” said Jordan Singer, Professor at New England Law | Boston and Chair of the task force. “Including the judiciary’s perspectives through this survey will provide a richer and more robust set of recommendations.”