
Judicial evaluations aim to preserve fair, impartial and accountable courts

Issue: January 2007

Once again, Massachusetts can be proud of its judicial report card.
In March 2005, the MBA launched an online judicial performance evaluation system that enables lawyers across the commonwealth to assess the performance of the state’s judges. The goal of the initiative is to identify successes and shortcomings among the state’s judges and to encourage educational and enhancement programs for the judiciary.

Participants assess individual judges in 19 performance areas, including impartiality, knowledge, punctuality, preparedness, communication skills, courtesy and temperament.

According to Judiciary Evaluation Committee Chairman Edward W. McIntyre, "Overall, the results are exceptionally favorable to the judiciary," with 98 percent of the Superior Court judges receiving "exceptional" ratings during the most recent rating period.

Every judge who has been evaluated has received his or her evaluation — the actual surveys, not a compilation of data. At the same time, each chief justice has received a copy of the same evaluations for the judges working in his or her department.

Chief Justice for Administration and Management Robert A. Mulligan also receives data on the submitted evaluations on a regular basis.

To ensure confidentiality and to prevent an evaluator’s name from being disclosed, the MBA uses an anonymous system: judges’ and attorneys’ names are encrypted when they are entered and processed through the system. The mailing is anonymous as well; forms are sent in envelopes marked by numbers rather than names.

But if the judicial evaluation committee finds that a particular judge’s number comes up with a "seriously deficient" rating, the committee can ask the MBA general counsel for the key to decrypt the identities. The committee would then contact the judge and the chief justice of that trial department.

A slight modification has been made since the program’s inception to further protect the anonymity of participants: lawyers are no longer required to cite a specific date or event in their evaluations, which could have allowed a judge to identify the lawyer submitting the evaluation.

According to McIntyre, while the Superior Court and Juvenile Court judges were rated exceptional, the Trial Court departments that handle the most emotionally charged issues were evaluated more severely than the Superior Court.

Information from Chief Justice Mulligan, District Court Chief Justice Lynda M. Connolly and Probate and Family Court Chief Justice Sean M. Dunphy indicates that the MBA’s results mirror the data from the court’s own evaluation process. "Their data shows the judges are performing exceptionally in some of the Trial Court departments, and in other Trial Court departments, not as well. But overall, the judiciary is performing very well in the commonwealth according to both systems," said McIntyre.

Supreme Judicial Court’s evaluation procedure

Since 2001, the Supreme Judicial Court has conducted its own judicial performance program in the Trial Court. Attorneys, jurors and court employees assess judges’ work performance in areas such as demeanor, legal knowledge, courtroom management and timeliness of decisions through written questionnaires. All questionnaires are confidential and do not request the names of respondents. The resulting reports are also confidential and are given only to the judge being evaluated and to the appropriate chief justices to review with the evaluated judge.

Since the program began in 2001, judges have been evaluated in every county in the commonwealth, and judges are now being evaluated for a second time.

Superior Court Judge Janet L. Sanders, who chairs the Judicial Performance Evaluation Committee, was pleased with what she termed the "validity" of the SJC’s evaluation program. According to Sanders, the first round of evaluations under the SJC’s program included responses from 5,800 attorneys, for a 30 percent response rate. Each judge was evaluated, on average, by 126 attorneys. An evaluation was not considered valid if 25 or fewer lawyers evaluated the judge. Out of 300 trial judges, fewer than 10 had that few responses.

Employees and jurors were also given an opportunity to evaluate the judges, although the response rate for employees was lower than that for attorneys, and jurors’ responses tended to be less substantive, according to Sanders.

While the committee does not see the individual results, "the responses are overwhelmingly positive. A large number of our judges do well in these evaluations," said Sanders.

Although some judges had misgivings about the SJC evaluation program when it started, Sanders said several judges have commented that they have learned a great deal about their performance and have taken affirmative steps to change the way they do business.

"It has had a positive impact. We all want to do the best job we can. We’re professionals, so we take these evaluations seriously. We have all come to accept it as a necessary measure of accountability on our parts," said Sanders.

In response to judicial feedback requesting more constructive information, the SJC evaluation survey has been slightly modified since its inception. The survey now includes demographic information about the respondent: the number of years in practice, the percentage of the attorney’s practice devoted to litigation, and how much time the attorney has spent appearing before the judge being evaluated.

In addition, the evaluation now asks for a narrative response to explain both low and high ratings, and asks whether a particular practice of the judge had any effect on the rating. Finally, the evaluation includes a question about whether there has been any change in the judge’s performance since his or her last evaluation.

One advantage the SJC evaluation program has over the MBA’s program is that the SJC can use Trial Court-generated lists to send targeted mailings of questionnaires to lawyers who have appeared at least twice before the judge being evaluated. The MBA does not have the ability to target respondents.

In addition, the SJC has the means to ensure a lawyer can evaluate a judge only once. "We take precautions so that the forms can’t be duplicated; you can’t stuff the ballot box to skew the results," said Sanders.

Confidential v. public evaluation

Despite two constructive judicial evaluation programs already in place in Massachusetts, Lawyers Weekly has launched a third program with a distinguishing aspect that is creating debate among the bench and bar: Lawyers Weekly’s evaluations of individual judges will be made public.

According to David L. Yas, publisher and editor-in-chief for Massachusetts Lawyers Weekly, the primary motivating factor in publishing the evaluations of individual judges is to make lawyers better informed, in keeping with the paper’s mission.

"This state does not elect its judges; currently there is no public evaluation system, no retention system, and no term-limits for judges. This state has very strict rules for judges to defend themselves when attacked publicly. They can barely say a word. It adds up to an uncomfortable situation with the public, the media and the public’s perception of judges," said Yas.

Yas predicts the Lawyers Weekly project will get more information to the public, let judges defend themselves and show the public in a tangible, definitive way that judges are doing excellent work.

A primary concern of those who oppose publication of individual results is that other media outlets will misrepresent them.

Recent history has demonstrated to McIntyre that secondary media cull information from a legitimate source’s data "and it ends up as fodder for particular columnists in the Boston media," he said. "We have concerns as to what happens to the data as it is further translated for popular consumption. We never want our evaluations to have a chilling effect upon a judge doing his or her job."

"Rightly or wrongly, the media, and therefore the public, tends to focus on the negative rather than the positive," said Sanders. "When I say our results were overwhelmingly positive, the story won’t be about the 90 percent who did well; it will focus on the negative. Nasty comments that are more colorful and interesting; that will get the attention. That kind of focus doesn’t promote public confidence in the judiciary and it is important that the public have confidence in this institution."

Yas concedes that the public gets its information about judges from the mainstream media and quite often those reports are negative. "There are too many citizens who only hear about judges when those judges are being criticized. That’s a shame," said Yas. "It’s a function of this uncomfortable situation where judges can’t defend themselves and point to any definitive measure of their success."

But most people’s job performance evaluations, positive or negative, are not posted on the Internet for everyone to see. Why should judges’ evaluations be posted?

"We don’t sign up to serve the public," said Yas. "For public servants doing a job as important as they do, it’s something they should expect."

"It doesn’t matter whether you are a public or private employee," countered Sanders. "[The purpose of an evaluation] is to improve the employee’s performance. [Negative evaluations] have a detrimental effect on public confidence and on achieving the purpose you’re trying to achieve — to improve performance, not shame them into changing."

"We’ve taken the time to develop a fair survey. We are committed to handling the information in a responsible way," said Yas. "If certain subpar judges — and there are very few of them — get called out for that performance, we don’t see anything wrong with that; it will encourage the judge to improve his performance."

McIntyre would rather rely on the court’s own enhancement programs, including videotaping and mentoring, to improve the judiciary. "There are other means of achieving the goal of remediating aberrant behavior; publicizing a judge is not the way to remediate behavior or performance," he said.