Associated Incidents
During the first quarter of this year, the Supreme Court received a procedural document with so-called hallucinations, i.e. fabricated information. This is what Supreme Court Justice Toril Marie Øie tells Advokatbladet.
The procedural document, which was filed in connection with an appeal over a procedural issue in a civil case, contained both laws and preparatory documents that simply do not exist, Øie says.
That these were fabricated sources was discovered internally when the appeal was reviewed by the Supreme Court.
The investigators who prepare the cases that are to be considered by the Supreme Court routinely review the central legal sources referred to in procedural documents from the parties.
- The procedural document both referred to various legal sources and reproduced quotes from them. When our investigator tried to locate these sources, it turned out that they did not exist. The lawyer had simply not checked the sources produced by the AI before the procedural document was filed, says Øie.
Surprised
This is not the first time artificial intelligence has hallucinated. Fictitious legal sources have already appeared before courts in the USA. At the end of March, it was also revealed that a report on the school structure in Tromsø contained fictitious sources.
However, this is the first time the hallucination issue has come to the fore in the Supreme Court of Norway.
Øie says that she was surprised by the incident:
- I was surprised that the sources had apparently not been checked at all, she says.
Sanctions may be applicable
- What consequences has the incident had for the lawyer responsible?
- When the matter was discovered, we raised it with the lawyer in question. The Supreme Court can impose a procedural fine or bring cases before the Bar Council. These remedies have not been used in this case, but I cannot rule out that they will be in other cases in the future, says Øie.
She says that the Supreme Court is currently focused on emphasizing the responsibility that lawyers have.
- Lawyers differ and have different support structures around them. Not all of them work in large firms with extensive support systems and a lot of internal guidance. The important thing for us now is therefore to provide good information and to clearly communicate our expectations to lawyers, says Øie.
- If procedural documents with hallucinations become a problem in the future, we must consider using the sanction options we have. My expectation of lawyers is that this will not become a problem, she continues.
Artificial intelligence does not exempt lawyers from liability
Øie believes that artificial intelligence can be an appropriate tool, but at the same time reminds us of the responsibility lawyers have if they choose to use it.
- The Supreme Court is aware that artificial intelligence is here to stay, and that it can be a useful tool for lawyers. At the same time, lawyers must be aware of their responsibility. Lawyers bear the same responsibility for the presentation of the case, both factually and legally, whether they use artificial intelligence or not.
She points out that hallucinations are a well-known phenomenon, and that this is a problem lawyers must be aware of.
- We have a clear expectation that lawyers only use AI in areas where they themselves have enough expertise to verify that the content is correct. The lawyer must, as usual, ensure that key sources are presented correctly, both that they actually exist and that the content reproduced from them is accurate.
Changes to the Lawyers' Guide
On Thursday, the Supreme Court will publish an updated version of the Lawyers' Guide with a new point on the use of AI tools.
The guide is intended to prepare lawyers for appeal proceedings in the Supreme Court.
- We want to convey what expectations we have for lawyers who will use AI in the future. The Lawyers' Guide is our most important channel for communicating with lawyers. We know that it is widely used by lawyers who appear before the Supreme Court and the lower courts. That is why we are updating it now.
The new point states, among other things, that "it is known that AI can hallucinate and invent, among other things, legal sources. Sources and other information provided by AI must be quality assured. AI should only be used in areas where the lawyer has sufficient competence to control the content".
Shouldn't come as a surprise
Leader of the Norwegian Bar Association, Siri Teigum, is not surprised by the issue.
- It was expected that something like this would happen in Norway at some point, but I am very surprised that it is happening in a procedural document to the Supreme Court. It is really unfortunate, she says.
Teigum welcomes the Supreme Court's update of the Lawyers' Guide on this point, but emphasizes that its content should not come as a surprise to lawyers.
- The same requirements for quality and ethics apply, regardless of whether lawyers use artificial intelligence or not. I think it is positive that the Supreme Court mentions this in its guide, but this is something that a lawyer should understand even without such clarification.
Teigum says that the Norwegian Bar Association is working to ensure that this does not happen again.
- The Norwegian Bar Association wants to contribute to ensuring that this does not happen again. We have regular courses on AI, and the Norwegian Bar Association's Committee on Legal Ethics is currently preparing a guide on the use of artificial intelligence.