UK Law Society Gazette Opinion Piece: Lawyers: beware AI’s hallucinations

New Zealand Law Society's recent weekly news update carried a story which should concern us all. The Law Society's library has been receiving requests from lawyer members for cases cited by ChatGPT, the much-discussed AI chatbot. The cases are packaged to look like real cases, with proper citations – the technology has learned what case names and citations look like. The only problem is: the cases don't exist.

It is called hallucination when AI invents facts with total confidence. The outputs sound plausible, but are utter inventions. It seems that the current forms of AI chatbot, so over-hyped, are much given – like 60s hippies tripping out – to frequent bouts of hallucination.

Examples crop up in the news all the time now.

For instance, one UK broadsheet newspaper reports that it is beginning to receive requests for archived material that it cannot provide – because the articles cited by ChatGPT do not exist.

Or there is the case of the US law professor falsely accused by a chatbot of sexually harassing students, on the strength of a newspaper article which did not exist (never mind that he did not teach at the university cited, nor had he ever been on the trip where the misconduct was alleged to have taken place).

There are various reasons, apparently, for the hallucinations: how the AI is trained, the data to which it has access, problems with encoding, or biases. But that doesn't matter to us lawyers – that is for AI engineers to resolve. We need to know only that it happens, and with increasing frequency. (I have personal experience: in answer to a question I posed, ChatGPT confidently cited research studies which did not exist.)

Regulation is coming. The Italian data protection authority last week imposed a temporary ban on ChatGPT over potential infringement of EU data privacy laws. Complaints are arising in other EU member states, too, and there is talk of action at EU level. AI companies need a legal basis to collect and use personal data, must be transparent about how they use it, must keep personal data accurate, and must give people a right to correction. With so much hallucination, how will that work?

Read more 

https://www.lawgazette.co.uk/commentary-and-opinion/lawyers-beware-ais-hallucinations/5115682.article