
We asked ChatGPT for legal advice – here are five reasons why you shouldn't

At some point in your life, you'll likely need legal advice. A 2023 survey conducted by the Law Society, Legal Services Board and YouGov found that two-thirds of respondents had experienced a legal problem in the previous four years, with the most common problems relating to employment, finances, benefits and consumer issues.

But not everyone can afford legal advice. Of the survey respondents with legal problems, only 52% received professional help, 11% received support from other people such as family and friends, and the rest received no help at all.

Many people search the internet for legal help. And now that we have access to artificial intelligence (AI) chatbots such as ChatGPT, Google Bard, Microsoft Copilot and Claude, you might be tempted to ask them a legal question.

These tools rely on generative AI, which produces content when presented with a question or instruction. They can explain complicated legal information quickly and simply, but are they accurate?

We put chatbots to the test in a recently published study in the International Journal of Clinical Legal Education. We entered the same six legal questions on family, employment, consumer and housing law into ChatGPT 3.5 (the free version), ChatGPT 4 (the paid version), Microsoft Bing and Google Bard. The questions were ones often put to us in our free online legal advice clinic at the Open University Law School.

We found that while these tools can give legal advice, the answers are not always reliable or accurate. Here are five common mistakes we noticed:

1. Where does the law come from?

The chatbots' initial responses were often based on American law. This was often not stated explicitly or was not obvious. Without legal knowledge, a user would likely assume the law of their place of residence applied. The chatbots sometimes failed to explain that the law differs depending on where you live.

This is especially complex in the UK, where laws differ between England and Wales, Scotland and Northern Ireland. For example, the law on renting a house in Wales differs from that in Scotland, Northern Ireland and England, while Scottish and English courts have different procedures for dealing with divorce and the dissolution of a civil partnership.

When necessary, we asked a follow-up question: "Is there any English law covering this problem?" We had to use this prompt for most of the questions, and the chatbot would then provide an answer based on English law.

2. Outdated law

We also found that the answers to our questions sometimes referred to outdated laws that have been replaced by new ones. For example, divorce law changed in April 2022, abolishing fault-based divorce in England and Wales.

Some responses referred to the old law. AI chatbots are trained on large amounts of data – we don't always know how current that data is, so they may not be able to take account of the latest legal developments.

Consulting a lawyer may be a better option than using AI, if you have access to one.
Redpixel.pl/Shutterstock

3. Bad advice

We found that most of the chatbots gave incorrect or misleading advice on the family and employment questions. The answers to the housing and consumer questions were better, but there were still gaps. Sometimes they missed really important aspects of the law, or explained it incorrectly.

We found that the AI chatbots' responses were well written, which could make them seem more convincing than they are. Without legal knowledge, it is very difficult for someone to judge whether an answer is correct and applies to their individual circumstances.

Although this technology is relatively new, there have already been cases of people relying on chatbots in court. In a civil case in Manchester, a claimant who represented themselves in court reportedly presented fictitious legal cases to support their argument. They said they had used ChatGPT to find the cases.



4. Too general

In our study, the responses were not detailed enough for the reader to understand their legal problem and how to solve it. The responses provided information on a topic without specifically addressing the legal question.

Interestingly, the AI chatbots were better at suggesting practical, non-legal solutions to a problem. While this can be useful as a first step towards resolving an issue, it doesn't always work, and legal action may be necessary to enforce your rights.

5. Pay to play

We found that ChatGPT 4 (the paid version) performed better overall than the free versions, which risks further widening digital and legal inequality.

Technology is evolving, and there may come a time when AI chatbots are better able to provide legal advice. Until then, people need to be aware of the risks they take when using them to solve their legal problems. Other sources of help, such as Citizens Advice, provide up-to-date, accurate information and are better placed to assist.

All of the chatbots answered our questions, but caveated their responses by saying it was not their job to provide legal advice, and recommended seeking professional help. Having conducted this study, we recommend the same.
