According to The Guardian, a judge has imposed a $5,000 fine on New York attorney Steven Schwartz, his colleague Peter LoDuca, and their law firm Levidow, Levidow and Oberman, for submitting legal filings containing fake case citations generated by ChatGPT. Schwartz had used the AI tool to find and review precedents for a lawsuit against Colombian airline Avianca, alleging injuries sustained during a flight to New York City. However, six of the citations ChatGPT supplied, including “Martinez v. Delta Airlines” and “Miller v. United Airlines,” were either nonexistent or misrepresented.
Judge P. Kevin Castel, in justifying the penalty against Schwartz and his associates, stated, “Technological advancements are commonplace, and it is not inherently improper to employ a dependable artificial intelligence tool for assistance. However, existing regulations impose a gatekeeping responsibility on attorneys to ensure the accuracy of their submissions.” In other words, lawyers may use ChatGPT or similar tools, but they must verify the information those tools provide. By failing to do so, the attorneys had “abdicated their obligations,” and they continued to stand by the fabricated references even when the court questioned their legitimacy.
Inaccuracies from ChatGPT and other AI chatbots are well documented. For instance, the National Eating Disorder Association’s chatbot offered dieting advice to people recovering from eating disorders, and ChatGPT falsely accused a law professor of sexual assault, citing a nonexistent article from The Washington Post as evidence.
Frequently Asked Questions (FAQs) about legal source verification
Q: What were the lawyers fined for regarding their use of ChatGPT?
A: The lawyers were fined $5,000 for including fake case citations generated by ChatGPT in their legal filings.
Q: Which law firm and attorneys were penalized for their reliance on ChatGPT?
A: The law firm Levidow, Levidow and Oberman and attorneys Steven Schwartz and Peter LoDuca were fined for their use of ChatGPT.
Q: What was the case where ChatGPT’s generated citations were found to be inaccurate or nonexistent?
A: The case involved a man suing Colombian airline Avianca for injuries sustained on a flight to New York City.
Q: What explanation did the judge provide for imposing the fine on the lawyers?
A: The judge stated that while using AI tools like ChatGPT for assistance is acceptable, attorneys have a responsibility to ensure the accuracy of their filings and should verify the claims made by such tools.
Q: Are there other instances of inaccuracies or misuse of AI chatbots mentioned in the text?
A: Yes, the text mentions examples such as a chatbot from the National Eating Disorder Association providing dieting tips to individuals in recovery and ChatGPT falsely accusing a law professor of sexual assault using a nonexistent article as evidence.
More about legal source verification
- The Guardian: US lawyers fined $5,000 after including fake case citations generated by ChatGPT: The original article reporting on the lawyers’ fine for using ChatGPT.
- National Eating Disorder Association: The official website of the National Eating Disorder Association mentioned in the text.
Comments
This just goes to show how important it is to double-check your work, even if you’re using AI tools. The fines seem justified because they neglected their responsibilities as attorneys. Lesson learned, I guess.
Can’t believe they relied on ChatGPT for their legal filings. Like, seriously? They had one job: to verify their sources. This is a major facepalm moment for those lawyers.
It’s fascinating how AI chatbots like ChatGPT can generate case citations, but accuracy is key! These lawyers should’ve been more cautious and taken the time to fact-check. AI is a powerful tool, but human oversight is crucial.
The judge rightly emphasized the lawyers’ duty to ensure the accuracy of their submissions. Technology can assist, but lawyers must be vigilant. This case highlights the need for ethical and responsible use of AI in the legal profession.
wow, so the lawyers got fined for using chatgpt to include fake case citations. it’s like they didn’t even check their sources. should have known better, man!