Lawyer Uses ChatGPT To Meet Deadlines, Loses Job After AI Tool Creates Fake Cases

Jahanvi Agarwal

ChatGPT has been a prominent topic of conversation over the past year, with 2023 bringing rapid advances in generative artificial intelligence. Tools like ChatGPT and Google Bard are now simple for nearly anyone to use. That convenience raises an ethical question: should these tools be used to streamline work or meet tight deadlines? The case of Zachariah Crabill suggests a cautious approach, and it is a subject worth discussing thoroughly.

Crabill, a 29-year-old attorney, faced professional repercussions after incorporating ChatGPT into his workflow at the Baker Law Group in Colorado, USA. The circumstances that led him to the tool are noteworthy: in May, overwhelmed by approaching deadlines and mounting stress from a surge in workload assigned by his superiors, he turned to ChatGPT as a research aid, using it to find case law supporting a motion he had drafted.

In Crabill’s words,

“When ChatGPT saved me hours of work, it was a tiny ray of sunlight in an otherwise abysmal situation. My experience is not unique; sadly, I’ve heard many attorneys say they too were ‘thrown to the wolves’ early in their career.”

ChatGPT produced the supporting research quickly, but the oversight came when Crabill failed to thoroughly review the bot's response before filing. The output included citations to lawsuits that did not exist, and the inclusion of those fictitious cases in the motion resulted in Crabill's termination.

Despite this setback, Crabill still believes generative AI can enhance lawyers' productivity. He has since ventured into a startup that provides AI-assisted legal services, and he says ChatGPT has become as integral to his daily professional life as Google, enabling him to do his job more efficiently.