How NOT to Use AI in Your Legal Practice
The case of U.S. v. Prakazrel Michel, and how a lawyer used AI to lose a multimillion-dollar case.
What overworked, overstressed attorney would not be enticed by an AI system that turns hours or days of work into a few minutes? This is every lawyer’s dream, right? Type a few simple prompts into your AI model and receive a document with an expertly crafted, razor-edged argument, an airtight contract, or a brilliant demand letter. The only thing that would make this scenario better is if the lawyer could bill for the time it would have taken to draft the document the old-fashioned way. Well, that and a financial interest in the AI model being marketed to other lawyers, for that passive income. But as your grandmother told you, anything that seems too good to be true probably is. Rather than use an AI system to make his job easier, attorney David Kenner used it in a poorly camouflaged attempt to cover for his failure to understand, and make appropriate arguments in, a multimillion-dollar criminal case in which he represented rapper Prakazrel Michel.
On October 16, 2023, defendant Prakazrel “Pras” Michel filed a motion for a new trial in the U.S. District Court for the District of Columbia. The motion alleged that a new trial was warranted for, among other things, ineffective assistance of counsel based in part on his attorney’s use of AI. In the underlying case, Michel was charged with participating in a conduit scheme that improperly funneled money to a presidential re-election campaign; later, by way of a superseding indictment, he was also charged with taking part in a lobbying scheme that violated the Foreign Agents Registration Act (FARA).
First, the motion alleges that Kenner and his co-counsel, Israely, had a financial interest in the AI program used at trial, CaseFile Connect.[1] Because Michel is a former member of the hip hop group the Fugees, this case was sure to have notoriety, making it a great case to advertise their AI product aimed at attorneys.[2]
Second, the motion alleges that if Kenner’s use of CaseFile was meant as a performance test, the program failed miserably. Michel argues that CaseFile produced a closing argument so poor as to amount to ineffective assistance of counsel: it misstated both the law and the facts, and omitted any legally viable argument for innocence.
Michel’s Motion for New Trial concludes, “As demonstrated above, Kenner and Israely’s decision to elevate their financial interest in the AI program over Michel’s interest in a competent and vigorous defense adversely affected Kenner’s trial performance, as the closing argument was frivolous, missed nearly every colorable argument, and damaged the defense. . . [and] therefore warrants a new trial.”[3]
So What Can We Learn From This Case?
One of the lessons to be learned from this case is certainly that you should not promote your own unproven AI system that you just used in a case you lost. But that should have been self-evident to any proficient litigation attorney (and likely any first year law student). The more important issue for us is whether this is a cautionary tale for the use of any AI in a legal case.
Kenner didn’t merely fail to win. He chose to substitute the product of an untested AI system for his own professional judgment and experience. Perhaps he was so enamored with the possibilities of CaseFile and EyeLevel.AI that he failed to see just how damaging the AI-produced closing argument was to his client. This amounts to malpractice, and a violation of the professional responsibility duties he owed to his client.
Legal malpractice is a state law cause of action, but generally all claims for malpractice include the same or similar elements:
The attorney negligently failed to exercise the care, skill, and diligence commonly possessed and exercised by a typical lawyer;
The attorney’s negligence was both a proximate or legal cause, and a factual or “but for” cause of the harm sustained by the client; and
The client was harmed by the actions or inactions of the attorney.
In Michel’s case, Kenner’s representation met all of those elements. He failed to make meaningful arguments. For example, he argued that it was another person who improperly lobbied the U.S. Government, and that the lobbying was ultimately unsuccessful; neither point is a defense to a conspiracy charge. Further, Kenner presented a case lacking in potentially exculpatory arguments. As a result, Michel was convicted on all counts, which clearly amounts to harm. Although the use of the AI system was not Kenner’s only failing, it played a substantial role.
In addition, Kenner’s use of the AI system violated his professional duties to his client. Model Rule 1.1 of the Model Rules of Professional Conduct requires that a lawyer provide competent representation to the client. Here, Kenner allowed his self-interest to overshadow his duty of competent representation. Likewise, he failed to act with reasonable diligence, in violation of Model Rule 1.3, because he did not recognize that the argument produced by the AI system was severely lacking. He substituted the AI’s work for his many years of practice.
Model Rule 3.1 requires that a lawyer have a basis in law and fact for any assertion or defense raised in a case. The closing argument contained both factual and legal errors. Had Kenner known his case well enough, he would have spotted those errors and corrected them.
Michel likely has good claims of malpractice against at least Kenner, if not Kenner and Israely, and a valid complaint to the D.C. Bar.
How Could the Lawyer Have Appropriately Used AI in This Case?
Certainly there is a place for AI in the practice of law. For example, Kenner could have asked the AI system for a summary of one of the laws at the center of this case, FARA. He could also have had the system summarize motions and case law, or fed discovery documents into it and had it summarize those as well. What he could not properly do was delegate his advocacy to the machine.
We will continue to provide articles on ways to use AI. As noted above, it is critical that we never allow a computer system to replace, or even cloud, our professional judgment. You as the lawyer, not the AI system, will continue to owe the duty to the client.
[1] CaseFile Connect was a “technology partner” to the AI system Kenner used: EyeLevel.AI. https://www.swanerlaw.com/us-v-michel See also https://nypost.com/2023/10/17/lawyer-representing-hip-hop-star-accused-of-using-ai-in-major-case-he-lost/
[2] First Use of AI in Federal Trial: EyeLevel’s Litigation Assist Aids Defense in Pras Michel Fraud Case, https://www.eyelevel.ai/post/first-use-of-ai-in-federal-trial
[3] The Motion for New Trial can be found in its entirety at https://www.swanerlaw.com/us-v-michel