By Lindsay Whitehurst

The trial of a Fugees rapper, who was convicted this year in multimillion-dollar political conspiracies, stretched across the worlds of politics and entertainment — and now the case is touching on the tech world with arguments that his defense attorney bungled the case, in part, by using an artificial intelligence program to write his closing arguments.

Prakazrel “Pras” Michel argued that use of the “experimental” generative AI program was one of a number of errors his previous attorney made at a trial for which he was “unqualified, unprepared and ineffectual,” according to a motion for a new trial his new lawyers filed this week. The company behind the program, on the other hand, said it was a tool used to help write closing statements, and a harbinger of major changes in the field.

Generative AI programs are capable of creating realistic text, images and video. They're raising tough questions about misinformation and copyright protections, and prompting industry calls for regulation in Congress. Programs like ChatGPT have already had ripple effects across professions like writing and education. The arguments in the Michel case could be a preview of issues to come as the technology rapidly advances.

In a news release, the startup company that designed the system touted the Grammy-winning rapper's trial as the first in which generative AI was used to help write closing statements. Defense attorney David Kenner, well known for his previous representation of rappers like Suge Knight and Snoop Dogg, was quoted calling the system a “game changer for complex litigation.”

But in his last words to the jury, Kenner appeared to mix up key elements of the case and misattributed the lyric “Every single day, every time I pray, I will be missing you” to the Fugees, the 1990s hip-hop group his client co-founded. The line actually comes from a well-known song by the rapper Diddy, then known as Puff Daddy, according to court documents filed by Michel’s new attorney, Peter Zeidenberg.

Kenner did not respond to a phone call and email from The Associated Press seeking comment. The company, EyeLevel.AI, said the program wasn't “experimental” but was instead trained using only facts from the case, including transcripts from the previous day in court, not song lyrics or anything found online. It's intended to provide fast answers to complex questions to help, not replace, human lawyers, said co-founder and COO Neil Katz.

“We think AI technology is going to completely revolutionize the legal field by making it faster and cheaper to get complex answers to legal questions and research,” Katz said.

He denied an allegation from Michel's new lawyers that Kenner appeared to have a financial stake in the company.

Michel was found guilty in April on all 10 counts he was charged with, including conspiracy and acting as an unregistered agent of a foreign government. He faces up to 20 years in prison on the top counts. He is free ahead of sentencing, which has not yet been set.

“At bottom, the AI program failed Kenner, and Kenner failed Michel. The closing argument was deficient, unhelpful, and a missed opportunity that prejudiced the defense,” Zeidenberg wrote. His other arguments for a new trial included that the jury was allowed to hear references to the “crime fraud exception” and “co-conspirators.”

Michel was accused of funneling money from a now-fugitive Malaysian financier through straw donors to Barack Obama’s 2012 reelection campaign, then trying to squelch a Justice Department investigation and influence an extradition case on behalf of China under the Trump administration. His trial included testimony from witnesses ranging from actor Leonardo DiCaprio to former U.S. Attorney General Jeff Sessions.

Kenner had argued during the trial that the rapper simply wanted to make money and got bad legal advice as he reinvented himself in the world of politics.

It wasn't immediately clear when a judge might rule on the motion for a new trial.

The legal profession in general has not yet been deeply affected by generative AI, but that could change in a big way as products improve, said John Villasenor, a professor of engineering and public policy at the University of California, Los Angeles. The American Bar Association does not yet have any guidelines on the use of AI in the legal profession, though there is a new task force studying the issue, a spokeswoman said.

Villasenor said he was not aware of any current generative AI tools that could produce strong closing arguments, since those depend on so many complex factors that develop over the course of a trial. Generative AI also sometimes produces “hallucinations,” statements that initially read as if they are accurate but are not.

“A good attorney coming up with closing arguments will be mindful of basic goals of the case but also of the specific ways in which the trial has played out,” he said. Even as products improve, “attorneys that use AI should make sure they very carefully fact check anything they are going to use.”
