Technology
Meta defends itself against a copyright lawsuit from authors over AI use.
Published On Thu, 26 Jun 2025
Ronit Dhanda

On June 25, a U.S. federal judge sided with Meta Platforms in a copyright case brought by a group of authors who claimed the company had illegally used their books to train its artificial intelligence system without permission. Judge Vince Chhabria, based in San Francisco, ruled that the authors failed to provide sufficient evidence showing Meta’s AI harmed the market for their works, which is necessary to prove a copyright violation.
However, Chhabria clarified that using copyrighted content without authorization to train AI could still be illegal in many situations. He criticized the authors' legal strategy, noting that the ruling doesn't mean Meta's use was lawful—only that these plaintiffs failed to make a strong enough case. The authors' legal team, Boies Schiller Flexner, expressed disagreement with the ruling, accusing Meta of unprecedented misuse of copyrighted material. Meta welcomed the decision, emphasizing the importance of fair use in developing transformative AI technologies.
The lawsuit, filed in 2023, accused Meta of using pirated versions of the authors' books to train its AI model, Llama, without paying or seeking permission. This case is one of several similar lawsuits filed by authors, media outlets, and other copyright holders against AI developers such as OpenAI, Microsoft, and Anthropic. Fair use, a legal principle allowing limited use of copyrighted content without permission, is central to the AI companies' defense. They argue their models transform existing content to create new works and that restricting such use could hurt AI innovation. While sympathetic to copyright holders' concerns, Judge Chhabria acknowledged the risk that generative AI could flood the market with vast amounts of content created with minimal human input—something he warned could weaken incentives for original human creativity in the long run.