
AI Firm Anthropic Faces $1.5 Billion Author Lawsuit Settlement

Published 3 weeks ago · 2 minute read
David Isong

AI company Anthropic has agreed to a landmark $1.5 billion settlement to resolve a class-action lawsuit filed by a group of authors who accused the company of using their copyrighted books to train its artificial intelligence chatbot, Claude, without obtaining proper permission. The agreement, announced in August without specific terms, was disclosed in a court filing on Friday by Anthropic and the plaintiffs, who have asked U.S. District Judge William Alsup to approve the settlement.

This proposed deal is significant as it marks the first settlement in a series of ongoing lawsuits against major technology companies, including OpenAI, Microsoft, and Meta Platforms. These lawsuits collectively allege that these companies utilized copyrighted material to develop and train their generative AI systems without authorization. The plaintiffs in this case, writers Andrea Bartz, Charles Graeber, and Kirk Wallace Johnson, filed their class action against Anthropic last year, arguing that the Amazon and Alphabet-backed company unlawfully used millions of pirated books to educate its AI assistant, Claude, on how to respond to human prompts.

The allegations by these writers resonate with numerous other lawsuits brought by various creators, including authors, news outlets, and visual artists, all claiming that tech companies exploited their creative works for AI training. In defense, tech companies have consistently argued that their systems make fair use of copyrighted material to generate new, transformative content. Judge Alsup previously ruled in June that Anthropic's use of the authors' work for training Claude did constitute fair use. However, he also found that the company had violated the authors' rights by storing over 7 million pirated books in a central library, a practice that was not covered by fair use.
