Grammarly Halts AI Feature That Imitated Writers, Faces Multimillion-Dollar Lawsuit
Grammarly, the popular writing assistance tool, has deactivated a controversial artificial intelligence feature known as Expert Review after significant public backlash and a class-action lawsuit. The feature utilised generative AI to produce editing suggestions that were purportedly inspired by the styles of well-known authors and academics, including novelist Stephen King, astrophysicist Neil deGrasse Tyson, and the late scientist Carl Sagan.
Legal Action and Allegations of Unauthorised Use
A lawsuit has been filed in the Southern District of New York against Superhuman, Grammarly's parent company, seeking damages exceeding $5 million (£3.7 million). The complaint argues that using individuals' names for commercial gain without their consent is unlawful, raising broader concerns about the monetisation of personal identity.
Investigative journalist Julia Angwin, who is listed as a featured expert in the software, serves as the lead plaintiff in the case. In an interview with the BBC, Angwin expressed shock, stating, "Editing is a skill ... it's my livelihood, but it's not something I've ever thought about anyone trying to steal from me before. I didn't even think it was steal-able." Her lawyer, Peter Romer-Friedman, said that more than 40 people had expressed interest in joining the lawsuit within 24 hours of its filing.
Public Outcry from Affected Writers and Academics
Since the feature gained public attention, several writers and academics have voiced their disapproval. Tech journalist Casey Newton, another individual included in the tool, criticised Grammarly's approach, writing, "[Grammarly] curated a list of real people, gave its models free rein to hallucinate plausible-sounding advice on their behalf, and put it all behind a subscription. That's a deliberate choice to monetise the identities of real people without involving them, and it sucks."
Vanessa Heggie, an associate professor at the University of Birmingham, also condemned the inclusion of fellow academic David Abulafia, who died in January, describing it as "obscene" in a LinkedIn post.
Grammarly's Response and Future Plans
In response to the controversy, Shishir Mehrotra, chief executive of Superhuman, issued an apology on LinkedIn. He acknowledged the critical feedback, writing, "Over the past week, we received valid critical feedback from experts who are concerned that the agent misrepresented their voices. We hear the feedback and recognise we fell short on this. I want to apologise and acknowledge that we'll rethink our approach going forward."
Mehrotra further told the BBC that the Expert Review feature had been taken down for a redesign before the lawsuit was filed and had seen minimal usage during its short lifespan. Despite the apology, he asserted that the legal claims are "without merit" and that Superhuman will "strongly defend against them."
Background and Evolution of Grammarly
Launched in 2009 as a basic spelling and grammar checker, Grammarly expanded its offerings last year by introducing a suite of generative AI features, including the now-disabled Expert Review. According to a company blog post, the feature was designed to "offer subject-matter expertise and personalised, topic-specific feedback to elevate writing that meets rigorous academic or professional standards tailored to the user's field."
This incident underscores ongoing debates about AI ethics, intellectual property, and the boundaries of using real identities in technology without explicit consent.
