Monday, March 16, 2026

To Be Divine

Superhuman Platform, Incorporated, the company formerly known as Grammarly, is facing a class action lawsuit over a feature it rolled out at the end of the summer called Expert Review. Expert Review, which was recently removed, was effectively a "this person would make these suggestions about what you're writing" sort of feature, claiming to offer advice from virtual versions of people like Stephen King, David Abulafia, and Julia Angwin (who filed the lawsuit).

When Superhuman Platform CEO Shishir Mehrotra posted an apology for the agentic feature on LinkedIn, he noted "valid critical feedback from experts who are concerned that the agent misrepresented their voices." When Ann Handley, who identified herself as one of those experts, weighed in (before commenting on the post was closed), her primary complaint was "building a commercial feature around experts' names and reputations without asking permission, without notification, and without compensation." Mr. Mehrotra claimed that "the agent was designed to help users discover influential perspectives and scholarship relevant to their work, while also providing meaningful ways for experts to build deeper relationships with their fans." But given that it was a subscription feature, and Superhuman Platform wasn't sharing any of the money, it seemed more like they'd simply found another way to have people work "for exposure." And there's a reason why an increasingly common response to that sort of offer is "Fuck you; pay me."

As a random layperson, the whole thing strikes me as openly unethical, but entirely sensible. If generative automation is a race, and losing carries serious, or even existential, consequences, the time to be ethical is later. Ms. Handley calls Mr. Mehrotra out for an ethos of "take first, apologize later." And while I suspect she's correct in that, it's just like any other instance of "ask forgiveness, not permission": permission wouldn't have been forthcoming, but forgiveness will be. And this is a rational presumption to make; Uber's known flouting of laws hasn't resulted in the general public deciding that the company is too untrustworthy to do business with, and it's unlikely that the Court of Public Opinion will render a different verdict for Superhuman Platform. Investors, on the other hand, are quick to flee a company that's unwilling to do what it takes to make itself more profitable, and they bear none of the risk for the actions the company takes in pursuit of those profits. It's not as though anyone is going to spend time in prison over this, and even if someone were, it wouldn't be the investors. So why wouldn't they push for companies to place profitability over ethical considerations, given that it's unlikely that people and businesses with Grammarly subscriptions are going to go elsewhere?

The only way to stop companies (and people, for that matter) from preferring to ask for forgiveness rather than permission is to be consistently unforgiving, regardless of outcomes. And that's a hard sell in a culture where many people's primary focus is their own sense of (or concern for) poverty. People may be angry when someone cheats them to pass the savings along to someone else, but they're often ready to look the other way when the savings are being passed along to them. And businesses know this; their executives are members of the public, just like everyone else. They may often speak in the stilted language of finance and investment, but they're not aliens.

Some heads may roll over this; if he's unlucky, Shishir Mehrotra's will be one of them. But Superhuman Platform, Incorporated will survive. People and businesses will still pay to use Grammarly, and investors will still see returns. And that all but guarantees that "take first, apologize later" will remain the standard order of operations.
