Saturday, March 21, 2026

Uninstall


While the fact that Facebook is a privacy disaster has been understood for some time, I don't think that this sticker, which I found on the back of a parking sign, will influence all that many people to leave the platform. Facebook's network effects have led to a high level of lock-in for their users, many of whom have made the site the primary, if not the only, way to find them online. The fact that constant digital surveillance is the price of that has become understood at this point.

Friday, March 20, 2026

Differentiated

It's been some time since I've used a graphic for The Short Form, mainly because, as I've mentioned before, it's hard for machines to read the text in a picture. But I was talking with an acquaintance earlier this week, and this part of our conversation stuck with me.

There is a difference between preventing bad outcomes, and preventing them from happening to oneself.

I suppose that it's an obvious sentiment, but I don't know that it's thought about all that much. In a lot of ways, it's like the difference between using The Club, and installing a LoJack, or other locator system, in a car. The Club is an obvious theft deterrent; its goal is not only to make it more difficult to take the car, but to be obvious about that fact, so that the would-be thief moves on. But it doesn't really change their incentives; they simply look for a car that doesn't have such a device, and attempt to steal that one.

LoJack, and other locator systems, on the other hand, while being inconspicuous, carry a much greater risk to the thief if they do, in fact, steal the car... after all, it can be tracked by law enforcement, and that leads to a higher chance of being caught in a stolen car. But the fact that one cannot tell by looking whether a car is equipped with a locator means that taking any car in a neighborhood where they're known to be in use carries higher risk. And this is why these systems often carry discounts in insurance premiums: they lower costs for insurers more broadly, and it's worth passing some of those savings on to those who have the systems installed.

This all came up in the context of the supposed generative automation apocalypse that's coming for certain sectors of the knowledge workforce. While a lot of people are offering various advice, from learning how to supervise automated systems to dumping the industry entirely and shifting to skilled trades, the general viewpoint is the same: This is going to happen, here's how you take care of you. It's modeled on The Club... a car is going to be stolen; this is how you ensure that it isn't yours. But maybe a LoJack model, trying to head off the worst of the transition in general, for everyone, would be better for all involved.

Wednesday, March 18, 2026

This Side, That Side

"Long term, you tend to remember that kind of negative branding," [University of Alabama Marketing Professor Karen Anne] Wallach said. "And negative language then becomes part of what you associate with the brand."

The tech startups NPR spoke with for this story said they understand the risks of alienating large numbers of people with their cryptic ads. But the upside is too great.
"Do you understand this billboard? If not, that's the whole point" (NPR)
While this might seem to be just another story about tech, and how it divides people into groups, the above points to something important about in-group and out-group signalling. Sometimes, alienating the out-group is what the in-group demands. Groups, in general, are defined both by who is a member of the group, and who is not. And for groups that want to maintain some sort of claim to exclusivity, who is kept out can be much more important than who is let in. And hurt feelings on the part of those kept out be damned.

For technology startups that are not attempting to sell themselves to the general public, the idea that the general public is unwelcome can be just the sort of thing that their intended customers want, because it not only sorts, but stratifies. And sometimes, nothing sells a product or service like the idea that being a member of the target audience is proof of one's own superiority.

If an advertiser is willing to accede to an expectation of flattery, even at the expense of others, on the part of the in-the-know, clearly neither the advertiser, nor their audience, expects that any hard feelings on the part of the out-group will be a problem for them. And this is nothing new. I would submit that it's been a facet of human history for as long as there has been history. That said, it doesn't make the practice any less toxic, especially in its more strident forms. But perhaps that's the problem; toxicity has become such a common part of people's everyday lives that it goes unnoticed.

Monday, March 16, 2026

To Be Divine

Superhuman Platform, Incorporated, the company formerly known as Grammarly, is facing a class action lawsuit over a feature it rolled out at the end of the Summer called Expert Review. Expert Review, which was recently removed, was effectively a "this person would make these suggestions about what you're writing," sort of feature, and claimed to offer advice from virtual versions of people like Stephen King, David Abulafia and Julia Angwin (who filed the lawsuit).

When Superhuman Platform CEO Shishir Mehrotra posted an apology for the agentic feature on LinkedIn, he noted "valid critical feedback from experts who are concerned that the agent misrepresented their voices." When Ann Handley, who identified herself as one of those experts, weighed in (before commenting on the post was closed), her primary complaint was "building a commercial feature around experts' names and reputations without asking permission, without notification, and without compensation." While Mr. Mehrotra claimed that "the agent was designed to help users discover influential perspectives and scholarship relevant to their work, while also providing meaningful ways for experts to build deeper relationships with their fans," given that it was a subscription feature, and Superhuman Platform wasn't sharing any of the money, it seemed more like they'd simply found another way to have people work "for exposure." And there's a reason why an increasingly common response to that sort of offer is "Fuck you; pay me."

As a random layperson, the whole thing strikes me as openly unethical, but entirely sensible. If generative automation is a race, and losing carries serious, or even existential, consequences, the time to be ethical is later. Ms. Handley calls Mr. Mehrotra out for an ethos of "take first, apologize later." And while I suspect she's correct in that, it's just like any other instance of "ask forgiveness, not permission;" permission wouldn't have been forthcoming, but forgiveness will be. And this is a rational presumption to make; Uber's known flouting of laws hasn't resulted in the general public deciding that the company is too untrustworthy to do business with. And it's unlikely that the Court of Public Opinion will render a different verdict for Superhuman Platform. Investors, on the other hand, are quick to flee a company that's unwilling to do what it takes to make itself more profitable, and they bear none of the risk for the actions the company takes in pursuit of those profits. It's not like anyone is going to spend time in prison over this, and even if someone were, it wouldn't be the investors; so why wouldn't they push for companies to place profitability over ethical considerations, given that it's unlikely that people and businesses with Grammarly subscriptions are going to go elsewhere?

The only way to stop companies (and people, for that matter) from preferring to ask for forgiveness rather than permission is to be consistently unforgiving, regardless of outcomes. And that's a hard sell in a culture where many people's primary focus is their own sense of (or concern for) poverty. People may be angry when someone cheats them to pass the savings along to someone else, but they're often ready to look the other way when the savings are being passed along to them. And businesses know this; their executives are members of the public, just like everyone else. They may often speak in the stilted language of finance and investment, but they're not aliens.

Some heads may roll over this; if he's unlucky, Shishir Mehrotra's will be one of them. But Superhuman Platform, Incorporated will survive. People and businesses will still pay to use Grammarly, and investors will still see returns. And that all but guarantees that "take first, apologize later" will remain the standard order of operations.

Sunday, March 15, 2026

One of Three

I started listening to the most recent episode of EconTalk, in which Professor Roberts interviews one Hanno Sauer about the latter's new book: The Invention of Good and Evil. I have to admit that I gave up not too long into it, in part because of this statement from Mr. Sauer:

 And, now you get the opposite problem when you move to a naturalistic Darwinian framework. All of a sudden, the default assumption seems to be that it's 'nature, red in tooth and claw.' It's dog-eat-dog, it's elbows out. Everyone is selfish. Everyone is essentially sociopathic. Right?

And, now you get the problem: Okay, evidently there is friendship and heroism and love and altruism and sacrifice. But, where do those come from? It seems to not make any sense.

It irked me, because the basic idea that, under "a naturalistic Darwinian framework," "everyone is essentially sociopathic" doesn't actually come out of any of Mr. Darwin's work. As I noted in my (unfinished) blogging of my way through On the Origin of Species:

There are three distinct facets to the Struggle for Existence, as Darwin explains it - competition within a species, competition between species, and mitigating the hostile effects of one's environment.

Mr. Sauer's book, rather than seeking to correct the misconception that the "default assumption" should be that competition within species is the norm, leans into it. And I found myself asking why. Or, on the larger scale, why does the misconception persist so? I can't possibly be the only person who has read Charles Darwin, or recalled that person-to-person competition is only part of one of three primary conflicts that Mr. Darwin identifies. So why don't more people push back against it? Why accept the hostile framing that "the Darwinian view of Evolution requires one to be murderously pseudo-Machiavellian" and then try to argue that unselfishness can grow within it, when it strikes me as much easier to point out that "friendship and heroism and love and altruism and sacrifice" make the other two conflicts much easier, and start from there?

Speculation on other people's motives is often a one-way ticket to creating a strawman argument, so I won't indulge in it, other than to say that there must be incentives at play that I am either unaware of, or not fully crediting. Because while it may seem unreasonable to me, there are assuredly reasons for it that people feel are worthwhile.

Of course, it may simply be that the misconception is widely held enough that people don't always realize that it is, in fact, a misconception. It's like Fyodor Dostoevsky's bit of dialog in The Brothers Karamazov, where Ivan notes: "If God does not exist, anything is permissible." This is commonly taken to be absolutely true in much of the Western world, especially by Christians, despite the fact that there is nothing in the viewpoint of Moral/Ethical Realism that requires some sort of divinity to create the rules, just as there is nothing in Mathematics that demands some sort of divine order for 2 + 2 to equal 4. Perhaps it's just easier to set out to prove the argument incorrect than to point out that it doesn't actually seem to make any sense, given the world as we understand it.