I came across a LinkedIn post illustrated with a comic that, I'm guessing, was created by generative automation. Having an LLM produce a brief comic in the style of XKCD, so that one can avoid drawing literal stick figures oneself, contributes to a world in which people will see something that looks like XKCD and wonder whether it was created by a random computer somewhere, or whether Randall Munroe has decided to sell out and shill for some random thing.
| Not really XKCD |
It occurred to me that this dilution of trust in XKCD isn't a problem
for the people who use generative automation to copy it... but it does
have consequences for Mr. Munroe, who now has to pay the costs of other
people's actions.
Along with all of its other
capabilities, generative automation can be an effective way to
externalize costs. Because it doesn't matter whether someone makes $100
by being creative, being efficient, or saddling someone else with the
bill; it still spends the same. And the more people come to feel that
they're the ones left holding the bag for the benefits other people are
receiving, the more pressure they will feel to externalize their own
costs, just to keep up. And that's nothing new; most likely, it's
worked that way for all of human history.
That lack of a genuine
functional difference between providing value and externalizing costs
has always been a primary reason why technology doesn't live up to the
promises made on its behalf, namely that the relationship between people
and businesses will be a partnership; symbiotic, if you will. Since a
parasite contributes nothing in exchange for the resources it receives,
parasitic returns are necessarily higher than symbiotic returns. It's
the same incentive that drives any form of rent-seeking; it exists
wherever it's less capital-intensive than providing value.
And so the question becomes: how much parasitism can a
system withstand before it begins to die? This is especially important
in scenarios where the parasite can survive the death of the host; if
people using generative automation to copy someone ruin that person's
credibility, they can simply move on to copying someone else. It's a
tragedy of the commons; there's an active disincentive to preserve the
original if all that preservation accomplishes is someone else's
benefit. And eventually, all that's left is a wasteland.