Friday, December 6, 2024

Penny Dreadful

While I'm on the topic of generative automation tools, I've been noticing a lot of GAT "artwork" on LinkedIn recently. Another thing that I've been noticing, albeit less frequently, is people questioning the climate of anxiety over "A.I." reducing employment.

To be sure, the problem with generative automation tools and employment isn't the automation. It's people. Recall this picture that I posted yesterday:

High-octane nightmare fuel.

If you didn't notice it before, take a look at the vaguely Asian-looking woman in the lower right-hand corner of the picture. That can't possibly be the correct number of hands.

Can you imagine most organizations paying a professional to create a holiday party invitation with such obvious errors in it? Or buying a stock photo that was this poorly put together?

When the request made of a generative automation system is for "cheerful holiday party," and it returns "Lovecraftian body horror in ugly Christmas sweaters," and the response is "Meh. Good enough. Ship it," it shouldn't be a surprise that people are afraid for their livelihoods. Because the message being sent is that obvious trash is fine, so long as it's quick and cheap.

And this wasn't the only example of bad art that I found in a cursory scroll through my feed on LinkedIn. Consider this example from Salesforce:

It wasn't worth having someone spend an hour or two cleaning this up before using it to advertise a new service? Or putting out a memo saying "Generative artwork needs approval for quality before public posting"? I get that "AgentForce" is an automated agent service, but if the agents are as competent at whatever they're supposed to do as the software that created this artwork, it's a hard pass from me.

But I'm sure that they're receiving a decent number of signups, because they're likely promising that it will be inexpensive. And if "inexpensive" is more valuable than "borderline competent," then people are going to lose out.
