Saturday, April 25, 2026

Collared

File under: Isn't it always the same? "Students seeking blue-collar careers face sticker shock."

Sudden and rapid increases in the cost of vocational training strike me as a failure of policy. What's needed is a greater focus not only on helping people see where their best paths to the future lie, but on growing the pipelines to those futures. What's driving up prices is large numbers of people crowding into a space that doesn't have the resources to expand to accommodate them. Allowing things to get to a point where people are beginning to panic about their futures, and then hoping that the for-profit actors who enter the space will place as great an emphasis on quality education as they do on maintaining profitability for owners and investors, is a recipe for bad outcomes. Because it's not like we haven't seen this play out before. Private, for-profit schools spend heavily to market themselves to prospective students (and their parents), and that expenditure has to be made up somewhere along the way.

In the end, it's like any other gold rush. The fastest path to wealth is not to be a miner, but to sell picks and shovels to the people who expect to use those tools to better themselves. Sooner or later, presuming that it hasn't happened already, some unscrupulous operator is going to open a school and decide that actually giving the students the tools they need to succeed in a career in the skilled trades is simply too resource-intensive. And it only takes one to ruin a lot of lives, perhaps irrevocably. And this is going to happen because, as a society, the United States does not value the sort of planning and oversight that it takes to prevent it. "[A]n aspiring aircraft maintenance technician must shell out $40,000 for a 14-month course in Florida," because the up-front resources to ensure that there were enough programs to keep the cost down weren't spent. Meanwhile, Governor DeSantis recently signed legislation to ban local diversity, equity and inclusion programs, and block carbon taxes in the state. And this does precisely what to increase access to (and thus lower costs for) blue-collar training programs? Hell if I know. But it projects to the Republican activist class that he shares the values that are important to them.

It's just like when I receive yet another political fundraising e-mail here in Washington (the opposite corner of the lower 48), despite the fact that the last time I interacted with a political campaign was in 2004: there's nothing about training people for the skilled trades, or other jobs of tomorrow. Just hyperbolic warnings of how the world will come to an end if I don't start writing checks.

And this is why there are failures of policy: because there tends to be little or no real concern for them until people are being pulled out of the wreckage and the hunt for the guilty begins. I'm constantly reminded of George Will's observation that the United States does not attempt to prevent disasters; it simply cleans up after they happen. It's a bad habit, but it's a constant enough one that people can count on it.

Friday, April 24, 2026

Mystery

Some things are just going to be mysteries. But I also know that I’ll always want to know, that I’ll always want everything to fit together nicely and neatly into a workable pattern that explains everything. And perhaps not coincidentally, tells me that I really see things as they are.

Imagine, If You Will...
Somewhere in the past 12 years, that changed. The desire to know, the desire for things to fit together and, perhaps more importantly, the desire to believe that I see things as they really are, went away. I've become comfortable with the yawning chasms that dot my worldview; so much so that if I hadn't written about that earlier need in this blog, I would never have recalled it. (Which, honestly, is one of the benefits of writing it; it provides insight into my past self that memory alone can't.)

I am reminded of the fact that I am poor at predicting the future, even when it pertains directly to myself. But I am also reminded of the impermanence of personality, and perhaps even of the self. Back in 2014, I clearly had no inkling that my need to know and understand would change, and I don't recall working to alter it at the time. But it has, in fact, shifted. I'm much more at peace with the idea that there will be mysteries in the world, and I've come to believe that it's hard to claim one knows anything while being unwilling to be wrong. I think that I've become more comfortable with believing in general, and with the understanding that I believe as I do not because it is demonstrably correct, but because it works well enough for me to get by on a day-to-day basis.

If I'm still around, and writing this, in 2038 (given the way my family has worked, that's very much up in the air as of now), perhaps I'll see further change in myself. Or whoever I am then.

Wednesday, April 22, 2026

Duped

I came across a LinkedIn post that was illustrated with a comic that, I'm guessing, was created by generative automation. Having an LLM create a brief comic in the style of XKCD, so that one can avoid drawing literal stick figures for oneself, contributes to a world in which people will see something that looks like XKCD and wonder whether it was created by a random computer somewhere, or if Randall Munroe has decided to sell out and shill for some random thing.

Not really XKCD

It occurred to me that this dilution of trust in XKCD isn't a problem for the people who use generative automation to copy it... but for Mr. Munroe, it has consequences, having to pay the costs of other people's actions among them.

Along with all of its other capabilities, generative automation can be an effective way to externalize costs. Because it doesn't matter if someone makes $100 from being creative, being efficient or saddling someone else with the bill; it still spends the same. And the more people come to feel that they're the ones left holding the bag for the benefits other people are receiving, the more pressure they will feel to externalize their own costs, just to keep up. Because that's nothing new; most likely, it's worked that way for all of human history.

That lack of a genuine functional difference between providing value and externalizing costs has always been a primary reason why technology doesn't live up to the promises made on its behalf, namely that the relationships between people and businesses will be partnerships; symbiotic, if you will. Since a parasite doesn't contribute anything in exchange for the resources it receives, parasitic returns are necessarily higher than symbiotic returns. It's the same incentive that drives any form of rent-seeking; rent-seeking exists wherever it's less capital-intensive than providing value.

And so the question becomes: How much parasitism can a system withstand before it begins to die? This is especially important in scenarios where the parasite can survive the death of the host; if people using generative automation to copy someone ruin that person's credibility, they can simply move on to copying someone else. It's a tragedy of the commons; there's a positive disincentive to preserve the original, if all that happens is that someone else benefits. And eventually, all that's left is a wasteland.

Monday, April 20, 2026

Rejected

And the kind of helplessness that people feel, that leads to this kind of violence, is also unacceptable. And it's worth more scrutiny, from both the industry and our political leaders.
Nilay Patel. "Ronan Farrow on Sam Altman's 'unconstrained' relationship with the truth." Decoder with Nilay Patel. Thursday, 16 April, 2026.
Mr. Patel was giving an obligatory condemnation of violence, in response to the attacks on Sam Altman's home, which took place between when the Decoder episode was recorded, and when it was released. And I use "obligatory" here deliberately. Not in the sense that Mr. Patel felt some sort of pressure to make a statement that he didn't agree with, but in the sense that speaking out against violence is something that's expected. Mr. Patel had noted that the attacks on the Altman home didn't come up during the actual discussion with Mr. Farrow, and so it was clear that he was looking to head off criticism over that.

But what stood out for me was his labeling of a feeling of helplessness as "unacceptable." It seems that he was casting the blame for such emotions on the generative automation industry and the government, but the short statement that he made didn't offer anything to be done about it, other than have it scrutinized. Which is unlikely to happen. Because the kind of helplessness that people feel, that then leads to violence, has been around for quite some time. One wonders just what it would be about Sam Altman that would inspire people to look into it more deeply when the same people who Mr. Patel expects to do the looking have done such an excellent job of ignoring all of its previous incarnations. And the general public hasn't yet cared enough to punish them for it.

Because when people like Mr. Patel make the obligatory condemnations of violence, and advocate for someone (else) to do something about it, they tend not to offer an accountability mechanism to ensure that it's done. And maybe that's because, in the face of violence, they also feel a kind of helplessness, perhaps born of the realization that while they may have an audience, it's fairly tenuous. The public wants what it wants, and so while there are any number of people who will insist that the media leads the public, I'm of the opinion that the public more often leads the media.

And the public doesn't really have a problem with helplessness leading to violence, so long as it's directed somewhere else. Mainly, I think, because people don't see any other options. While Luigi Mangione is quite some distance from being a hero to the general public, there wasn't much in the way of condemnation for the killing of Brian Thompson on the grounds that it had foreclosed on, or even ignored, some better way of dealing with the problem. And so while Mr. Thompson's murder didn't solve anything, it did give people the idea that "one of the bad guys" had received what was coming to him. And, I suspect, had Mr. Altman been killed when his home was attacked, the same sentiment would have surfaced.
I don't think you can win [the War on Terror]. But I think you can create conditions so that those who use terror as a tool are less acceptable in parts of the world.
President George W. Bush. (NBC's "Today" show, 30 August, 2004.)
Creating conditions so that those who use violence as a tool are less acceptable requires large-scale disapproval of violence for its own sake, rather than out of disapproval for the specific ends to which violence (or terror) is being deployed. Even when those ends are punishing wrongdoers or acting in perceived self-defense. Violence of the sort that gains some level of public acceptance tends to occur when someone sees it as a reasonable response to the other person's actions (or inaction). It's rare for people, even a minority, to celebrate escalation. And the angrier and more upset people are, the less likely they are to see any given level of violence as an escalation.

I think that Mr. Patel's call for "industry and our political leaders" to scrutinize a general feeling of helplessness, one that then comes to be seen as the result of aggression against people and, therefore, a rationale for violence, may let the public off the hook, out of an agreement with the idea that most everyday people are, in fact, helpless. And maybe that's the problem that needs solving. But I think that the general public will need to be the ones who solve it. Which, when social trust is remarkably low, is something of a tall order. But trust is, in a lot of ways, a choice. So maybe step one is convincing people to make different ones.

Thursday, April 16, 2026

Fabulous

In the end, LinkedIn is a social media site. And like any social media site, it has its share of people pushing dubious, but popular stories. Like this one, borrowed from X, I believe...

I'm pretty sure this story is bogus, because it doesn't make any sense...

A bot can't simply "hallucinate" a discount code. It has to create the code and the discount amount/percentage, then tell the sales (or whatever) database to allow it. Then it has to be advertised to customers, or simply applied to some or all orders. Any company that's allowing all that to happen in a production environment with no checks whatsoever is already being pretty badly mismanaged.

The development lead shouldn't need the former QA lead to tell them how to fix the problem. They'd simply go into the database and deactivate the discount code, presuming that this requires direct intervention from the developers at all, which strikes me as unlikely in any mature organization. And if the "bot" had rewritten the code that managed discounting so that it couldn't be turned off in production, the former QA lead still isn't the person to describe how one fixes that; the QA lead would tell the development lead how to test for it.
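To make the point concrete, here's a minimal sketch of the "fix" the story implies should have been trivial. The schema, table, and code names are all invented for illustration; the post doesn't describe any real system:

```python
# Hypothetical sketch: deactivating a runaway discount code is a
# one-row update, not a development project. (Schema invented.)
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE discount_codes (code TEXT PRIMARY KEY, "
    "percent_off INTEGER, active INTEGER)"
)
conn.execute("INSERT INTO discount_codes VALUES ('FREE100', 100, 1)")

# The entire "fix": flip the active flag on the offending code.
conn.execute("UPDATE discount_codes SET active = 0 WHERE code = 'FREE100'")

active, = conn.execute(
    "SELECT active FROM discount_codes WHERE code = 'FREE100'"
).fetchone()
assert active == 0
```

Which is exactly why a story that hinges on this being a crisis requiring a laid-off QA lead's secret knowledge is hard to credit.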

If there's a legitimate use case for a 100%-off discount code, then it's entirely possible that it passed testing. Likewise if there's a legitimate use case for applying a discount code to all orders for a given amount of time (such as a promotion). It's rational to have a policy against applying discount codes of a certain type universally, but unless that policy has been fed into the system somewhere, it's reasonable that the system wouldn't test for it. Accordingly, this is one of those things that could conceivably get by human testers, especially ones relying on automated test tools rather than testing manually, because it might not occur to anyone to ensure that a universal 100%-off code doesn't work unless the specifications specifically demand it.
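The point about unwritten policies can be sketched in a few lines. Everything here (the function names, the rule shape) is hypothetical and not drawn from any real system; the idea is just that a validator, automated or otherwise, can only enforce the rules someone has actually encoded:

```python
# Hypothetical sketch: a discount validator enforces only the policies
# it has been given. If "no universal 100%-off codes" is never written
# down as a rule, the validator (and tests built on it) will pass.

def validate_discount(percent_off, applies_to_all_orders, policies):
    """Return the names of any policy rules the proposed code violates."""
    violations = []
    for rule in policies:
        if not rule(percent_off, applies_to_all_orders):
            violations.append(rule.__name__)
    return violations

# With no policy encoded, a universal 100%-off code sails through.
assert validate_discount(100, True, policies=[]) == []

# Once someone writes the rule down, the same code is caught.
def no_universal_full_discount(percent_off, applies_to_all):
    return not (percent_off >= 100 and applies_to_all)

assert validate_discount(
    100, True, policies=[no_universal_full_discount]
) == ["no_universal_full_discount"]
```

In other words, the failure in the story, if it happened at all, would be a missing specification, not a malfunctioning bot.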

I get it, though. A lot of people are unhappy about the level of automation being deployed into the software and e-commerce industries, and the jobs being cut as a result. And it's hard to find someone who would never believe that corporate executives are capable of being penny-wise but pound-moronic. But having some limited experience in e-commerce and more experience as a QA manager, this story simply doesn't resonate with what I learned during those parts of my career. It may be framing the guilty, but it's a frame nevertheless, and it doesn't serve anyone to believe false stories of executive perfidy or generative automation malfunction.