Wednesday, March 27, 2024

Hype Technology

"The core problem is that GenAI models are not information retrieval systems," [AI ethics expert Rumman Chowdhury] says. "They are synthesizing systems, with no ability to discern from the data it's trained on unless significant guardrails are put in place."
Chatbot letdown: Hype hits rocky reality
Of course generative artificial intelligence doesn't live up to the hype. If it did, one could make the case that it wasn't hyperbole. But it generally turns out to be hype, if for no other reason than there seems to be something deeply alluring about the idea that people can build something, and it will just solve a bunch of problems, and not have any downsides. In the time that I've been in the technology sector, I've seen a consistent discomfort with the idea of trade-offs, even when the reality of them is metaphorically staring people in the face, and generative artificial intelligence is no exception.

When I've experimented with Microsoft's Copilot system, I haven't found it to go off the rails in the way that many earlier systems may have, but it is verbose, because its default is to take in whatever data it's given and synthesize more. Back when I used the tool to help me translate an old story snippet I'd written in Japanese into English, it volunteered a follow-up prompt, asking it to tell me how the main characters met. And then it synthesized a story; it had no other choice, because the characters it offered to tell me about had no existence beyond the short passage that I'd written more than two decades ago; there couldn't have been any information about them in the training data. And I can see how that lends itself to an interpretation that the model "knows" things, and that asking it more questions will reveal more information. But that requires seeing it as something more than a remarkably sophisticated auto-complete function. Which it isn't.

That said, there are several things that one can do with a really sophisticated auto-complete function that will make businesses, and potentially people, more efficient and productive. But for right now, they're mainly limited to applications where it becomes evident fairly quickly whether the system has it right or not. I knew that the AI systems made errors in my initial experiment, determining the length of time between two dates, because I wasn't asking the question with the goal of having the systems tell me the answer; I already knew the answer, because I'd sorted it out for myself. I was looking to see the degree to which the various models disagreed with one another. But if I'd been asking because I genuinely didn't know, and had used the answers provided for anything important, that could have spelled trouble.
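It's also exactly the sort of answer that's easy to verify deterministically, which is what makes the errors visible in the first place. A minimal Python sketch, using placeholder dates rather than the ones from my experiment, would look something like this:

```python
from datetime import date

# Placeholder dates; these stand in for the actual dates used in the experiment.
start = date(1999, 12, 31)
end = date(2024, 3, 27)

delta = end - start
print(delta.days)                      # exact number of days between the two dates
print(round(delta.days / 365.25, 1))   # rough span in years
```

The point isn't the code itself; it's that when the correct answer is this cheap to come by, a chatbot's mistakes are immediately apparent.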

The term generative artificial intelligence is a misnomer because the systems involved are not intelligent. As Ms. Chowdhury notes, the systems lack the native ability to discern things from the data they were trained on. But they're treated as thinking, and as knowing, because that's how they appear to people. Copilot, when it tells me that it's taken a "whimsical journey" (otherwise known as synthesizing random details about something), behaves as though there is a creative intellect in there, somewhere. And I think that this, combined with the speed of its answers, makes it easier to see the system as smarter than a person. And since any problem can be solved if one is just smart enough...

Except that's not true. There are plenty of problems in the world that are going to take more than a person, or a machine, being clever about finding solutions. I was listening to a podcast about Haiti today. That doesn't seem like a problem that persists merely for want of a clever solution. Likewise, the question of workers displaced by the continued adoption of automation is not a problem that will yield to cleverness. As with many things that don't live up to the hype, the problem is an overly optimistic impression of what the technology can do.

Monday, March 25, 2024

Faith Based

Every year, the Pew Research Center conducts a study of both governmental restrictions on religion and social hostilities involving religion. This year's report made for interesting reading.

The report was at pains to point out that it "is not designed to determine which religious group faces the most persecution." Which was a shame, really. Clearly they understood that religious partisans would be combing the report looking for evidence to back up their claims to being The Most Oppressed, presumably in the service of demanding more resources and protection for themselves. Granted, the report offers the opportunity to indulge in a sense of victimization. It notes that Denmark requires that animals be stunned prior to being killed for meat production, and that this makes it more difficult to obtain Kosher or Halal meat, but it doesn't explain why this amounts to government harassment of a religious group, as opposed to an animal welfare/anti-cruelty measure. Similarly, it calls out restrictions on the ability to claim conscientious objector status (or be exempted from otherwise mandatory military service), or to hold in-person gatherings in the face of public-health orders to the contrary, as examples of government interference in worship. This gives the impression that simply having to follow the same rules as everyone else can be viewed as a governmental restriction on religion.

Likewise, the report seems to code simple disputes between communities that happen to have different religious beliefs as a form of social religious hostility. For example, it was noted that Bolivia's social hostility score went down because "there were no reports coded in 2021 that Protestant pastors and missionaries were expelled from Indigenous communities for not observing Andean spiritual beliefs." (This raises an interesting question: when one group wants to proselytize, but the leadership of another group does not want their community proselytized to, who can claim the hostility? While the expulsion of missionaries seems like a clear case, it's worth noting that for many missionaries, the end of other religious beliefs is their stated goal.) In Nigeria, conflicts between “predominantly” (quotes in original) Christian farmers and Muslim herders are framed as sectarian social hostility, despite the fact that conflicts between herders and farmers have been taking place for nearly the whole of human history.

None of this is to say that the situations and incidents mentioned aren't religiously motivated (especially the expulsion of missionaries) but I did find myself questioning what the expectation of religious entitlement was. Governments enact laws with disparate impacts due to other factors all the time, and fighting between groups is pretty much the one constant to be found in human history. Why people should expect that, for example, only secular buildings should be subject to vandalism, or that clergy of faiths that claim an exclusive understanding of truth would refrain from public criticism of attempts to propagate "incorrect" teachings is never addressed.

Religion is often viewed as being a higher-stakes enterprise than other aspects of one's daily life. If I attempt to convince someone that they might also enjoy building plastic model kits, someone close to them might object on the grounds that it can be expensive or time-consuming. But were I to attempt to convince someone that their deity isn't real, I could be seen as attempting to set them up for a punishing, rather than pleasant, afterlife, or some other form of real spiritual harm. Not everyone believes that all religions are equally valid. (Or, as the late Christopher Hitchens put it, equally demented.)

And that might be the most curious thing about the report. It posits a world in which no-one ever fights over religion; one in which immoral teaching and leading people away from true faith may be possible in the abstract, but aren't seen as worthy of any real-world actions. The stakes are not simply low, they're non-existent. But that's not how religion in the world actually works. And it's unlikely to ever do so.

Friday, March 22, 2024

Springtime

Taken while walking around the neighborhood. Spring came early this year, and settled in to stay.
 

Wednesday, March 20, 2024

A You Problem

I was reading "The psychological battle over trauma" as part of a deeper dive into the phenomenon of "therapy speak," and came across the following passage:

The psychotherapist Alex Howard, author of It’s Not Your Fault, distinguishes between overt trauma, as described by Bonanno, and covert trauma, this less tangible, nevertheless traumatic experience. [...] But this covert trauma, for an increasing number of clinicians, explains why we are the way we are. And through this interpretation, we are moving our conception of mental health away from “what’s wrong with you” and toward “what happened to you?”
The title of Mr. Howard's book is telling, and perhaps points to the root of the problem, at least here in the United States. American society is, in a number of ways, focused on efficiency: how to derive the highest returns from any given set of inputs. But that focus also manifests itself in a drive to decrease the inputs while maintaining the same returns. And labor is one such input.

The material needs of the United States can be satisfied, generally speaking, without needing the entire populace of the nation to work. One could make the case that there is unrealized demand in some or all sectors of the economy, as the United States' high levels of inequality have the effect of suppressing demand at the lower levels of the income and wealth distributions, but as things are currently structured, the United States effectively has an excess of labor capacity. The fact that the United States has a weak system of social supports, given that it is an industrialized and expensive society, means that this excess capacity becomes competition for work. Likewise, technological advances (and differentials in education) have led to that competition being international. The result is that an unemployed American can find themselves competing with workers literally on the other side of the globe for opportunities. For those people who hold opportunities, and thus can distribute them to others, this creates a wealth of choices born of a flood of candidates. And so a means of discrimination is required. And "something is wrong with this person" is as good a means of sorting as any.

Part of the rationale behind the adoption of "therapy speak" is an overt effort on the part of people to say "Whatever flaws you may perceive in me, they aren't my fault. Nothing's wrong with me; something happened to me." This is a sub-optimal viewpoint on the subject, because it buys into the hostile framing of the underlying concern that the "judge" brings to the question, namely: "I have learned this bad thing about you, and it legitimately disqualifies you from the opportunity to work to support yourself." Of course, these sorts of questions extend beyond work; people deploy variants on "It's not my fault" in all sorts of situations, and many of them serve to legitimize what should be understood as the basic problem: the continuous need to find fault with others as a means of justifying the choices one makes concerning them.

Were it up to me, I'd steer society away from its current apparent level of buy-in to a culture of stigma. But I understand that it's a tough sell. While I was never a big fan of Senator Bernie Sanders, I think his perception that one of the primary factors driving prejudice is a sense of scarcity is largely correct. While it's true that there are people out there for whom preventing people from meeting their needs is an end in itself, for many people, the competition for resources pushes them to develop ad-hoc heuristics that sort them into the group of people who are deserving of access, while keeping enough other people out that a perception of shortage is averted. In other words, instead of viewing scarcity as the problem to be solved, resource distribution to the undeserving is the concern. Assigning stigma to others, then, becomes a solution, even if it isn't a good one.

Monday, March 18, 2024

Torched

New York City drugstores are so rife with plastic lockup cases that one crook was forced to use a blowtorch to blast one open, making off with $448 in skin care products.
Retailers pile on new tech to deter theft
I'm going to admit that I hadn't seen that one coming. I understand investing in some amount of equipment and going through some amount of effort in order to get one's hands on something, but blowtorching open a display case for less than $500 in stuff (especially given that it won't sell on the street for that much) strikes me as over the top. But I suppose that it shouldn't. After all, whether one sees retail thieves as done in by economic conditions or systems that have rendered them unemployable, or simply too lazy or venal to find honest work, $450 in "free stuff" is attractive all the same. And blowtorches aren't that expensive.

I'm originally from the Chicago area, and I've been in parts of the city that could teach prisons a thing or two about security. It's strange to walk through a neighborhood where literally every window accessible from the ground has heavy bars to prevent people crawling in, or to go into a fast-food restaurant where the counter sports thick, bulletproof plexiglass, with a turntable through which money and food can be passed. Strange, but apparently not newsworthy.

What I think has been driving the current push of news stories about retail theft is precisely the fact that it's spread from benighted and forgotten neighborhoods on the South Side of Chicago out to the suburbs and downtown areas where wealthier people shop. So it's now confronting people who can profess to be shocked and upset by a state of affairs that other people have been attempting to deal with for some three decades, if not more. And shock and upset drive attention.

Personally, I think this is the sort of thing that calls for solutions journalism, and not simply the solutions of increased surveillance and trading away personal data. But a solution to the problem of... I'd say poverty, but I think it's more a matter of leaving people behind. Of course, that presupposes that there is a solution to a situation that's persisted as long as it has precisely because it works for people. Or, at least, for enough people that the will to pay the price of fixing it isn't there. Anti-theft technology will do a good enough job at a good enough price to be a viable patch on a difficult problem. It's sometimes disappointing that that's all society asks.