Friday, July 31, 2020

The Right Tool

Dina Solman, of Graham, Washington, has a problem. Her granddaughter/adopted daughter Jasmine "has been diagnosed with ADHD, autism, oppositional defiant disorder, disruptive mood disorder and reactive attachment disorder, among others" and "has run away, started fires, kicked holes in the wall, choked her sister as well as Solman and has been suicidal and homicidal."

“I have to walk on eggshells, because I don't know how she's going to react,” Solman said. “When I say something, when I do anything that makes her mad, you don't know [what she’ll do].”

Now, she just calls the police. And waits.
Kids desperate for inpatient psych care have few options in WA
Ms. Solman is left calling the police because there aren't any psychiatric beds available for Jasmine to be admitted to. But are armed officers really the best people to respond when Jasmine goes off the rails? After all, the child is only 11.

And this is the piece that's quickly been forgotten in the "Defund the police" debate. Shouting matches over whether reducing the police presence in minority neighborhoods will result in some sort of renaissance or a descent into lawlessness and chaos are sexy, and allow people to wave their virtue flags high, but a large part of the original point is that the police are often called upon, and funded, to handle situations that don't require an armed response, while more appropriate resources go begging for money. Money is funneled to the police to pay them to deal with situations that they aren't trained for. And what are they going to do with an 11-year-old who's violently acting out? The police don't have inpatient youth psychiatric beds, either. And those beds aren't cheap: Helen Caldart, a special education advocate with the Special Education Advocates League who works to place children in facilities that can manage their behaviors, sees prices that basically start at $10,000 per month and go up from there. That's higher than the median and mean family incomes in the United States.

The lost angle of the "Defund the police" debate is this question: Are we directing resources to the police when we should be directing those resources to other places? Are we expecting the police to be not only specialists in their own areas of expertise, but generalists in a variety of other areas where specialists are needed? And it's not a given that this is the wrong way to go about it. If defunding 10 police officers only pays for 10 specialists when 25 are needed, going with the 10 police officers might be the best of the available options. But the current debate, driven by what comes across as a combination of anger and fear, seems incapable of addressing the question at this point, caught up in the perception of a high-stakes battle over right and wrong.

Thursday, July 30, 2020

Walled Off

"Getting your job application through computer firewalls" is an interesting title for a post, as it draws a parallel between computerized applicant tracking systems and a firewall, despite the fact that they have very different functions. But what's interesting about the title is that I suspect it speaks to something that many job seekers would tell you: They feel the relationship between businesses (or, at least their human resources departments) and candidates for employment is becoming adversarial. ATS = Firewall implies that businesses see job seekers as dangerous, something that needs to be kept away from the valuable and fragile inner workings of the company. Rather than being a way to manage the potentially large numbers of applications for open roles that a largely internet-based application system enables, job seekers have come to see applicant tracking systems as deliberate stumbling blocks placed in the way of people who need work, whether that's because it makes it easier to hire preferred candidates, to prevent sympathetic humans from helping candidates bypass unreasonable requirements or simply to shut people out of desirable jobs.

And while the article itself never revisits the ATS as firewall comparison, the fact that it's so prominent in the title seems deliberate. After all, LinkedIn is not above clickbait, and nothing drives clicks like a catchy headline.

Worth It

FYI, Jeff Bezos has added $62.7 billion to his net worth this year.

Amazon also cut $2 an hour hero pay for its 400,000 warehouse workers and doesn't give them paid sick time off. The cost of a year of hero pay and 2 weeks paid sick time is $2.15 billion, or 3.4% of his net worth gains this year.
I suspect that you can see where this is going. Dan Price, the CEO of Gravity Payments, goes on to note that he's drawing no salary this year. But while it's good for like-farming on LinkedIn, it's not an apples-to-apples comparison. Note that it doesn't say that Amazon paid Mr. Bezos $62.7 billion this year, only that his net worth increased by that much. For all we know, Mr. Bezos owns stock in Gravity, and some of his increase in net worth comes from their ongoing success. Likewise, just because Mr. Price isn't taking a salary doesn't mean that his net worth hasn't increased. After all, many assets appreciate over time. Like the business he runs.
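For what it's worth, the arithmetic in the quoted post roughly checks out; the issue is the comparison, not the math. A quick back-of-the-envelope sketch in Python, where the 40-hour week and the roughly $15-an-hour base wage are my assumptions rather than figures from the post:

# Rough check of the figures quoted above; hours and base wage are assumed.
workers = 400_000              # warehouse workers cited in the post
hero_pay = 2 * 40 * 52         # $2/hr "hero pay" over an assumed 40-hour week, for a year
sick_time = 15 * 40 * 2        # two weeks of sick time at an assumed $15/hr base wage
total = workers * (hero_pay + sick_time)
print(total)                   # 2_144_000_000, i.e. roughly $2.15 billion
print(total / 62.7e9)          # ~0.034, or about 3.4% of the net-worth gain

Which is to say, the quoted figures are plausible as stated; the sleight of hand is in setting ongoing payroll costs against a one-year change in net worth.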

I don't know Mr. Price, but I've heard a lot about him. His pledge to make the minimum wage at his company $70,000 a year made headlines. Things have been a lot more low-key recently; the high-profile naysayers who predicted Gravity Payments' demise have moved on to other things. And perhaps this provides the motivation for Mr. Price to keep his name out there on LinkedIn.

But the disingenuous way he does so seems to be more about tapping into anti-corporate sentiment than genuine differentiation. Of course, Gravity Payments doesn't directly compete with Amazon; in that sense, there's not much differentiation to be done. Even so, it's possible to make legitimate observations about differences in business models, and compensation models, without resorting to misleading rhetoric.

Monday, July 27, 2020

Style and Substance

The thing that stood out for me about the one Black person among the anti-racism protestors was that he'd created a sign that could be easily read from a distance. The thick, outlined lettering actually had enough heft to it that it wasn't swallowed by the background. And this allowed it to be legible from a decent distance away.

I see this sort of thing a lot when people create signs and hold them up on street corners. They're concerned with projecting their message, but don't know how to ensure that it can be understood. But no matter how strong the signal, if it can't be received it may as well have never been sent.

Given that it's election season here in Washington, there are campaign signs everywhere, and they offer an interesting contrast. While some of the candidates' signs are amateur productions and lack polish, on many of them you can see everything that the candidates, and their campaigns, wish you to see.

Sunday, July 26, 2020

Awwww... Freak Out!

“Some Republicans are much less freaked out by the virus than they were a few months ago,” said Marc Hetherington, a political scientist at the University of North Carolina who is tracking Americans’ perspectives of the coronavirus through a panel survey. “But things are changing so quickly — these new outbreaks could scare them and maybe some of that polarization disappears.”
Republicans And Democrats See COVID-19 Very Differently. Is That Making People Sick?
One thing that has become something of a pet peeve of mine about the SARS-CoV-2 outbreak is the degree to which "freaking out" or "being scared" are equated with having a clear-eyed view of the risks involved and having an appropriate tolerance (or intolerance, as the case may be) of those risks. Mainly because fear and freaking out are, more or less by definition, the opposite of being clear-eyed and rational. In the same way that "outraged" and "attentive" are not synonyms, "concern" and "acknowledgement" aren't either. When the Pew Research Center asks people: "How concerned, if at all, are you that you might spread the coronavirus to other people without knowing that you have it?" that's a valid question. But it's not the same question as: "How likely, if at all, do you believe it is that you might spread the coronavirus to other people without knowing that you have it?"

While American English tends to use "I am afraid/concerned X will happen" to mean "I think there is a likelihood that negative event X will happen," those two statements are independent of one another, and each of them can be true when the other is not. After all, people admit to having unreasonable fears (say, of being bitten by a typical household spider), or express acceptance of risk (driving frequently), all the time.

In any event, I'm not particularly concerned that the cohort of the American public that understands the risks and takes them in stride is large enough to make the numbers that Pew gathers somehow fundamentally inaccurate. And as for the idea that it's become normalized to see fear as the only rational response to an unknown (and for many of us, myself included, unknowable) risk, well, that ship has already sailed. (I'm pretty sure it collided with an iceberg, and is now resting comfortably at the bottom of the ocean.) Like I said, this is a pet peeve of mine. And that peeve is that it's hard to express nuance in a situation in which nuanced language is commonly squeezed out.
Some surveys offer a glimmer of hope, suggesting that the partisan gaps in how people are actually behaving — whether they wear a mask, for example — are much narrower than the divides on questions about what they think the government should do in response to the virus or whether the worst is behind us.
Why wouldn't one expect that measures of personal behavior would differ from questions of what someone else may or should do, or what the future will hold? A person whose head is under water is going to hold their breath, whether they expect to breach the surface again in five seconds or in one minute. Treating the finding presented above as a glimmer of hope strikes me as an overly pessimistic view of the way partisanship works. Whether or not a person takes precautions against an event happening can be independent of their understanding of the likelihood of the event. I keep a fire extinguisher (more than one, actually) in my apartment. I have never, in my life, been in a situation in which a home that I was in, mine or anyone else's, has been burning. I expect that this will continue to be the case, likely indefinitely. But having the ability to put out a small fire is better than not having it, and the cost was low. And, more to the point that I'm making here, if a journalist were to see the fire extinguishers and conclude, from that alone, that I expected a fire to break out in the foreseeable future, they'd be laughed at. There would be an expectation that if they wanted to understand what I believed the risk of a fire to be, or my emotional state concerning same, they would have to ask me more directly. Apparently, that courtesy doesn't extend to the current virus outbreak.

Thursday, July 23, 2020

Inattention

I have to admit that I've been paying even less attention than normal to the news recently. The steady diet of infectious disease and racial injustice news has become repetitive, and it seems to crowd out, or inject itself into, everything else. Of course I understand that for many people, this is exactly what they're interested in. After all, this is why "doomscrolling" is a thing. But I'm not in that demographic, and so the constant stream of stories holds no interest for me. Mostly because they've already reached the point where there's little new information in them. What's happening in the world of the coronavirus today, or what random things happened at the most recent protest, just isn't useful. It feels more productive to catch up on all that perhaps once a week, after enough new information has been gathered that stories at least appear to be informative.

But being the sort of person who likes to read, and who likes to feel at least somewhat informed about the world, this has left me feeling a bit out of the loop. So I've been bouncing from thing to thing, trying to pick up on some of all of the other things that have been happening in the world. Which is an interesting experience. Still, given that most of what passes for news is "for entertainment purposes only" (because I'm a cheapskate), I'm not sure that I found it an informative one. But no news cycle lasts forever, and I'm certain that I'll find myself being nostalgic for the current ones, given enough time.

Tuesday, July 21, 2020

Birmingham Screwdriver

"We're sending law enforcement," [President Trump] told reporters. "We can't let this happen to the cities."

He specifically named New York City, Chicago, Philadelphia, Detroit, Baltimore and Oakland in discussing problems with violence.

"We're not going to let this happen in our country, all run by liberal Democrats."

Mr Trump also praised the controversial federal law enforcement efforts in Portland. The city has seen protests against police brutality since George Floyd's death in Minnesota in May.
"Portland protests: Trump threatens to send officers to more US cities." BBC News.
To be sure, President Trump is not alone in feeling that things may be spiraling out of control in some of America's cities. I'm a native of Chicago, and when I spoke to a relative there a few days ago, she expressed alarm at the current level of violence in the city. But the President's methods and rhetoric give many people, myself included, the idea that he's relying on a political instinct that sees the differences between people, and the disagreements those differences spark, as an asset. In other words, President Trump operates by finding disputes, and choosing a side. The question becomes to what degree he, either intentionally or instinctually, stokes those disputes, in order to make the distinctions between the factions sharper.

While it's become common, especially outside of the President's base of support, to hint, if not outright state, that the President likes situations such as the continuing protest movement that has taken hold since the death of George Floyd, because it presents a convenient "distraction" from his inability to deal with the ongoing SARS-CoV-2 outbreak in the United States, I would suggest that the President has had difficulty finding his footing on the pandemic precisely because it isn't as amenable to "divide and conquer," as it were, as the protest movement is. It's well understood that the President spent time, especially early in the pandemic, looking for a party to cast as the enemy: whether that was China, whom the President accused of either creating or failing to suppress the virus, or Democrats and other left-leaning political factions in the United States, whom the President accused of perpetrating a damaging hoax, designed to remove him from office (and thus prevent him from protecting and advancing his supporters and their interests). And while it may seem obvious that a plan to combat a global pandemic by treating it as a conflict and then picking a side would be doomed to failure, I'm not sure that it is.
I suppose it is tempting, if the only tool you have is a hammer, to treat everything as if it were a nail.
Abraham Maslow, The Psychology of Science, 1966
One of the definitions that Wiktionary supplies for "Maslow's Hammer," as it is sometimes called, is this: "If a person is familiar with a certain, single subject, or has with them a certain, single instrument, they may have a confirmation bias to believe that it is the answer to/involved in everything."* And while President Trump's seemingly endless search for parties to vilify in situations where it seems inappropriate is often attributed to malice, I suspect that it's simply that confirmation bias at work. President Trump's consistent record of success (okay, I realize that many people would dispute how "successful" the President has been in life) with finding disputes and picking a side has demonstrated to him that it is a useful tactic. After all, it's pretty much what enabled him to win the Presidency of the United States, despite having less support than Hillary Clinton. So it's unsurprising that, as President, he's continued to employ the tactic.

Part of what makes the President's personal version of Maslow's Hammer workable for him is that he only concerns himself with the interests and opinions of whichever side he's chosen to ally himself with. And so to the degree that they come out ahead, it can be chalked up as a victory. And when they lose, the other side is always there as a convenient scapegoat, open to charges of cheating and other moral failures. But it's also worth keeping in mind that he doesn't actually have to share the worldview or interests of the faction that he's chosen. A panderer doesn't have to be a true believer. But this can also bite him.

In the case of George Floyd, President Trump started out by condemning the injustice done. He pivoted somewhat, later, once the protests started gaining steam, and his base of support began to see the reaction as lawless, rather than justifiably angry. In the case of the SARS-CoV-2 outbreak, the lines are a lot less stark, and there is much less of an idea that a crackdown on political enemies will solve anything. The United States has seen somewhere in the area of 130,000 confirmed deaths attributed to the outbreak in the past four or so months. It's unclear to me, and I think to a lot of people, how a clear partisan victory somewhere along the way would have reduced that number. And so the President's habit of treating the outbreak as a dispute between people simply makes him appear incompetent to those people who are not disposed to believe that the numbers are falsified.

The protest movement is a logical choice for the President to focus on because, while the social justice issues involved are not nails, the protests they've spawned are more nail-like than the pandemic. And thus the President's habit of pounding away at problems is more likely to work. Well, for some definition of "work," anyway.

* The fact that "To a man with a hammer, everything looks like a nail," is commonly attributed to Mark Twain, even though there's no extant Samuel Clemens work that includes it, may itself be an example of this. To the person for whom Mark Twain is the source of clever quotes, every clever quote is attributable to Mark Twain.

Sunday, July 19, 2020

As It Suits

There have been a number of consequences of the SARS-CoV-2 outbreak in the United States. Shortages of food, clothing or housing, however, are not yet among them. Now, this may only be a matter of time, but it hasn't happened yet. For all that the news media describes the current unemployment rate as "unprecedented" (their favorite word of the past few years), even with a sizable minority of the workforce sidelined, there is still enough labor to go around that the necessities are covered.

It's the people who work in discretionary industries who find themselves in precarious positions.

This is a fairly straightforward side effect of what division of labor looks like in an industrialized society. The days where a clear majority of people needed to be agricultural laborers just to keep everyone fed are long past. Most jobs, even those that we tend to think of as necessary, actually lie outside of the set of occupations that a society absolutely needs to keep its population alive.

While it's common to attribute modern poverty to capitalism, the reliance on discretionary spending is the more immediate culprit. The fact that providing an income to those who can't find productive work in a "free market" system is optional may be a problem, but the fact that productive work depends on the whims of people having disposable income is independent of the economic system that a given place may have adopted.

Friday, July 17, 2020

[Bracketed]

A couple of weeks ago, a person on LinkedIn asked a loaded question: "If the phrase 'Black Lives Matter' bothers you, why?" Most of the responses, to be sure, had more to do with the movement or the organization that people perceived to be behind it; "I don't have a problem with the phrase itself, but..." was a common opening to people's answers.

But in sorting through the responses, there were a number of people who did address the actual question being asked, and it confirmed my suspicion that while the actual slogan is "Black Lives Matter," what people often heard could be understood as "[] Black [] Lives [] Matter []." And the text that they understood belonged in one or more sets of those brackets was telling. I made some notes, based on the various responses, and boiled them down to a number of recurring themes, below:

  • Black Lives [only] Matter [when White people are doing the killing.]
  • Black Lives [don't] Matter [to Whites because they're still the bad people they were during Jim Crow.]
  • Black Lives Matter [because they're different from other lives.]
  • Black [as opposed to other oppressed/marginalized peoples'] Lives Matter.
  • [White people have to be told] Black Lives Matter [because they otherwise wouldn't figure it out.]
  • [There are people who actually believe that] Black Lives [don't] Matter.
  • [In order for] Black Lives [to] Matter [the United States needs a new President.]
  • [In order for] Black Lives [to] Matter [United States' society needs radical change.]
Of course, these aren't the exact wordings that people used in their responses. This is, to be sure, my understanding of the gist of things. And I'm not going to claim that the various authors would wholeheartedly agree with my paraphrasing of their points.

The main value of the exercise for me was to understand what people felt was being asserted by the phrase "Black Lives Matter." Being only three words, it can be understood to be somewhat ambiguous, and in such cases, we would expect people to fill in the gaps. Of course, as I've filled them out here, many of these statements come across as straw man arguments. I'm sure that one would be hard-pressed to find a Black Lives Matter activist who would tell you that any of these was first and foremost in their minds.

But in reading what people wrote, and in trying to understand what it may have said about their thinking, I believe that I've managed to give myself a little insight into how people understand the world as it exists around them. And that's always a helpful tool in coming to agreements. Or in agreeing not to.

Tuesday, July 14, 2020

Means To Ends

But in our modern world, a world built on community, connection and the magic that comes from combining ideas, the opposite is true. When people deprive others of education and opportunity, they’re not helping themselves, they’re depriving themselves of the benefits that would come from what others would end up contributing. We don’t benefit from treating others poorly, we pay for it.
Seth Godin "Undoing the toxic myth of exclusion and scarcity"
And to Mr. Godin, I would make the same point I raised a week ago in response to Federal Reserve Bank of Atlanta President Raphael Bostic. But there is also something else here. Opportunity hoarding has different effects on the collective or community than it does on the individual. When a particular family helps one of its members secure a well-paying job and guards that pay by ensuring that such roles remain scarce, they, as individuals, most certainly benefit from treating others poorly. Sure, the pie might be bigger if more people could access opportunities to contribute "in the form of work product and innovation," but the family's relative slice would be smaller. And if that's how they're measuring benefits versus costs, then their actions make sense.

While a lot has been made of the idea that, to quote an academic abstract, "quantitatively, changes in relative income have much larger effects on happiness than do changes in absolute income," when it comes to happiness, it's understood that it isn't all about money. And in that sense, perhaps appealing to "the benefits that would come from what others would end up contributing" is barking up the wrong tree. If the main benefits of opportunity hoarding that people are after are something other than, or in addition to, simple material gain, then the question becomes whether those other things are also scarce. Some are, by definition; any race may only have one clear first-place finisher. Others, not so much, but then how does one understand the expansiveness of the supply?

Posts like this are predicated on the idea that if people genuinely understood the world around them, they would behave differently, because they'd have a better understanding of the best ways to reach their expressed preferences. But that, in turn, rests on the idea that people don't actually understand the interactions between their goals and the world. I'm not so sure about that part.

Sunday, July 12, 2020

Tiny Church

A very small chapel, along the roadside. It's maybe 6 by 10 feet. But it has pews and a lectern, in case you want to have a service for everyone in your car.

Friday, July 10, 2020

Only Three Words

If you are a child of God, you are my brother and sister. I have family of every race, creed and ideology. We must ensure #blacklivesmatter doesn’t morph into #blacklivesbetter
Terry Crews. Twitter.
In the same way that much of the debate over "Black Lives Matter" appears to be over whether it means "Black Lives Matter (as much as anyone's)," or "Black Lives Matter (more than other lives)," one can note that "Black Lives Better (than they are now)" is not the same as "Black Lives Better (than everyone else's)." A call for improvement is different than a call for superiority. It's worthwhile to keep in mind that slogans are sound bites, not substantive policy proposals. After all, each of these are only three words.

American society has, for much of its history, operated on a policy of cost-shifting; the mainstream (or, if you prefer, the power élite) has enjoyed the benefits of a growth in access to resources and wealth in part by shifting the costs of those gains to others. Whether that was the practice of slavery, denying property rights to the native population or open discrimination against immigrants, leaving others to hold the bag was commonplace. (Although it is worth noting that "commonplace" is not the same as "widely acknowledged" or even "generally understood.") One can imagine that a side effect of this understanding, especially coupled with an unwillingness to see the people who benefited from (and perhaps even drove) such policies as deliberately Evil, is the idea that this is simply part of Human Nature; "Power oppresses," as it were, "and absolute power oppresses absolutely."

It's recently become trendy to say: "When one is accustomed to privilege, equality feels like oppression." And the general intent is to paint the formerly privileged as seeing others manage to reach the same level as a form of injury. But if it's understood that "privilege" is the byproduct of oppression, and "equality" means that the shoe is now on the other foot, the comparison may seem to be more apt. In other words, it's one thing if the pie becomes bigger so that the people who had small slices now have larger ones. But if the larger slices are divided up and handed out to those with less, that may feel more legitimately like a wrong.

I recall this quote from a voter in New York in the run-up to the 2008 Presidential Election:
I don't want to sound racist, and I'm not racist. But I feel if we put Obama in the White House, there will be chaos. I feel a lot of black people are going to feel it's payback time. And I made the statement, I said, "You know, at one time the black man had to step off the sidewalk when a white person came down the sidewalk." And I feel it's going to be somewhat reversed. I really feel it's going to get somewhat nasty. Like I said, I feel it's going to be - they're going to feel it's payback time.
The idea that a society moves from a state where different groups prey upon one another to a state where they cooperate with each other depends on a sense that humanity is capable of making noticeable (and somewhat even) moral progress. People need to be able to both forgive and accept responsibility for past wrongs. If, instead, people believe that humanity simply moves through cycles of different groups ascending and declining, ascendancy is something to be celebrated, but decline is something to be feared, because it means that the bills for all of the abuses of that prior ascendancy are going to come due, and only remaining on top allows the pain (justified or not) to be avoided.

Wednesday, July 8, 2020

Bad Ideas

Harper’s Magazine will publish, in its October 2020 issue, A Letter on Justice and Open Debate signed by a number of writers and academics. One of the key points that it makes is as follows:

The way to defeat bad ideas is by exposure, argument, and persuasion, not by trying to silence or wish them away.
I suspect that the audience that people perceive as needing to read this (which may or may not be the target audience of the letter) will find it unconvincing. There could be multiple reasons for this.

The first could very well be that it hasn’t yet been shown to work. For the person who believes that exposure, argument and persuasion have yet to fully realize “We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness,” after 244 years, there has to be some rationale given as to why today is so different. What about people has changed such that the bad ideas that survived being exposed, argued and persuaded against for all that time are now suddenly vulnerable?

The second might be a disagreement on the nature of "bad ideas." To a degree, the point behind the marketplace of ideas is that there are no clearly wrong ideas. A "bad idea" may be one that doesn't suit the moment, or doesn't meet people's needs, but those are different from being openly harmful. In the marketplace metaphor, people knowingly peddling "attractive nuisances," in the sense of things that are actively dangerous yet outwardly desirable, are rare to non-existent.

In Americans Aren't Practicing Democracy Anymore, in The Atlantic, Yoni Appelbaum (who is, interestingly, not a signatory to the letter) notes that "'Democratic government, being government by discussion and majority vote, works best when there is nothing of profound importance to discuss,' the historian Carl Becker wrote in 1941." People don't commonly think of high-stakes moral issues as being "nothing of profound importance." In order to push people towards being more accepting of discussing things in the marketplace of ideas, it's not enough to call for that discussion. People have to also understand that losing the debate is an acceptable outcome. I'm not sure that one would find widespread agreement with the statement "People are generally united in their answers to important ethical questions." And even that presumes that people are united in their understanding of what the important ethical questions are.

When Karl Popper described the Paradox of Tolerance (“In order to maintain a tolerant society, the society must be intolerant of intolerance.”) he noted: “I do not imply for instance, that we should always suppress the utterance of intolerant philosophies; as long as we can counter them by rational argument and keep them in check by public opinion, suppression would be most unwise.” But this does not itself offer a test that can be applied to determine which intolerant philosophies can be countered by rational argument and/or kept in check by public opinion. A universal and unwavering commitment to a norm of open debate and toleration of differences would imply that it is at least somewhat self-evident, or otherwise clearly established, which ideas are objectively bad.

If one actually believes in morally impermissible ideation, then a reliance on exposure, argument and persuasion must be seen as an infallible route to preventing its adoption. If failure is unacceptable, then it must be impossible. And history argues against that interpretation. In a society in which people commonly refer to their fellow citizens as unintelligent, undiscerning and/or unethical, I'm not sure that it's reasonable to presume that people will conclude that bad ideas are doomed to failure and lack of adoption simply because they are bad. While it's true that a culture of ideological purity may sanction some people who do not deserve sanctioning, as long as the stakes are high, their sacrifice will be seen as an acceptable price.

The Enemy of the Good

The Atlantic's David Graham has outlined the backlash to the way the Trump Administration has implemented the Paycheck Protection Program. In short, the legislation was written with significantly broader eligibility requirements than many would like, and this has triggered withering criticism that the PPP has enriched the already wealthy while ignoring "the people." In the end, Mr. Graham concludes: "The backlash against a successful government program is why the United States can’t have nice things."

I would disagree with this assessment, although perhaps not on the merits. It's true that there appears to be an expectation that government can, and therefore should, perfectly calibrate the assistance it gives, so that only the "deserving" are aided. (This is something that predates the PPP. And it's valid to be critical of the fact that when it comes to individuals, there seems to be a greater willingness to err on the side of withholding assistance.) And it's true that widespread criticism of government missteps tends to lead to an ethos of "don't just do something, stand there;" if only the perfect is at all acceptable, then nothing will happen until the people in charge of crafting it are convinced that it is absolutely above reproach. But in that, the backlash is a symptom, not the disease. The United States can't have nice things because it doesn't want nice things; it only wants perfect things. And it believes that it can get them. If only the perfect is an acceptable alternative to the status quo, then the status quo will tend to reign, because the perfect is difficult to attain in a world in which people themselves are commonly, if not universally, imperfect. If sweeping legislation must be quickly written and enacted while also complying with a specific, and often complex, understanding of the demands of justice, or be otherwise panned as worse than having done nothing, doing nothing becomes a very attractive option.

To a certain degree, having nice things requires accepting at least some of what one has as "nice." Being able to compare a real item, be it a physical item like a car or a more abstract item like legislation, to an ideal is a recipe for disappointment, because ideals, by definition, are freed from the requirement that they operate in the real world and be implemented by real people. They can simply be called into existence, and assumed to be workable. And if unexamined, idealized conceptualizations of items become the definition of "nice," then nice things will become vanishingly rare, by any standard. Because the nice things that pass reality's muster won't measure up, and the ideals are unlikely to ever be anything other than ideals.

Monday, July 6, 2020

Dragging

Systemic racism doesn't just hurt individuals and limit their opportunities — it also creates a lasting drag on the economy, writes Atlanta Fed chief Raphael Bostic, the first Black president to lead a regional Fed bank.
Fed examines racism's economic toll
Unfortunately, it's not much of an examination. While A Moral and Economic Imperative to End Racism does note that racism means that people impacted by it contribute less to the economy than they could have otherwise, that simple observation seems fairly straightforward. It's fairly easy for anyone to claim that someone languishing in long-term unemployment would be contributing more in a well-paying job.

But it seems unlikely that, if the United States actually needed greater economic contributions "in the form of work product and innovation" from its citizenry, it wouldn't have found a way to put its currently excess labor force to work and realize those contributions. The problem that racism causes for a national economy isn't that it lowers GDP. If anything, it's that the discontent that a skewed distribution of poverty creates siphons resources away from more productive uses. Between police overtime and insurance claims for burned-out vehicles and buildings, there are clear expenditures that would have been better spent on more productive activities. But while the Glazier's Fallacy might state that the money spent replacing cars is poorly spent overall, the simplest answer to that is that if the protestors would simply accept their lot, there would be no need for added expense. And even so, it's a shift of economic activity, rather than a straightforward drain. And so the idea that systemic racism is a drag on the economy is not as clear-cut an idea as it might seem.

Because the demand for labor, goods and services is not infinite, racism doesn't have to mean that a society is willfully leaving money on the table, simply to spite some or another sector of the populace. If there is a presumption of rationality, then growing the demand for people to contribute work product and innovation is how one defeats racism. And if there isn't a presumption of rationality, then pointing out a hit to GDP is unlikely to work in the first place.

Saturday, July 4, 2020

Reliance

The SARS-CoV-2 pandemic has prompted fairly radical measures from governments worldwide as they attempt to slow or stop the spread of the disease while medical science applies itself to the task of searching for a vaccine. There has been some amount (although not as much as one might think from watching or reading the news) of pushback against such measures here in the United States, from people who cite the American values of freedom, liberty and independence as reasons to do as they will.

This, in turn, has led to what seems to be a cottage industry of articles that seek to define (or redefine) these terms in relation to community and one's responsibilities to the community. Which I understand. When people live in communities such that an individual's actions can have consequences for the whole, there is a push for people to sublimate their individual interests in favor of those of the group.

Generally speaking, when it comes to an individual, terms like freedom, liberty and independence don't necessitate being completely cut off from groups of people. But they do mean that the associations are voluntary. And for most people, that simply isn't the case. It's exceedingly difficult for a person today to be genuinely independent of other people; the infrastructure that supports most people is expansive. And this makes it not only the work of many different individuals, but nearly impossible to truly escape. While it's not impossible to be a hermit, and create a life for oneself completely "off the grid" and self-reliant, it's a lot more difficult than it was 200 years ago.

When David Brooks set off a teapot tempest with his article "The Nuclear Family Was a Mistake," he was clear on what he felt were the advantages of an older model that offered greater social connection. What he glossed over were the reasons why people walked away from it. The piece starts with a recounting of a scene from the Barry Levinson film Avalon. "The big blowup," we are told, "comes over something that seems trivial but isn't: The eldest of the brothers arrives late to a Thanksgiving dinner to find that the family has begun the meal without him." Said eldest brother bridles at the disrespect, and Mr. Levinson confirms to Mr. Brooks that it was, in fact, disrespectful for the family to start eating prior to the brother's arrival. Nothing is mentioned about whether it was disrespectful of the brother to not ensure that he was on time.

The American colonies sought independence from Great Britain because they felt that the relationship was one-sided. Great Britain expected its overseas holdings to meet their obligations, yet, in the eyes of the colonists, the Crown shirked its responsibilities to its faraway citizens. One expects, of course, that King George the Third and his court saw things somewhat differently.

Such is the case today. It's all fine and good to lecture people on their responsibilities to the collective. But people are going to push back against that if they believe that the collective feels no responsibility to them. Some of this is going to be inevitable, and some is going to seem distinctly unreasonable; there is always someone whose definition of "tyranny" is having the majority vote against them on where to order an expensed office lunch. But the collective ignores the discontent of its constituents at its own peril. One can imagine that Great Britain would have been much better off had it maintained the fledgling United States within the greater Commonwealth. But it is the habit of people to be more attuned to what they feel they are owed than to what others may understand their debts to be. And as the United States slides further and further into factionalism, one of the side effects is that people no longer believe that the nation as a whole feels any need to look out for them. And for some, being abandoned is also a form of independence.

Belonging

I think I was in grade school when this first occurred to me. I understand that if a pre-teen can have this insight, it might seem blindingly obvious, but, as my father always told me, "'Obvious' is something so crystal clear that you're the only person who sees it." So I'm discussing it here today.

When I was in grade school, I had a lot of prejudiced classmates. Of the sort that we would likely call "ethno-nationalist" today. Americans were simply better than non-Americans, and Whites were better than non-Whites. They, as White Americans, were therefore better than everyone else. And their rationale for being better was that they could rattle off, from memory, lists of White Americans who had accomplished Great Things, and their argument to others was that the inability to rattle off a more impressive list was proof that they weren't as good.

"Well, sure," I would say in response. "But what have you done, that makes you as good as those people?" About this point is where tempers would flare.

But to me, the point was clear; people judged themselves as better than other people based on membership in a group that had important members, rather than having actually done important things themselves.

The difference between a third-grader and a thirty-year-old in doing this is noteworthy, but perhaps not critical. The thirty-year-old may be convinced that their membership in a particular nationality or ethnicity means that their current less-than-ideal circumstances are the result of being cheated of their birthright. For the third-grader, it's the establishment of the idea that they have a birthright, and that others should respect that. Both are ways of dealing with scarcity, but for children, that scarcity may be only an idea, a concern for the future.

Thursday, July 2, 2020

Past the Post

One of the weird things about the United States (although I suppose that it may be true for other countries, as well) is an apparent tendency for people to view competitive situations as if they were being judged against static benchmarks, rather than how other people perform.

This often comes up when talking about job hunting. Or hiring, for that matter. There is an impression that for people who can pass a given benchmark for competence, demand for their services is effectively unlimited, and that therefore, not being chosen is an indicator of a lack of competence. But this isn't the way the hiring process works, especially now, when it's an employer's market. An employer may list a certain set of requirements in order to be considered, but I've never come across an employer that has committed to hiring everyone who meets the bar. There is almost always a set number of openings to be filled, and once the interviewing is done, people are ranked until the open roles are filled. In an employee's market, it's possible that employers lower the requirements, or take the top person(s) interviewed, but then again, the stated requirements for the role are somewhat secondary.

This isn't a new phenomenon, or one limited to employment. For all sorts of areas in which a certain level of exclusivity is desirable, but the appearance of élitism is not, there is a tendency to act as though entrance relies on attaining a simple benchmark, rather than being the first to obtain a scarce open spot. I suspect that it has something to do with the somewhat common idea that "average" or "commonplace" should be considered a synonym for "hopelessly mediocre." But in the end, it's likely just another manifestation of a desire to see the world as just. If success is available to unlimited numbers of people, so long as they can reach some or another benchmark, then people can congratulate themselves on their success, and convince themselves that anyone else can do it.