Thursday, August 17, 2017

Then and Now

I am not a student of American history. I took the classes that I needed to in school, and here and there I've read some nonfiction about this or that period, but I don't seriously study the history of the United States. There are only so many hours in the day, and they're often occupied by other things - like writing this blog, for instance. And so my understanding of things tends to be shallow, the sorts of things that either everyone knows, or are easily picked up by paying attention to everyday sources.

A few years back, I was in an online debate with a man who claimed that the Presidency of Barack Obama marked the end of the American commitment to the values of Life, Liberty and the Pursuit of Happiness. Falling back on my limited grasp of American history, I offered up the examples of the attempted extermination of the American Indian, the internment of the Japanese and the Chinese Exclusion Act as historical examples of American failures to live up to those ideals. And I stressed that these were things that happened in the past, and that in moving past them, the United States showed that as time went on, it grew more committed to those ideals, and that nothing in the policies of President Obama could be seen as serious moves back to that past.

And this is my standard pattern with such historical events. I tend to leave out the trans-Atlantic slave trade, Jim Crow and the like, mainly because I think that pretty much everyone already knows about them, but also because I don't want to focus on the negative aspects of "my own" history. I see no reason to constantly harp on what happened to "us," as it were. There was enough of that when I was growing up, and a side effect of it was a lack of empathy towards others and a certain need to always win at Misery Poker. And here's the thing about Misery Poker - you have to be miserable to win. And somewhere along the line it occurred to me that I didn't want to always be miserable, just so I could collect whatever dubious prizes Misery Poker offered.

The other day, an acquaintance of mine was holding forth about how terrible it was that someone dared say that the treatment of the Irish in days past (the 1840s being perhaps a good example) was worse than the treatment of Black people today. My acquaintance, and a number of their online friends, didn't even bother with laying out the hands of Misery Poker, but rather decried how anyone could say that anyone but the Black community would win. But being an indifferent student of American history, it seems to me that perhaps, given the choice to be Black today or Irish in 1845, I'd choose to be Black today. (I am, after all, rather enamored of automobiles, air conditioning and the Internet.)

Because here's the thing: The fact that it may have been worse to be nominally White at some distant point in the past does not mean that it is perfect to be Black today. And the commenter hadn't made the point that being Irish in the past was worse than being Black at that same time - merely that the Black population of now is better off than the Irish population of then. And in so doing was pointing out the very thing that I had done some years back - noting the growing commitment of the United States to its ideals of Life, Liberty and the Pursuit of Happiness.

Now, to be sure, I'm not a naïf. I understand, given the context of current events, that the commenter likely meant to minimize the situation of today's Black Americans in the service of painting us as undeserving of the accommodations granted to us. But be that as it may, it still may be true that the modern United States treats its marginalized better than the pre-Civil War United States treated those much closer to the mainstream, but still unfortunate enough to not be within it. A truth put to an unjust end is still true, and should be judged on its merits, not those of the speaker.

Wednesday, August 16, 2017


Among the definitions of "rationalize" put forth by the Merriam-Webster online dictionary are: "to attribute (one's actions) to rational and creditable motives without analysis of true and especially unconscious motives" and "to provide plausible but untrue reasons for conduct." And this fits in with the way Americans normally speak about rationalizing something. In common usage to "rationalize" is seen as a variety of deceptive self-justification; a way of taking an action that one understands to be wrong and filing off the rough edges until it fits (even if this takes some effort) into the slot labelled "Right."

Overall, the general understanding is that rationalization is a bad thing, leading, as it does, to people convincing themselves that wrong is right; accordingly, cautions against it are not difficult to find.

[Art] Caplan draws a wise lesson from the Nazi doctors: Beware the human weakness for moral rationalization. But part of that weakness is the illusion in each of us that we have escaped it.
William Saletan ("Natural-Born Killers" Slate Magazine, 4 April, 2005)
But part of the knock on rationalization is the assumption that people are capable of creating workable moral and ethical frameworks that are fully objective constructs, with no recourse to a person's preexisting sense of what those frameworks should look like. Given that people tend to resist purely mathematical expressions of morality and ethics, this strikes me as a standard that few, if any, will reach.
Now, normative cultural relativism might sound pretty good to you; it does at first to a lot of people. Because it seems like it’s all about inclusiveness and tolerance. Who am I to tell other cultures how they should live, right? But this view actually has some pretty big flaws.

If every culture is the sole arbiter of what’s right for it, that means no culture can actually be wrong. It means Nazi culture actually was right, for the people living in that culture. A dissenting German voice in, say, 1940, would have just been wrong, if it had claimed that Jewish people deserved to be treated the same as other Germans.
Hank Green "Metaethics: Crash Course Philosophy #32"
Mr. Green's statement that normative cultural relativism "has some pretty big flaws" because it allows Nazi culture to be right, at least as far as the Nazis themselves were concerned at the time, could also be understood as a form of rationalization. After all, Mr. Green was teaching a course on Philosophy, not his own personal understanding of right and wrong. Yet he allowed himself to challenge the concept of normative cultural relativism, and by extension all of the Moral Antirealist stances, not based on some inherent contradiction that he noted in them, but because they didn't automatically condemn the Nazis as wrong. In doing so, he allowed himself to work backwards from the desired endpoint of Nazi wrongness to a moral viewpoint, selecting one for himself that agreed with the preconceived notion that some acts are inarguably wrong.

And I'm okay with that, leaving aside the fact that it's poor form as teaching. Because for many people, this is how moral and ethical viewpoints are reached. People understand what they want to be inside and outside of the bounds of acceptable behavior, and they select a moral viewpoint that comports with that. And it's possibly this habit that Mr. Green had in mind (whether he was aware of it or not is a different story) when he confidently told his audience "most people you know – including yourself – are committed to some form of moral realism."

The idea that some items are morally wrong simply as a matter of moral fact can be a comforting and useful one. But there may be some value in the idea of an individual actually deciding to elevate themselves to the post of moral arbiter, rather than outsourcing that to nature, a society or an institution.

In this cartoon, the left-hand speaker, acting as an author avatar, decides that the simple act of holding a flag from Nazi Germany should be considered a form of incitement, and thus ineligible for free speech protections, based on little other than their own moral intuitions. The person holding the flag has no dialog within the graphic: they never speak, and we, as the audience, are never let into their thoughts. Likewise, we never see the flag acting as an incitement; there is no one to respond to it. The two interlocutors who observe it, however, are clearly not at all inspired to recreate the Third Reich - outside of wanting to punch the flag carrier in the face and strip them of free-speech rights, they are not moved to action. In the end, the character with the flag may as well simply be a statue or a projection; they're static and show no impact on the world around them. We are meant to infer the evidence against them from real-world events, which are themselves not referenced.

Because of this, the incitement argument strikes me as a rationalization, and that rationalization is born of the fact that effectively saying "I find would-be Nazis so awful that I think the rules shouldn't apply to them" is somehow off-limits, despite the fact that loudly proclaiming that one would simply punch them in the face is often considered acceptable. The number of people in my social media circles who have done so is substantial. Rather than putting time and energy into mental gymnastics geared towards finding would-be Nazis to always be guilty of incitement, so that acts against them may reasonably be judged self-defense, it seems more constructive to simply declare them persona non grata for being deplored, and when challenged, simply trust in the accuracy of the moral sentiment in question.

Because that appears to be what's actually going on. And it's what always goes on, in part because American society always manages to convince itself that it should be above such things. But that's not how people work. I suspect it's better to be okay with that than it is to hide it under a pile of unnecessary rationalizations. To the degree that people trust their moral intuitions, let them trust them. We'll have a more honest society for it.

Sunday, August 13, 2017

Picking Sides

Evan McMullin, a former CIA officer who ran as an independent against Trump in 2016, had among the strongest condemnations of Trump’s statement from any politician on Twitter, saying Trump’s vagueness about who is to blame signals “positively to the white supremacists whose support he enjoys.”

Trump has been heavily criticized in the past for not doing more to condemn the hate groups that support him, including [former Grand Wizard David] Duke and the Ku Klux Klan, which endorsed him during the campaign in 2016. And his presidential campaign was bolstered by the resurgence of the so-called alt-right and characters like white-nationalist Spencer.

Indeed, Duke later responded to Trump’s statement on Twitter, telling him, “I would recommend you take a good look in the mirror & remember it was White Americans who put you in the presidency, not radical leftists.”
The Hidden Meaning of Trump’s Charlottesville Remarks
The older I become, and the more I come to understand how politics works, the more sympathy I have for politicians. Donald Trump has already admitted to finding the job itself more difficult than he expected it would be. I suspect that he has also learned (even if he has been less forthcoming about it) that the Faustian bargains that one makes in campaigning come with more difficulties than he’d initially come to believe.

As we move farther from the actual election, the fact that President Trump won the Electoral College but lost the overall popular vote becomes less salient, except perhaps to President Trump himself. But as a continuing political matter, it’s still front and center, and not simply because the needling of his Ego prompts the President to look for ways to re-litigate the election via repeated accusations of comically-mistargeted “voter fraud.” White supremacists don’t have to be a particularly large segment of the population to have been the coalition partners who put the President over the top in one or more of the states he carried. And while it’s entirely possible that had they all stayed home, Candidate Trump would have still carried the day, David Duke seems to think otherwise; and the President just might agree with him. In which case, he may not enjoy their support at all, but he needs it. Coming out and laying the blame for the events of Charlottesville, Virginia squarely at the feet of White hate groups, or even simply publicly labelling them as hate groups, may mollify some of the President’s critics for a time, but runs the risk of alienating the people who form the spine of much of whatever leverage the President has left at this point.

Given that President Trump has embarked upon a policy of being the President mainly for people who will directly support him, his fortunes are, at least in the short term, tied to the strength of that support. Accordingly, there’s little benefit for him in undermining that support by agreeing with not only his critics, but his supporters’ critics.

Friday, August 11, 2017

Tolerably Intolerant

With the induction of "intolerant" into the American arsenal of low-grade pejoratives, the term is bandied about with a fair amount of regularity, having settled into an ironic definition of "a person or group of people, who due to their unjustified closed-mindedness, can safely/should be ignored." And when another random argument/shouting match about who qualifies as genuinely intolerant pops up, I'm reminded of this David Horsey cartoon from his days in Seattle.

Political/social arguments about tolerance tend to seem like challenges to a game of Russian Roulette, with each side claiming that the other is intolerant for not cheerfully accepting ideas that can be generally considered as directly aimed at undermining their worldviews, legitimacy and leadership. It's worth noting that this isn't always intentional bad faith. Political rhetoric can be remarkably layered and nuanced, and to those who aren't interested in peeling all of the layers of the onion, the fact that a topic is apparently off-limits may seem arbitrary, rather than serving a purpose. Bad faith abounds in politics and society, however, and so is the basis of many a sneering critique of the opposition's tolerance.

Generally speaking, in American politics, the Left holds to tolerance, which may perhaps be described as a mix of social, political and religious laissez-faire ("If it harms none, do as you will."), as an affirmative virtue. It comes across as less valued on the American Right, except as a defense against the pejorative description of the Right as intolerant. And what appears to drive many arguments about tolerance is a basic disconnect between what each side itself understands as harmful, and what the other side is willing to accept as harmful.

To use Mr. Horsey's cartoon as our example again, a stereotype of the American Right is that they find abortion, alternative sexuality, non-Christianity and Socialism to all be active harms of one sort or another. And while they may concede that these things exist and are unlikely to go away, the stereotypical Right-leaning echo chamber holds that support for these marks a person as perverse to one degree or another, and that if allowed free rein, their agenda will eventually erode the foundations of civil society. Likewise, a stereotype of the American Left is that they find homophobia, militarism and religious fundamentalism/zealotry to all be active harms in their own right. And therefore, the stereotypical Left-leaning echo chamber holds that these items are the marks of a perverse and harmful agenda.

Of course, since, in the end, each seeks to supplant the other, each stereotype sees its own viewpoints through a lens that carries very stringent and narrow understandings of "harm," and doesn't allow simply undermining the other side to count. If it's not making the streets run red with blood, then it does no real harm to anyone. By the same token, as each stereotype sees itself as an affirmative good for society, work and ideas designed to undermine it do come off as harmful. The stereotype of the American Right sees religious pluralism as a path to a world lacking in any moral constraints; while the stereotype of the American Left sees the hegemony of one faith as a prelude to the end of free thought.

It's unlikely that either side will refrain from casting tolerance as the other showing a lack of commitment to what they consider "Truth," anytime soon. Nor will they concede the risk that they demand the other take. Because these things are all part of the script now. And more than anything else, the Stereotypes always stick to the script.

Wednesday, August 9, 2017

Time Marches Slowly

So the firing of Google engineer James Damore for his lengthy treatise challenging Google's diversity programs and the echo-chamber/code of silence that's built up around them has been all over the place this week. And in one (non-public) social media post I was reading on the topic, one of Damore's defenders claimed that "since the seventies" we'd effectively traded one form of discrimination for another, and asked when would it stop?

So - in the 1970s, my mother applied for a teaching position in the town I grew up in - a distant suburb of Chicago. She was rejected, because at this point in the early 1970s, they weren't hiring Black teachers. I hadn't quite started school at this point. I didn't have any Black teachers until college. Because what were the schools going to do? Fire some of the teachers they already had to make room for the people they hadn't been hiring previously? Unlikely.

But it's interesting. Because many of the kids I went to school with looked at the world in the same way, even if it was over a much shorter timeframe. When I was a freshman in high school, the senior class were mostly of an age where they were born in 1964. As in the Civil Rights Act of 1964. The number of my classmates who felt that in less than 20 years, all vestiges of racial and ethnic discrimination in the United States had been wiped away was astonishing. And this idea, that things outside of our own living memory don't matter, takes some maturity to get rid of.

Because would you expect that an organization that started off with no non-White employees would now have an employee makeup that matches the community around it without some continuing measures? Especially given that numerical quotas were not allowed? A new teacher who started working in the public schools of my hometown when I was in first grade could conceivably still be working there. I know that some of my high-school instructors are still with the school (although they've moved up the hierarchy in the meantime). When would you think the last school administrator who was an active participant in "no Black teachers" finally left the district? 1975? 1985? 1995? And let's not forget the last teachers hired under that rule. And in that timeframe, how many people do you think they influenced?

There is an idea that in order to harbor negative stereotypes about one or another group of people, you have to be a snarling bigot, with a pointy hood folded up in the bottom dresser drawer. And that to be influenced by such people, one has to live in a place that modernization took a pass on. It's a convenient stereotype, but not an accurate one. There is a tendency to see people as immune from being influenced by bad ideas unless they somehow show themselves to be monsters. And to the degree that the monsters are viewed as relics of the past, the bad ideas are often assumed to have died with them. Even though it's understood, for example, that there are still people who believe that the Earth is flat... And in my own life, I have been much more likely to encounter people willing to share (and attempt to sell me on) the idea that people's life outcomes are shaped, wholly or mostly, by nature than the idea that the South Pole is a hoax.

In the end, the problem with diversity programs may be that they're not attempting to solve the right problem, because they take, as a starting point, the intractability of scarcity. If you view various "isms" as responses to scarcity that take on lives of their own after being allowed to take root deeply enough, then it seems that the best way to combat them is to do something about the scarcity. And that's a difficult thing to tackle, because so much of our culture is designed to run on scarcity.