Thursday, March 29, 2018

Work For It

So I came across an article, provocatively titled “The War on Work—and How to End It.” In case you couldn't guess, the article has a fairly conservative bent and, as such, rather neatly lines up with conservative orthodoxy when it comes to its subject - how to get more people back into the labor force.

Consider the following knock on the minimum wage: "Advocates of a $15 minimum hourly wage, for example, don’t seem to mind, or believe, that such policies deter firms from hiring less skilled workers."

And this complaint against unemployment benefits in general: "Increasing the benefits received by nonemployed persons may make their lives easier in a material sense but won’t help reattach them to the labor force. It won’t give them the sense of pride that comes from economic independence. It won’t give them the reassuring social interactions that come from workplace relationships. When societies sacrifice employment for a notion of income equality, they make the wrong choice."

But what's interesting about this is that Professor Glaeser doesn't seem to think that less skilled workers should just be cast adrift; after all, he notes: "Nominal wages actually fell over both the 1870s and the 1890s because workers had to accept low pay. With no government safety net, long-term unemployment meant deprivation—or even death." So what's the answer?

Making work pay needs one final, major policy initiative: wage support, which would replace the [Earned Income Tax Credit]. The EITC had the right overall idea, but it is cumbersome and indirect. Instead, the federal government could simply provide pay to increase the earnings of minimum-wage workers by a fixed amount—say, $3 per hour. Consequently, a worker paid $7.25 would take home $10.25 hourly, with the difference paid for by taxpayers.
This raises a question for me - are workers not going to be told where that extra $3 an hour is coming from? Because I don't see how someone who's miserable due to lacking "the sense of pride that comes from economic independence" while drawing unemployment, disability insurance or just depleting their savings is suddenly going to find it in a job that only pays enough to live on because taxpayer dollars have been reduced in amount and given a different name. And this becomes a subsidy to employers, who can then push wages down, knowing that someone else will make up the difference - which Walmart (among others) has already been accused of doing. And since "Such a program would be expensive, so it should be matched with spending reductions for other social services," if something happens to this poorly-paid job, the guy is in real trouble. (And I say guy, because the focus of Professor Glaeser's piece is "prime-age" {between 25 and 54} men. I have no idea what he has in mind for women and children.)

It's worth noting that Professor Glaeser believes that while long-term unemployment in a welfare state doesn't mean deprivation to the degree that it used to, it can still mean death - he links high rates of unhappiness and suicide to unemployment. But he feels that you can't realistically do away with both, so he'd rather risk deprivation. Which is logical. But the insistence that inequality not be considered a factor in the current dismal state of affairs strikes me as ideological.

I detect (correctly or not, I am uncertain) that old conservative trope of "Makers and Takers" at work. There is a clear presumption that, with the right incentives, entrepreneurs can come up with enough work that requires physical presence in (or near enough to) the place where the good or service must be delivered to take up all of the current slack in the labor market, including small businesses in low-income areas. But even if you take that at face value - and a lot of smart people are dubious about it - this only works if the entire enterprise lifts, if not all boats, then the boats at the lower end of the spectrum. A small business in a low-income area, in order to succeed, has to be in an environment where the incomes of the residents are not so low that they're unwilling or unable to buy whatever's on offer.

For an individual low-wage worker, the point of a $12 an hour wage with $3 pitched in by taxpayers, or a $15 an hour minimum, is to raise that worker's income to the point where, hopefully, they can contribute to the overall economic health of their area without risking destitution if something goes a bit sideways on them. As production of goods and services becomes more efficient, the number of people needed to produce the necessities decreases. Therefore, more and more people rely on the ability to produce discretionary items, whether you call them simply non-essentials or label them as luxuries. Either way, people have a choice as to whether or not they're going to buy them. Poverty creates a problem in that it restricts the ability of people to make discretionary purchases - and the wealthy can't make up the shortfall, because, generally speaking, very wealthy people don't spend as much of their available wealth on these things. As Nick Hanauer pointed out, he might have 1,000 times more money than the average person, but he doesn't spend 1,000 times more on cars.

You can make the point that the wealthy assist everyone with their investments, but they often want returns on those investments that are much higher than, say, the rate of inflation. And to the degree that the rate of inflation measures the growth of the money supply when compared against the availability of goods and services, one can legitimately question whether or not those returns are simply extracting wealth from the society as a whole.

A high minimum wage is designed to be a form of wealth transfer from capital to labor. The fact that it tends to increase unemployment is indicative of the fact that wealthy people can choose to forgo investing when the payoff is not to their liking, because they face little consequence for doing so. In the late 1800s, the lack of a social safety net meant that if an employer decided that the return on investment wasn't to their liking, it was the workers who faced hardship. And that led them to compete with each other to raise employers' returns. Minimum wages are designed to do away with that.

Now, I understand that a lot of people will tell me that I'm incorrect about this, but it seems to me that at some point, there will have to be a better return on labor, as opposed to capital, for the system to thrive in the long term. And that means there have to be more situations where capital can't simply walk away from a deal, or insist that labor share their piece of the pie more evenly. I'm not one of those "eat the rich" types, because I have no problem with people being wealthy. But I do understand that an unchecked imbalance leads to problems like unemployment.

Professor Glaeser notes: "While droughts and pestilence often threatened disaster, joblessness was no part of then-rural America. If you didn’t work, you starved, and there was always another patch of land to hoe and seed. Unemployment arrived only when workers moved to cities." Note that for this to be true, it has to be because cities were more efficient than rural areas, as the assumption is that urban areas could support non-workers in a way that rural areas could not. The reason, however, why there was always another patch of land to hoe and seed was that you could always move somewhere that other White settlers had yet to set up shop, drive any natives from the land, and take it for your own. Had settlement been more confined, it's entirely possible that there wouldn't have been another convenient patch of land to take over. And in that situation, you have what we have now: rural unemployment due to the unavailability of "free" land for subsistence agriculture. Which, one may point out, was typically the primary driver of movement to cities. Unemployment didn't arrive when workers went to cities - many workers simply brought it with them, and it accumulated in cities when some number of them were unable to leave it behind, because going back to a rural area where there was no means of supporting oneself made little sense.

Unemployment is a result of there being more people than are needed to do the work of supplying the available demand for goods and services, whether that's due to low aggregate demand on the consumption side, misinvestment on the labor side, or some mixture of the two (which are the same thing, really). The way one combats unemployment is to redress that imbalance. And there are a lot of ways of doing it. Personally, I think that the returns on labor and capital have to be more equal than they are now, but that's just me. And while I don't know that we can effectively legislate that balancing (I think we can't), I'm pretty sure that pushing in the other direction won't be helpful, either.

Wednesday, March 28, 2018

Contradiction

I was having a discussion with a person who was telling me about how the marginalized shouldn't be viewed as having lesser value than any other person. My response was: "But that's why they're considered marginalized, isn't it?"

In the end, it was a matter of understanding and definition. I'm not one for the idea of intrinsic worth - for me, value is determined subjectively, by other people, and we value some people (or what they can do for us, anyway) more than we value others.

But there was a part of me that wondered if it wasn't words like "marginalized" and "powerless" that were the issue, because they informed people of what other people thought.

Monday, March 26, 2018

The Obvious Argument

When I was somewhere in my pre-teens, my father and I were talking, and he'd asked me about some or another thing that I'd said, and I replied, "Dad, it's obvious."

To which my father replied, "Obvious, Aaron, is something that's so crystal-clear that you are the only person who sees it." Of course, he was right about this, and as I've grown older (I'm now older than my father was when we had that conversation), I've come to appreciate the wisdom of that statement more and more.

And so I tend to be disappointed when people trot out "obvious" in social and/or political arguments. (This includes religion.) Mainly because it's usually deployed when the person speaking, for whatever reason, has no intention of supporting the argument, point or fact that they've just uttered. The real bummer for me, however, is when "obvious" is used to close off the conversation. In this context, one can imagine the connotation of "obvious" being self-evident not only to the degree that no proof is necessary, but to the degree that even requesting proof is a sign that one lacks the intellect, discernment and/or good intent to accept any proof that would be offered; therefore, none will be.

And sure, sometimes this is merely a smokescreen to cover the fact that someone has made a statement of faith, or one born of their worldview, and they've never actually looked into it deeply enough to know, or be able to articulate, supporting evidence. Accusing the other of being too stupid, gullible or false-hearted to accept proof can be a useful, if insulting, way of hiding ignorance. But it can also simply be a way of casting the other person as too "lesser" to be bothered with. And that's something that our discourse already has enough of.

Saturday, March 24, 2018

Intoxicity

Firefox has this functionality called Pocket, which (apparently randomly) suggests articles for you. When I fired up my long-neglected laptop this morning and launched the browser, it was ready with a link to an article called "Stop Giving Toxic People Your Time." Thinking (incorrectly) that it was something I'd read before, I clicked over to it. I realized my error the moment I saw the datestamp, but what the heck, I had time.

I didn't make it far. Early in the piece the author notes:

We say that others make us feel that way. But that’s false. You decide how you feel about the things that happen in your life.

Events can’t harm us. Our perception of an event harms us. That’s one of the most important ideas of Stoic philosophy.

In other words, you decide what meaning you give to the things that happen in your life. If your friend tells lies about you behind your back, and you get upset, that’s because you decided to get upset.
I could hear the alarm bells. Not because I essentially disagree with what the author was saying, but because of how they were saying it.

I'm a firm believer in taking responsibility for one's feelings and emotions. And that means understanding that other people do not genuinely control them. And in that sense, the idea that others do not make us feel certain ways is entirely correct. If a friend tells someone else a damaging falsehood about me, my reaction to that is not their doing. It's mine, and I should own it, if for no other reason than avoiding self-talk that posits myself as being at the mercy of others helps me to feel more in control and able to manage my life. But I also realize that my perception of, and reaction to, events is not a simple matter of conscious choice in the moment.

It's like seeing a dark shape looming up in front of you when you weren't expecting it. The perception of possible danger, and the fight-or-flight response to it, is a part of us, not the coat that we'd forgotten we'd hung on the coat rack last night. And because those things are part of us, we can control them. But just as we learned some or all of our response to that situation, and that learning took time, we have to learn different responses, and that learning also takes time. It may also never be 100% effective. A person can practice their powers of perception and observation all they want, but if they're color-blind, no amount of practice will change that. This is where an understanding of the concepts of a Fixed Mindset and a Growth Mindset is very useful. In the world of corporate buzzwords, they've become the new Goofus and Gallant, but in reality, both are useful - the trick is knowing which one applies in any given circumstance. And that's something that business types and self-help gurus alike can miss in their desire to boil things down to simple and understandable bites.

The article goes on to say:
The great Stoic philosopher, Epictetus, said this in his Manual For Living:

“Avoid fraternizing with people who don’t share your values. Prolonged association with those with false ideas can only tarnish your thinking.”

It’s something I truly live by. I’ve seen others destroy people’s lives too often to take this idea carelessly.

And I bet that you’ve had your share of, for lack of a better term, “toxic” people in your life.
And that's when I decided that I'd had enough.

Now, I'm not familiar with Epictetus - this was the first I'd heard of him - and so I don't really know what his philosophy is like, and I'm not going to pass judgement on it. I will take exception, however, to the way it's been deployed here. I suspect that the apparent equivalence of "people who don't share your values" and "those with false ideas" is ephemeral, in that a continued and deeper reading of Epictetus' Manual For Living would provide some way of understanding which ideas are false and which are not. But in the context of "Stop Giving Toxic People Your Time," a difference in values (and the author goes on to say "I think that less than 1% of the population has values.") becomes how one determines whether someone else is toxic, and a danger.

But for me, if you're going to take those twenty-one words as something to live by, you have to understand that this: "Prolonged association with you and your false ideas can only tarnish their thinking," is just as likely to be true. In other words, if a lack of shared values equals toxicity, then that toxicity flows both ways.

Not trusting my immediate emotional response to be an accurate one, I read a little farther, but didn't see a nod to what I considered an important point - that our understanding of the world is subjective.

I'll freely admit to not having values. After all, I've said it myself: people don't have values or principles, they have interests. The appearance of values is little more than understanding, articulating and pursuing our interests in ways that don't saddle others with the costs, if they haven't expressed a willingness to bear them. But it's worth bearing in mind that this isn't an objective statement. The reality of pursuing one's own interests without shifting the costs is not something that can be scientifically determined. It all depends on how we understand the world around us.

And that understanding of the world around us is not itself objective. I currently live in the suburbs of Seattle, but I grew up in the suburbs of Chicago. And "Seattle cold" and "Chicago cold" are not the same. So when the people I know who grew up here describe the weather as "cold," I tend to chuckle. But I also realize that this isn't a defect on their part; they don't have the same lived experience I do. Just as I have it on pretty good authority that "Chicago cold" and "Anchorage cold" are not the same. I have no idea how I would deal with "Anchorage cold." But seeing how people from Anchorage deal with "Seattle cold," I suspect that I'd be in for a kicking if I tried it unprepared.

My personal understanding of myself as a person without values has come from a desire to avoid making value judgements about how others live their lives. Which I get is a problem for some people. But that's their problem, and not mine. My interests tell me that I'm better off understanding things in terms of adaptive/maladaptive, legal/illegal or safe/risky than good/bad or value-driven/valueless. Because I like people, and defining them as "toxic" while avoiding the label for myself flies in the face of that.

Friday, March 23, 2018

Is One Always the Loneliest Number?

So I was listening to NPR's Hidden Brain podcast on how American ideas of masculinity make men lonely, and one of the lonely men in question made an interesting point.

Host Shankar Vedantam sets up the point with this observation:

Paul's childhood offered a model of a close-knit community. His life in his 40s was nothing like that model. Looking back, Paul says he ignored the warning signs that his social world was shrinking.
Guest Paul Kugelman then finishes the point thus:
I really don't think I dealt with it, candidly. I think I viewed it as sort of a circumstance rather than a problem. It was kind of like gravity. But there was also a part of me that realized I was alone.
It was thought-provoking, because I realized that I tend to look at life the same way. As a grown man, and a single one at that, I find that intimate friendships are off-limits. But, as Mr. Kugelman did, I view it as a circumstance, rather than a problem. It's simply something that one deals with and adapts to. Of course, the thing about adaptation is that people do so with varying degrees of willingness and success.

Being an ineligible bachelor, and a Black person whose world is overwhelmingly White, I adapted to being alone with a certain level of enthusiasm. While Mr. Kugelman was somewhat unsatisfied with the fact that at the end of the day, it was just him, I've come to find that liberating. Not that I consider myself an introvert - I tend to prefer a certain level of social interaction, and take steps to maintain the same. But I relish having time when it's just me, and I don't have to wear the roles that I'm expected to around other people.

The point of the podcast, though, is that I shouldn't view the world this way, because it's bad to be alone. Because being alone comes with: Worse health outcomes. Heart disease. Greater stress. Accelerated age-related problems. And, occasionally, hugging inanimate objects. And, in the service of not being completely flippant about this, I suppose that I have to include suicide on the list, as well. And so it's something to be avoided. Something best thought of as a problem, rather than a sort of circumstance.

But what if the issue isn't being alone, but how well one adapts to it? Of course, I realize that I have a dog in this fight. After all, I come home to an empty apartment day after day. And while in my younger days, people telling me that it was more difficult to live a healthy life as a singleton would have been greeted with "Watch me," I'm now a lot less oppositional about these things. I still think that I'll manage just fine, thank you very much, but part of that has to be the idea that I'm okay with being alone, and because I like being single, I've more willingly adapted to that situation.

And so I'm curious about that. I did a quick Google search to see if the topic was being studied, and I was somewhat surprised to see that most of the results dealt with how to recover from losing a spouse. There was one that dealt with the topic as a life choice, but it was more about being an introvert than about active adaptation to a life spent by oneself. I'm not a big internet researcher, so I'm unlikely to spend a lot of time trying to track this down, but it would be interesting to learn. Perhaps I'll get around to asking Mr. Vedantam.

Tuesday, March 20, 2018

Lies, Damn Lies and Politics

We can go on for DAYS about the absolutely deplorable practice of telling falsehoods during a high-stakes political campaign. But, all of this misses one very important point: Why are people bothering to lie?

Yes, yes, I know - There are any number of people who'll say that we already know that. After all, the position of President of the United States of America is widely considered to be the single most powerful political office, if not the single most powerful office, period, on the face of the Earth. With stakes that high, who WOULDN'T be tempted to cheat, if only a little bit?

But there is one thing that can be said for everyone who lies for reasons other than pervasive mental pathology: they want, hope and/or expect to be believed. They rely on an audience that wants to believe statements that are provably false, and that will then disseminate them, act upon them, and back them with complete sincerity, because those falsehoods bolster their identities.

But, of course, that's not the whole picture. Can more than a few people honestly say they've never chosen to believe someone, when they really knew better, just because they didn't want to believe they were lying? I don't know about you, but given the choice, I'd much rather that our elected officials were as stand-up as they act. The idea that someone's a liar, but that they somehow will never find it useful to lie to ME, creates too much cognitive dissonance for my brain to handle. And so while I tend to be consciously suspicious of people's honesty, there have been at least a few times when I've decided that someone was being honest with me, despite the evidence, because it suited my purposes.

And shall we take a few moments to talk about the practice of ignoring a lie here and there, when it serves "the greater good?" How many times have you heard someone with no personal stake in what's going on say something like, "Well, yes, my side lied, but the stakes were so high that the ends justified any means." Or, the old standby: "The truth just doesn't work with some people. Our side had to lie to get these people to take the action that we needed them to take." Or how about: "Look, these people are criminals! Why should anyone bother being honest with them?"  And here's one of my personal favorites: "But the other guys lie! If we restrict ourselves to the truth, it puts us at a disadvantage. So we have to lie to level the playing field."

Yeah. You can just tell that we're a society that values honesty.

So, in the end, we encourage the very practice that we claim to so dislike. Through a number of factors, from pride, to expedience, to hope beyond hope, we find ourselves having to take everything with a grain of salt, and sift truth from fiction in situations where one would think that the stakes are too high for some idiot to decide, "Now's a good time for a mind game." We've allowed people to make cynicism into a survival trait, and then we wonder why some people view suspicion as a mark of intelligence.

Saturday, March 17, 2018

Undemocratized

Democracy (or, as in the case of the United States, a Republic), as a form of government, is a means, and not an end. For this reason, I find articles worrying about people's failing commitment to democracy tiresome. While I understand the overall tendency to link commitment to "democratic ideals," as they are often called, to broader measures of social enlightenment, the fact of the matter is that the two are not particularly closely related.

I have worked with children and managed adults, and one of the things that I've learned is that the primary difference between adults and children is often height. Children tend to be big boosters of democracy - when they think that they're going to win the vote, and thus be able to legitimately demand that things go their way, regardless of the objections of others. When they understand themselves to be in the minority, however, their enthusiasm for majoritarian rule (and the overriding of their own wishes) quickly fades. And while adults may have a substantially more nuanced view of the pros and cons of participatory decision making, that basic tension remains. It's not terribly difficult to find examples, on both the American Left and Right, of issues (effectively of their understanding of virtue) that people feel should be above the risk of being put to a popular vote.

To borrow (and slightly alter) the well-worn saying from Lord Palmerston: "People have no ongoing values or principles, they only have ongoing interests." And, as a result, they will tend to back those things that they understand align with their ongoing interests. When Yascha Mounk portrays Americans' "age-old fantasy of a benevolent dictator" as depressing and the "long-standing desire for a strongman [leader]" as something to be healed, he is casting that commitment to one's interests over the choice of a specific form of government as pathological. But the understanding that a commitment to democracy is the healthy choice is never supported; it is only assumed.

Conservative/Republican voters and/or Trump supporters who are in favor of President Trump's (personal) authoritarian leanings are simply backing what they understand at this time to be most likely to advance and sustain their interests and better their fortunes. Democracy, Republicanism or whatever you wish to call it may be wonderful - but in and of itself, it neither provides food to eat, clothing to wear nor shelter from the elements. And when one understands at least a sizable minority of the rest of the population to be willfully perverse "takers" who will happily "vote themselves a share" of one's "hard-earned" food, clothing, housing and other income and/or wealth, it's easy to decide that participatory government needs, at least, fewer authorized participants. It's also worth pointing out that if President Trump is followed by a Democratic president who appears (or can be made to appear) to have authoritarian leanings of their own, especially if that person (like President Trump) is elected due to the vagaries of the Electoral College system, that support for "democracy" will surge among the same voters who have less use for it now. And this is not to call them out as hypocritical, any more so than anyone else is. It simply acknowledges the fact that people are in this because of the benefits that they understand it brings them and the disadvantages it imposes on their perceived rivals. As far as I'm concerned, when people claim they have ongoing values, rather than ongoing interests, they are incorrect. (Intentionally or not does not matter in my view.) Their perceived ongoing values may align closely with their genuine ongoing interests, but the two are still separate things.

The idea that Democracy (and the commitment thereto) and Enlightenment are linked, and that therefore any genuinely Enlightened people will express an unshakable preference for Democracy, is based on flimsy reasoning at best and pure assumption at worst. And the result of this is that hand-wringing over a waning of that commitment obscures the notion that simply because one believes something is correct does not free one of the responsibility of proving its right to exist, let alone enjoy permanent favor. Likewise, the adoration accorded to the founders of the American Republic ignores the fact that they didn't secede from the British Crown because they understood that, despite all of the advantages and benefits that monarchy brings, participatory government without hereditary leadership, even if it had disadvantages, was the correct moral choice. The American Revolution was a direct result of the Colonists' commitment to their own interests over a commitment to what was viewed internationally as a legitimate government. It's not at all difficult to imagine a Loyalist at the time indulging in the same sort of hand-wringing over waning commitment to monarchy - even if such sentiments would have been dangerous to express in public.

(This is another side effect of the fact that early American history is typically taught in the very early grades of school. The shabby treatment that dissenters to the Revolution received, some of which would count today as atrocities, is completely ignored in the service of maintaining a G-rated patriotism.)

While Democracy (however one winds up defining and/or implementing it) is unlikely to simply go away anytime soon, the fact remains that in order for it to thrive, it must be recognized that it, like any other system of government, has a responsibility to the people who live under it, and not the other way around. If democratic processes and institutions are not perceived to be the best protectors and advancers of public interests, then the public will be drawn to those systems that seem better. They may change their minds later, but this is simply the nature of the beast. And it's worth keeping top of mind for that reason.

Wednesday, March 14, 2018

One-Sided

I was reading a discussion of whether or not a certain idea was anti-Semitic. The general consensus was that it didn't start out that way, but that it had been taken over by anti-Semites and so was now irretrievably tainted. But it wasn't taken by anti-Semites. It was gradually ceded to them. As other people sought to avoid the taint, they reinforced the idea that only anti-Semites would use this idea. In this sense, it's like the swastika. Hindus, Buddhists and Jains who attempt to use the swastika in its religious sense are treated as if they were wanna-be Nazis, because Western society at large can't be bothered to understand that the symbol has different meanings. And that's in part because it isn't a loss for Western society at large - it's a loss primarily for Indians and others from the subcontinent.

But I've attempted to make this point before: if we continuously cede things to anti-Semites, sexists or other people we find deplorable, what will we do when they come for something that is important to us, but not to the society at large? Who will we rally to the cause that this should not be taken from us, if we are unwilling to help others keep what is theirs? And there is always this sense that no matter how niche a group may be, it's important enough that the broader society will stand with it to prevent its symbols or language from being co-opted. But I suspect that this isn't as true as people believe it is.

Sunday, March 11, 2018

Unnuanced

There is always an impulse, from outside of a given constituency, to expect that political "leaders" (a term that I am very dubious of) will educate their constituents on the "correct" way to look at any given political topic, or at least publicly disavow their "incorrect" views on it. But this is a risky proposition for most elected politicians.

Despite a general mindset that views politicians as holding office as long as they wish to (within term limits), or at the pleasure of shadowy "masters," and therefore as immune from public opinion, a politician who seeks to educate a constituency on something they believe to be untrue will be ousted in favor of someone whose rhetoric more closely matches people's perceived reality. Ronald Reagan and Donald Trump are both examples of this carrying people all the way to the White House, but even the campaign of Senator Bernie Sanders demonstrates this effect.

Policies that do not work for people in their everyday lives will be perceived as bad policies, if only because most people don't readily differentiate between "bad for me and my immediate interests" and "objectively and universally wrongheaded." And it's difficult to tell people that they are unskilled at making that distinction, because doing so challenges their sense of their own intelligence and discretion. And to the degree that challenging the virtues of one's voters is seen as a career-limiting move, it tends to be avoided.

Thursday, March 8, 2018

The Diversity Dividend

So this popped up in my Google+ stream some time back, and I read it, and I think that while it's very well and passionately argued, it misses one thing - something that I've pointed out from time to time. It's weird, because for me it's in plain sight. But, as my father would tell me: "The definition of 'obvious' is something so crystal-clear that you are the only person who sees it."

Think about it: A panel on diversity with no diversity on it. The outrage would be immediate, even from people of color. And yet maybe that is what should happen. And maybe the first question should be why do we need a black person on a panel to talk about inclusion when it’s the white person who needs to figure out how to include?
But I think that White people already know how to include, even if that sometimes means they define "inclusiveness" as having access to "ethnic" restaurants. What I suspect they don't know is why to include. For a long time in this country, White people managed perfectly well for themselves without giving a rip about Black people, Asians or Native Americans; and for a time, even Germans, Irish and Poles. Those last three groups (among others) weren't eventually integrated into what we now understand as "White" because it was good for them. It happened because it was good for the people who considered themselves the gatekeepers of "Whiteness."

The same is true today. We can say all we want that: "It’s for the white person to be less racist," and "It’s for the bigot to stop attacking trans people," but if those people don't understand what's in it for them, they're going to carry on with business as usual - because it works for them. And that's what they care about.

Diversity needs to stop being an obligation and become a product. Instead of being something that we saddle people with, we should make it into something that they want badly enough to pay something for. When someone sees the adoption of diversity as the shortest route between where they are and where they want to be, they'll trample you in their haste to embrace it. People display, and identify with, racism, anti-Semitism, sexism, homophobia, transphobia, and xenophobia not because these things are understood as affirmative Goods, but because they understand that in a world where those hierarchies are in place, they do better for themselves than in a world that's more diverse and inclusive. I'm a firm believer in teaching a man that he has everything he needs. But I also understand that it rarely works. If people are always going to see themselves as needy, perhaps the way to drive their support for the better world we want them to implement is to show them how they'll be less needy once they've done it.

As long as diversity is a pricey favor from the mainstream to the marginalized, it's going to be slow going. When it becomes a favor from the mainstream to the mainstream that pays clear dividends, wild horses won't be able to drag them away.

Wednesday, March 7, 2018

The Wrongly Deluded

When I was in college, I was taking a sociology class - human relationships, I believe it was - and one of the topics that came up was domestic abuse. There was an argument about culpability that basically broke down along gender lines - the men in the class tended to say that men learned to be abusive as they grew up, and the women tended towards an idea of simple moral failing. After about ten or fifteen minutes of hammering this out, the professor played a video for us that explained how men become abusers, and it generally backed the "learning" argument. At the end of the video the debate resumed, but with a slight difference - many of the women in class now conceded that men learned to be abusive, but said that they were to blame for having learned the behavior.

I was thinking about this after listening to an episode of Radio Atlantic, "John Wayne, Donald Trump, and the American Man." At one point (about 38 minutes in), Atlantic staff writer Megan Garber and editor-in-chief Jeffrey Goldberg are discussing the phenomenon of older men who view themselves as sexually desirable, and Ms. Garber makes the point that "women are physically attractive, men become attractive, through..." She never finishes the thought, but based on what had come just before, it's not difficult to imagine that she was going to say that men become attractive through possessing wealth and power. Despite this, Mr. Goldberg then proceeds to describe the idea that a fifty-plus-year-old man might be attractive to a woman half his age as "delusional." And Ms. Garber is quick to agree with him. But what made it stand out for me is that Ms. Garber notes that being delusional is not exculpatory, and Mr. Goldberg quickly jumps in with, "Well, you're responsible for your own delusions."

There's something uncharitable in that, if for no other reason than it smacks of the concept, put forward by Thomas Aquinas, that to mistake evil for good is to be guilty of morally culpable negligence. This seems to be born of the idea that people who do bad things must be punished, not simply because what they have done is bad, but because they themselves are bad. To me, it speaks of a need to see certain people as being perverse, and, as such, different from the rest of us, and thus deserving of their fates.

Sarah Silverman illustrated this starkly for me during an interview with the radio program 1A, when she spoke of Christian Picciolini and noted that he said, "Find someone who doesn't deserve your compassion, and give it to them." But even though she says the worst people in the world act the way they do because they want to feel that they are being heard and that they matter, she specifically exempts the wealthy and powerful, "the man, the oligarchs, the wealth addicts" and "the people misinforming others." But if those aren't the people who don't deserve her compassion, and so, by Mr. Picciolini's formula, should receive it, who are?

The idea that error is born of perversity, either willful or negligent, is a powerful one, perhaps because it justifies the urge to punish - to treat others in ways that people often understand themselves as being above being treated. But in doing so, there is a push to expand culpability, to prevent running afoul of understandings of fairness. An acquaintance asked why we seek to punish - why people couldn't settle for incapacitation or rehabilitation, neither of which requires seeing others as evil or perverse. It's a worthy question. But I don't think that many of us would like the answer.

Sunday, March 4, 2018

Nothingness

But Black Panther is first and foremost an African American love letter, and as such it is consumed with The Void, the psychic and cultural wound caused by the Trans-Atlantic slave trade, the loss of life, culture, language, and history that could never be restored. It is the attempt to penetrate The Void that brought us Alex Haley’s Roots, that draws thousands of African Americans across the ocean to visit West Africa every year, that left me crumpled on the rocks outside the Door of No Return at Gorée Island’s slave house as I stared out over a horizon that my ancestors might have traversed once and forever. Because all they have was lost to The Void, I can never know who they were, and neither can anyone else.
Adam Serwer, "The Tragedy of Erik Killmonger"
The Void. Really? Is that what they're calling it these days? For all that I, as a Black American myself, understand the knowledge that history has lifted one out of sight of the roots of one's family tree, to call that distance The Void seems pretentious. It's simply a fact of history, and not one unique to the descendants of American slaves.

There's a part of me that wonders if this isn't a cultural issue - an artifact of the concept that we have in the United States (and other nations) that who you are is defined, to some or another degree, by who your ancestors were. And while I can see the tragedy in being unable to lay claim to an ancestor's greatness, there is also a liberation in being free of their crimes. For some people, seeking out elements of the past that make them great and others lesser comes across as nearly an obsession. There's a part of me that's happy to be impervious to that call. And I wonder if there are other people in the world, perhaps in other cultures, who are content to be who they are, without needing to reference earlier generations.

But I understand how it can feel isolating. I realize that some part of the world's Black population (among other people) are distant relations, and I will never know which. If I meet them on the street, they won't be long-lost family - they will simply be strangers. I'm okay with that, and perhaps that's why The Void, a term which conjures up ideas of an inky, frightening nothingness, doesn't appeal to me. I am unafraid to be without a family history. But I understand that phenomena need names, and I don't have a better one at this point. So if it must be The Void, then so be it.
