Thursday, August 31, 2017

Gouged

In response to stories coming out of the Houston area alleging that certain businesses were engaging in price gouging, John Stossel's argument is a simple one: effectively that "greed is good." If store owners can charge whatever price they want, he says, they'll charge a price that incentivizes them to "risk life and limb restocking his store." If there's loads of profits to be made, he says, then entrepreneurs will rush to bring supplies to the area, and the added supply will lower prices.

And, to a certain degree, it makes sense.

It is, however, to some degree based on that old boogeyman, the unspoken assumption. The assumption that the store owner won't just take the extra profits, and then see to his own safety when supplies run out, ignoring the incentive. The assumption that entrepreneurs won't collude to keep prices high or withhold the items until the next disaster if they can't get the prices they want. The assumption that people who truly desperately need something will always be able to get it. The assumption that people who show up selling water will be selling clean, safe, water. In short, the unspoken assumption that much of "capitalism knows best" works under - that the sellers want or need to sell things as much as buyers want or need to buy them. Because where capitalism tends to fall down is where there is an inequality to be exploited.

In a system where people were perfectly informed, and perfectly rational, genuine price gouging would be impossible. Buyers could evaluate information on where supplies were located, judge how long they would take to arrive and then determine how much they needed to have available to them to survive that long. But in disasters, people are often not particularly well informed, and one can say that if they were eminently rational, they wouldn't have been there in the first place, especially when dealing with something like a hurricane, which doesn't just appear out of nowhere one afternoon, and areas that are likely to be flooded, which don't just sink overnight. And that's not even taking into account the emotionality of the event itself.

And there's a degree to which this is what debates over price gouging are actually about - people's ability to make dubious choices and not be punished for them when they go bad. Many cities are built in areas that have, generally speaking, something seriously risky about them. Because the very things that make them attractive often come with risks.

The primary problem with people trusting that capitalism is the best way to go about things is that it requires that they trust in each other. And, generally speaking, they don't trust one another. Price gouging works people up because it plays on their suspicions that (wealthier) people will turn on them the moment they are vulnerable - and they often view their vulnerability as something that is out of their control. If a house in a low-lying floodplain is the most affordable option, and someone buys it, they don't want to then feel that someone is waiting to pounce on them and squeeze them for what little they have when the gamble doesn't pay off, and they find themselves in need.

One of the underlying assumptions of "capitalism knows best" is that trust is valuable, and not to be squandered lightly. But again, that implies a more level playing field than most people actually perceive themselves to be on. If the local corner grocer goes from charging $4 a flat for water to $40 a flat unnecessarily, it's tempting to think that they're dooming their business. But if they're the only grocery within a reasonable distance, their captive market will continue to shop there. Opening a new grocery store is a non-trivial exercise, especially when the business model is effectively hoping that the incumbent has burned enough bridges to supply a reasonable customer base.

It's easy to berate people as "stupid" for not trusting in the invisible hand of the market to distribute goods and services "fairly." But if capitalism runs on trust, then people are going to be suspicious of it when their trust in other people is low. And no amount of calling people out on that is going to make them any more trusting.

Tuesday, August 29, 2017

Killing It

How's this for a headline: "'Psychologically scarred' millennials are killing countless industries from napkins to Applebee's — here are the businesses they like the least." Here's a better idea. Maybe certain industries, products and ways of doing business are simply dying because the times, they are a-changin'? The Business Insider article lists 19 things that Millennials, scars and all, are "killing" this week. I noticed that I don't do much with many of them myself.

Casual dining chains like Buffalo Wild Wings and Applebee's - An observation: People in my circles were referring to a certain restaurant chain as "Crapplebee's" back when my circles were still exclusively Gen-Xers. You didn't go to "casual dining chains" for the food. You went there for a gathering or whatnot at a place that served some semblance of food. I haven't set foot in a casual dining chain restaurant in over a decade. Mainly because my friends don't think of them as useful gathering places anymore, and I don't work near enough to one for it to be a place to hang out with coworkers after work.

Beer - I'm not a beer drinker because, let's face it, beer is an acquired taste, and I've never been interested in doing the work to acquire it. And again, many of the people I hang out with, most of whom are Gen-Xers like myself, aren't beer drinkers, either. I don't know why Millennials drink less beer than previous generations, but maybe it's because they didn't spend their late high-school and college years (the time when most people I knew were acquiring a taste for beer) thinking that beer was ambrosia or something. It might also have to do with there being less cachet to simply being drunk off one's butt, which seemed to be a lot of people's goal in life back when I was in school. And there's the one factor that never changes - the disdain that many people have for mass-market beers, and for the "craft" beers that the mass-market brewers bought up when they became trendy.

Napkins - The only time that I ever purchased napkins was when I was having a party, because when you need 200 of them, folding paper towels down to napkin size is just too time consuming. Millennials didn't invent the idea that paper towels were just all-around more useful than napkins (especially when something spills).

"Breastaurant" chains like Hooters - Now, I never really saw the point behind Hooters and other breastaurant (I can't believe that's actually a word) chains. Going to a place with "casual dining" level bad food for the privilege of having waitresses in revealing outfits pretend to flirt with you just never appealed to me. Apparently, it doesn't appeal to Millennials, either. Which makes sense, considering that they are, for the most part, the generation that Social Justice Warriors are drawn from. It's kind of strange to think that people who see catcalling as a serious affront would patronize a restaurant chain whose whole shtick is making the wait staff into sex objects. But it seems strange to think of Millennials as "killing" this segment. After all, I'm old enough to remember when these joints first hit the scene. The whole thing counts as a fad, if you ask me. And fads aren't exactly known for their staying power.

Cereal - Most breakfast cereals are aimed squarely at children, and are designed to be sweet. I have some samples of "Reese's Puffs" cereal. While the box proudly proclaims that the first ingredient is whole grains, when you look at the ingredient list, you quickly realize that this is because they've listed every sweetener individually. But when those sweeteners are taken together, you wind up with something that's literally 1/3 sugar by weight. You may as well pour milk on Jolly Ranchers. And while this is an extreme example, most breakfast cereals, even the ones aimed at adults who want a healthier diet (like Special K and Cheerios), add sugar. Anyone who cares about their health and/or their teeth is advised to steer clear of the stuff, no matter how old they are.

Golf - I have played golf exactly once. It was as boring as watching paint dry. But it also seems like one of those things that becomes simply more or less popular over time. And right now, it's simply not that popular - likely because I can't imagine spending a few hours watching a golf stream on the web.

Motorcycles - I like motorcycles, but I've never owned one. It seems to me that one has to be a bit of a motorhead to really be into motorcycles, given that they're not quite as easy to handle as cars are, and if you live in any sort of built-up area, you have a decent number of transportation options. Buying a motorcycle just seems like it wouldn't be a priority under the circumstances. Especially if you live in an area where you can't ride year round. I think there are exactly five people in my extended social circles who ride motorcycles regularly - and one of them had a father who worked for Harley-Davidson.

Homeownership - If you live in most urban areas, houses are simply out of reach of a twenty-something's salary. And that really applies if you live someplace where the housing values have skyrocketed. Housing values, especially in desirable areas, have outpaced the rate of inflation like crazy. And with lending standards being rather tighter than they were before the Great Recession, it can easily take more than a year's salary for a decent down payment. And with the Global Pool of Money constantly looking for anything that will generate a steady return, homes are attractive investments for wealthy institutions, which keeps the prices up. There is a difference between killing a market, and being priced out of it.

Yogurt — especially light yogurt - This strikes me as another fad. Again, I remember when yogurt was the hot new thing. It doesn't strike me as at all strange that it didn't stay that way.

Bars of soap - I was never really one to worry about germs on my soap. After all, it doesn't make sense to use soap to kill the germs on your body if the germs are perfectly happy living on the soap itself. But things change. Liquid soaps (or as the trendy like to call them, body washes) have been around for a while, and considering how paranoid the last generation or so of parents has been about things, it makes sense that people who were worried that soap could harbor germs would move to a form factor that didn't put all of the soap in contact with your body.

Diamonds - You'd have to be living under a pretty big rock by this point to not know that diamond prices are artificially high, and that "big jewelry" is behind most of the "traditional" amounts that you're supposed to pay for these things. I'm not morally opposed to spending $10,000 on someone that I simply loved to death (not that I've ever been in love), but if I'm going to shell out for an expensive gift, it makes sense to buy something that's not subject to one of the most egregious examples of artificial scarcity in the history of the world. Besides, women don't need expensive tchotchkes to pawn if the wedding is called off.

Fabric Softener - This is something that I can take or leave. I like tossing something in the dryer that will tame the static electricity, but then again, I live in an apartment with some pretty serious carpeting. I could power a cell phone with my bare hands some days. But again, this isn't a necessity, it's a nice-to-have. And even beyond being able to save money by passing on it, you have to have a place to keep it. And as housing prices go up, homes become smaller.

Banks (Physical branches, really) - Most banks don't really need much in the way of a brick-and-mortar presence anymore. And, given the work that banks have put into making physical branches obsolete, it doesn't make sense to lay this at the feet of Millennials. The main reason I still go to my bank from time to time is that it's on the way to and from work, and I have an investment that I inherited that still pays out in physical checks every month. But even with that, if I never wanted to actually go to the bank, I could set it up so that I don't have to.

Department stores like Macy's and Sears - Millennials or none, Amazon and online retailing in general are sounding the death knell for the old-school department store. The example I usually use is J. C. Penney - the last time I bought clothing from there, it was cheap, but it had a lifespan measured in weeks. And the last time I went clothes shopping at a department store, I couldn't find a single one of the items I was looking for in my size. The salesperson helpfully suggested that I could order it online, and that takes us back to Amazon. Because it makes more sense for me to buy things from an online retailer that I use already than it does for me to open up yet another online account that I'm unlikely to use again for years at a time, in spite of all the spam that they'll send me.

Designer handbags - Has it occurred to anyone that maybe Millennial women have finally clued in to the fact that you can score a well-made handbag without spending an arm and a leg? It's just like anything else. I wouldn't spend $300 or more on a designer laptop bag when any relatively decent manufacturer can make one for a lot less.

Gyms - So Business Insider says that Millennials are "ditching gyms in favor of boutique, class-centric centers." In other words, rather than pay to figure it out themselves in a giant room full of other random people, they're paying to have someone instruct them on what to do. This seems like simply another shift in the marketplace, given the number of people I know who have (or are) personal trainers.

Home-improvement stores like Home Depot and Lowe's - Huh. People who can't afford to buy their own homes don't go to home-improvement stores. Never would have guessed. When I go to home-improvement stores, it's for things like lamps, mainly because the selection is decent, and it's closer than the nearest department store.

Football (Football on television, really) - One of my big gripes about cable television, before I dropped it, was paying big bucks for sports channels that I didn't watch, because they only rarely played the sports I was interested in - like English Premier League Football/Soccer. But here again, the issue isn't Millennials. It's football. A lot of American sports have adapted themselves specifically to television. And so when television's dominance began to slip, the sports that had tied themselves to it also started to slip.

Oil - What they meant by this is that younger people don't see the industry as viable in the long term, and therefore aren't rushing to work in it. Which makes sense, really. Everyone has known for some time that eventually, we'd run out of oil, and as the consequences of fossil-carbon-based fuels become more noticeable, people are shying away from it. Sure, the oil industry is likely to die slowly, rather than quickly, and in this sense, it makes little sense to say that Millennials are "killing" it, because it will likely survive them. But that ship is slowly foundering, and so it seems reasonable to not want to get on it now.

In the end, what strikes me about the constant "Millennials are killing things" yammering is that it frames the failure of industries to adapt to changes in society and social tastes as the doing of a generational cohort, rather than of the industries themselves - which is the way we always thought of it before the World Wide Web came along, and everyone was competing for the catchiest (if not necessarily the most honest) headline. I mean, no one ever accused us of "killing" the typewriter, even though we pretty much abandoned them in droves for computers and printers. And our shift to corresponding by e-mail didn't earn us credit for "killing" the postal service, which we all understand is a shadow of its former self.

Maybe that's something else we can hope that Millennials will kill - the "[Random age cohort] is 'killing' things" headline.

Monday, August 28, 2017

The End of Summer

A stiff breeze in the Puget Sound area is a rare thing. So when one appears, the kite surfers, sail boarders and just plain sailors all head out to the water. And now that Summer is winding down, the sunny days are even more treasured.

Friday, August 25, 2017

Big Bad Wolf

"We Need To Start Befriending Neo Nazis." It's a simple enough idea. Talking with the Alt-Right is the best way to deal with the Alt-Right. One of the things that drives people to fringe and/or radical movements, of any political ideology, is the idea that they are marginalized and forgotten. In effect, they are cut off from connection with the greater society, and from the many and varied benefits that such connection brings.

And it's worth pointing out that this is known and understood about more than out-there political groups. A lack of social connection is one of the major risk factors for drug abuse; which makes it all the more heartbreaking when the people closest to the addict use their connections, their relationships, as hostages to dictate a change in the addict's behavior.

But in any event, I came across a discussion of the Forward article, and I noticed very quickly the current of fear that ran through the comments.

I don't know where I first found this picture, but it resonated with me from the outset.
While it's easy, and common, to make the people we don't like out to be monsters, that caricature of them makes them seem more dangerous and more fearsome than they would if we viewed them as people.

While what happened in Charlottesville was unmistakably a tragedy, it was one person acting violently. But from that, there is this sense, and you see it in a number of places, that violence is somehow endemic to even being in the vicinity of an Alt-Right protest, or a counter-protest.
Coliseum College Prep Academy teacher Chela Delgado: In Charlottesville, a number of folks were trying to protest nonviolently, and then violence occurred. So I think that if you're choosing to not participate, that also makes a lot of sense.
In San Francisco, Local Teens Consider Protesting Right-Wing Marches
Back in 2003, an 86-year-old man drove through a crowd of people in Santa Monica, likely due to age and confusion. 10 people were killed and more than 60 injured. And while the casualty figures for these two incidents are lopsided, the fact of the matter is that traffic accidents kill more people than violence does in the United States.

Yet, being a counter-protester at an Alt-Right rally, or simply sitting down to a conversation with a "neo-Nazi" who seems receptive to conversation, can be seen as unacceptably dangerous.

In the end, the Alt-Right isn't a group of homicidal maniacs. If they were, they could be doing a lot of damage already. After all, if your primary targets are non-white people, it's not like they're all that difficult to find. Sure, they're fairly thin on the ground in some parts of the country, but a decent road trip or train ride can put you into an area where they're common enough to go after.

But the death of Heather Heyer has sown fear, and that fear makes the whole issue seem more dangerous than it is. This is not to say that there isn't any potential for violence. After all, there are people on both sides spoiling for a fight; or looking to goad the other side into starting one. But the Alt-Right situation isn't as dangerous as people can make it out to be. After all, these are people who realize that they aren't a majority of the population, or even a sizable minority. While they may want to return to a time when they could use violence to get their way, and not have to worry about the consequences thereof, that time is not now. If they turned to violence at the drop of a hat, they'd quickly find themselves hunted across the country.

Fear is an adaptive response. It helps keep one alive in uncertain and dangerous situations. But it can also run away with people. There is something to be said for being a good judge of character, and understanding when a situation may go sideways. But there's also something to be said for understanding the genuine risks that one faces.

Tuesday, August 22, 2017

I Don't Hear You

So I was reading a post about dealing with issues of race, and the poster ended it thusly:

"White people need to talk to other white people about this shit if it's ever going to change."

And I wonder if that will actually change anything. Because perhaps part of the problem is that we live in a society where you can actually be that selective about who you listen to.

When I was growing up, I was taught that not listening to White people was dangerous. Not in the sense that I always had to be attentive to what they might ask of me in a "do as they say" manner, but in the sense that I couldn't simply ignore them and go about my business, because I would miss the subtle cues that indicated that they might be moving against me somehow. And it led me, for a long time, to be more comfortable with overt expressions of racism. "I'd rather be shot in the chest," I would tell people, "than stabbed in the back."

But as I grew older, I learned that I could ignore them. That in 99.99% of cases, I didn't have to care what nearly anyone, whether White, Black, Brown or Purple, thought of me. And so, I didn't. I decided that what other people thought of me was none of my business, unless they were offering something useful. And so I come out of job interviews as hopped up on adrenaline as if I'd just faced down an angry bear, but can't be bothered to know who now lives next door to me.

It's kind of liberating, really. And I think that's one of the nice things about being in the majority in one way or another. You don't have to ever listen to people you don't care to, because their opinions of you just don't matter. Because they don't have the social capital to back them up, and if they resort to violence, it does as much harm as good.

And so White people have to talk to White people, because Black people don't have the social capital to require that a White person listen to them. Men have to talk to men, because women can safely be ignored. And straight people have to talk to straight people, because queer people can be simply blown off. Und so weiter, und so weiter, as we used to put it in German class. (Somehow, it sounds more... resigned than saying it in English.)

Humanity doesn't scale well, a wise friend of mine told me, and I think that this is why. If we can band together with enough like-minded people, what other people think simply becomes immaterial. And even existential threats have a hard time changing that, unless they last for generations.

"White people need to talk to other white people about this shit if it's ever going to change," wraps the hopes and fears for the future into one package, tied with a bow. It recognizes that change is possible, but concedes that it isn't necessary. And I think that's the fear that the Others have learned. That in some sense, they're unimportant. When Richard Spencer talks about the idea of White people separating themselves from non-Whites, there is a worry that it will work. A worry that in walling themselves off from everyone they don't want to deal with, that they're not risking losing something important.

The fight over Confederate statuary strikes me as a manifestation of this conflict, because it boils things down to a zero-sum game. Are society and the powers that be going to listen to us or to them? It's not just about being listened to - it's also about other people not being listened to, because there's only so much attention to go around. And this puts those not listened to into crisis, because in our society and economy, being ignored, being forgotten, being invisible can be a fate worse than death - when it isn't simply a miserable way to die.

But we all need the ability to ignore. We all need the ability to determine who we are going to attend to, and who we are going to put off, either until later or forever. Otherwise, we'd never have any rest or peace. But we don't make that choice randomly. And therein lies the rub.

Monday, August 21, 2017

Miscalculated

I was listening to the news on my way home from work when a story came on about a computer algorithm designed to reduce the reliance on cash bail in determining who stays in jail while awaiting trial and who goes free.

This past July, one Edward French was murdered in San Francisco. One of the suspects is a young man who was on felony probation for breaking into cars, gun possession and parole violations. The judge who released him used the suspect's "public safety assessment" in her determination to release him into an "assertive case management" pretrial program. The reporter's assessment is harsh: "In the case of Ed French, the algorithm failed."

But implicit in that assessment is an idea that is never actually asserted within the story itself. That the role of the public safety assessment is to determine, as a yes or no question, whether a suspect will return to court for their trial without attempting to flee or committing another crime.

The [Arnold] foundation [, which created the tool] says the algorithm generates gender-and-race-neutral "evidence-based data" on which defendants should be released before trial, offering judges "reliable, predictive information about the risk that a defendant released before trial will engage in violence, commit a new crime, or fail to return to court."

In the case of French, a miscalculation ended in murder.
Did A Bail Reform Algorithm Contribute To This San Francisco Man's Murder?
And here again is that implication that the job of the algorithm is to rate people as "safe" or "unsafe," and that the young suspect in Mr. French's murder was mistakenly, and disastrously, rated "safe." But that's not the way a risk assessment algorithm works, especially not one that outputs a score. The Arnold Foundation's tool takes nine factors and uses them to effectively express, as a score, a probability that someone will flee the jurisdiction or commit another crime. The only way in which this could be a determining score, effectively offering a "safe" or "unsafe" answer, is if those factors basically created a hard line, on one side of which the flight and recidivism rate was effectively zero, and on the other side, it was effectively 100%. But the world isn't that cut and dried. The lowest-risk category could be a .5% chance of a suspect fleeing or committing a crime while out of jail.
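To make that concrete, here's a minimal sketch of how a score-based assessment works. The factors, weights, score bands, and probabilities below are all invented for illustration - this is not the Arnold Foundation's actual model - but the structure makes the point: even the lowest band carries a small, nonzero probability, so releasing thousands of "low risk" defendants guarantees that a handful will fail.

```python
import random

# A toy score-based risk assessment. All weights, bands, and
# probabilities are invented for illustration; this is NOT the
# Arnold Foundation's actual Public Safety Assessment.

FACTOR_WEIGHTS = [2, 1, 3, 1, 2, 1, 2, 1, 1]  # nine hypothetical factors

# Score bands map to estimated probabilities of flight or re-offense.
# Note that no band is ever 0% or 100%.
RISK_BANDS = [
    (0, 2, 0.005),   # "low risk" still means roughly 1-in-200
    (3, 7, 0.05),
    (8, 14, 0.20),
]

def risk_probability(factors):
    """Map nine binary factors (1 = present) to a failure probability."""
    score = sum(w * f for w, f in zip(FACTOR_WEIGHTS, factors))
    for low, high, prob in RISK_BANDS:
        if low <= score <= high:
            return prob
    return RISK_BANDS[-1][2]

# Simulate releasing 10,000 defendants with no risk factors at all.
random.seed(1)
p = risk_probability([0] * 9)
failures = sum(random.random() < p for _ in range(10_000))
print(f"'Low risk' probability: {p:.1%}; failures among 10,000: {failures}")
```

A judge reading output like this is being handed odds, not a verdict; fifty-odd failures out of ten thousand releases is the expected behavior of a working system, not a "miscalculation."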

Which, of course, is cold comfort to the partners, friends and relatives of the slain, who see, with perfect hindsight, every little reason why the one person in two hundred, the person who killed their loved one, should have remained in jail.

And that's the problem with the bail bond system: the expectation that it's a tool to create 100% safety, and that anything less is a failure. And this is an expectation that news outlets feed into. Even National Public Radio, which ran a series on why the cash bail system is simply a trap for the poor and a low-risk cash cow for the bail bonds industry, can't see its way clear to an angle other than the human interest of a life snuffed out, the suffering and anger of those left behind - and the fear that an algorithm intended to look past considerations such as race and class might convince a judge to let someone out of jail who goes on to kill someone who seems undeserving of that fate, because of a "miscalculation."

Thursday, August 17, 2017

Then and Now

I am not a student of American history. I took the classes that I needed to in school, and here and there I've read some nonfiction about this or that period, but I don't seriously study the history of the United States. There are only so many hours in the day, and they're often occupied by other things - like writing this blog, for instance. And so my understanding of things tends to be shallow, the sorts of things that either everyone knows, or are easily picked up by paying attention to everyday sources.

A few years back, I was in an online debate with a man who claimed that the Presidency of Barack Obama marked the end of the American commitment to the values of Life, Liberty and the Pursuit of Happiness. Falling back on my limited grasp of American history, I offered up the examples of the attempted extermination of the American Indian, the internment of the Japanese and the Chinese Exclusion Act as historical examples of American failures to live up to those ideals. And I stressed that these were things that happened in the past, and that in moving past them, the United States showed that as time went on, it grew more committed to those ideals, and that nothing in the policies of President Obama could be seen as serious moves back to that past.

And this is my standard pattern with such historical events. I tend to leave out the trans-Atlantic slave trade, Jim Crow and the like. Mainly because I think that pretty much everyone already knows about them. But also because I don't want to focus on the negative aspects of "my own" history. I see no reason to constantly harp on what happened to "us," as it were. There was enough of that when I was growing up, and a side effect of it was a lack of empathy towards others and a certain need to always win at Misery Poker. And here's the thing about Misery Poker - you have to be miserable to win. And somewhere along the line it occurred to me that I didn't want to always be miserable, just to collect whatever dubious prizes Misery Poker offered.

The other day, an acquaintance of mine was holding forth about how terrible it was that someone dared say that the treatment of the Irish in days past (the 1840s being perhaps a good example) was worse than the treatment of Black people today. My acquaintance, and a number of their online friends, didn't even bother with laying out the hands of Misery Poker, but rather decried how anyone could say that anyone but the Black community would win. But being an indifferent student of American history, it seems to me that perhaps, given the choice to be Black today or Irish in 1845, I'd choose to be Black today. (I am, after all, rather enamored of automobiles, air conditioning and the Internet.)

Because here's the thing: The fact that it may have been worse to be nominally White at some distant point in the past does not mean that it is perfect to be Black today. And the commenter hadn't made the point that being Irish in the past was worse than to be Black at that same time - merely that the Black population of now is better off than the Irish population of then. And in so doing was pointing out the very thing that I had done some years back - noting the growing commitment of the United States to its ideals of Life, Liberty and the Pursuit of Happiness.

Now, to be sure, I'm not a naïf. I understand, given the context of current events that the commenter likely meant to minimize the situation of today's Black Americans in the service of painting us as undeserving of the accommodations granted to us. But be that as it may, it still may be true that the modern United States treats its marginalized better than the pre-Civil War United States treated those much closer to the mainstream, but still unfortunate enough to not be within it. A truth put to an unjust end is still true, and should be judged on its merits, not those of the speaker.

Wednesday, August 16, 2017

Rationality

Among the definitions of "rationalize" put forth by the Merriam-Webster online dictionary are: "to attribute (one's actions) to rational and creditable motives without analysis of true and especially unconscious motives" and "to provide plausible but untrue reasons for conduct." And this fits in with the way Americans normally speak about rationalizing something. In common usage to "rationalize" is seen as a variety of deceptive self-justification; a way of taking an action that one understands to be wrong and filing off the rough edges until it fits (even if this takes some effort) into the slot labelled "Right."

Overall, the general understanding is that rationalization is a bad thing, leading, as it does, to people convincing themselves that wrong is right; accordingly cautions against it are not difficult to find.

[Art] Caplan draws a wise lesson from the Nazi doctors: Beware the human weakness for moral rationalization. But part of that weakness is the illusion in each of us that we have escaped it.
William Saletan ("Natural-Born Killers" Slate Magazine, 4 April, 2005)
But part of the knock on rationalization is the assumption that people are capable of creating workable moral and ethical frameworks that are 100% objective constructs and need have no recourse to a person's understanding of what it should look like. Given that people tend to resist purely mathematical expressions of morality and ethics, this strikes me as a standard that few, if any, will reach.
Now, normative cultural relativism might sound pretty good to you; it does at first to a lot of people. Because it seems like it’s all about inclusiveness and tolerance. Who am I to tell other cultures how they should live, right? But this view actually has some pretty big flaws.

If every culture is the sole arbiter of what’s right for it, that means no culture can actually be wrong. It means Nazi culture actually was right, for the people living in that culture. A dissenting German voice in, say, 1940, would have just been wrong, if it had claimed that Jewish people deserved to be treated the same as other Germans.
Hank Green "Metaethics: Crash Course Philosophy #32"
Mr. Green's statement that normative cultural relativism "has some pretty big flaws" because it allows Nazi culture to be right, at least as far as the Nazis themselves were concerned at the time, could also be understood as a form of rationalization. After all, Mr. Green was teaching a course on philosophy, and not his own personal understanding of right and wrong. Yet he allowed himself to challenge the concept of normative cultural relativism, and by extension all of the Moral Antirealist stances, not based on some inherent contradiction that he noted in them, but because they didn't automatically condemn the Nazis as wrong. In doing so, he allowed himself to work backwards from the desired endpoint of Nazi wrongness to a moral viewpoint, selecting one for himself that agreed with the preconceived notion that some acts are inarguably wrong.

And I'm okay with that, leaving aside the fact that it's poor form as teaching. Because for many people, this is how moral and ethical viewpoints are reached. People understand what they want to be inside and outside of the bounds of acceptable behavior, and they select a moral viewpoint that comports with that. And it's possibly this habit that Mr. Green had in mind (whether he was aware of it or not is a different story) when he confidently told his audience "most people you know – including yourself – are committed to some form of moral realism."

The idea that some items are morally wrong simply as a matter of moral fact can be a comforting and useful one. But there may be some value in the idea of an individual actually deciding to elevate themselves to the post of moral arbiter, rather than outsourcing that to nature, a society or an institution.

In this cartoon, the left-hand speaker, acting as an author avatar, decides that the simple act of holding a flag from Nazi Germany should be considered a form of incitement, and thus ineligible for free speech protections, based on little other than their own moral intuitions. The person holding the flag has no dialog within the graphic: they never speak and we, as the audience, are never let into their thoughts. Likewise, we never see the flag acting as an incitement, there is no one to respond to it. The two interlocutors who observe it, however, are clearly not at all inspired to recreate the Third Reich - outside of wanting to punch the flag carrier in the face and strip them of free-speech rights, they are not moved to action. In the end, the character with the flag may as well simply be a statue or a projection; they're static and show no impact on the world around them. We are meant to infer the evidence against them from real-world events, which are themselves not referenced.

Because of this, the incitement argument strikes me as a rationalization, and that rationalization is born of the fact that effectively saying "I find would-be Nazis so awful that I think the rules shouldn't apply to them" is somehow off-limits, despite the fact that loudly proclaiming that one would simply punch them in the face is often considered acceptable. The number of people in my social media circles who have done so is substantial. Rather than putting time and energy into mental gymnastics geared towards finding would-be Nazis to always be guilty of incitement, so that acts against them may be reasonably judged self-defense, it seems more constructive to simply declare them persona non grata for being deplorable, and when challenged, simply trust in the accuracy of the moral sentiment in question.

Because that appears to be what's actually going on. And it's what always goes on, in part because American society always manages to convince itself that it should be above such things. But that's not how people work. I suspect it's better to be okay with that than it is to hide it under a pile of unnecessary rationalizations. To the degree that people trust their moral intuitions, let them trust them. We'll have a more honest society for it.

Sunday, August 13, 2017

Picking Sides

Evan McMullin, a former CIA officer who ran as an independent against Trump in 2016, had among the strongest condemnations of Trump’s statement of politicians on Twitter, saying Trump’s vagueness about who is to blame signals “positively to the white supremacists whose support he enjoys.”

Trump has been heavily criticized in the past for not doing more to condemn the hate groups that support him, including [former Grand Wizard David] Duke and the Ku Klux Klan, which endorsed him during the campaign in 2016. And his presidential campaign was bolstered by the resurgence of the so-called alt-right and characters like white-nationalist Spencer.

Indeed, Duke later responded to Trump’s statement on Twitter, telling him, “I would recommend you take a good look in the mirror & remember it was White Americans who put you in the presidency, not radical leftists.”
The Hidden Meaning of Trump’s Charlottesville Remarks
The older I become, and the more I come to understand how politics works, the more sympathy I have for politicians. Donald Trump has already admitted to finding the job itself more difficult than he expected it would be. I suspect that he has also learned (even if he has been less forthcoming about it) that the Faustian bargains that one makes in campaigning come with more difficulties than he’d initially come to believe.

As we move farther from the actual election, the fact that President Trump won the Electoral College but lost the overall popular vote becomes less salient, except perhaps to President Trump himself. But as a continuing political matter, it’s still front and center, and not simply because the needling of his Ego prompts the President to look for ways to re-litigate the election via repeated accusations of comically-mistargeted “voter fraud.” White supremacists don’t have to be a particularly large segment of the population to have been the coalition partners who put the President over-the-top in one or more of the states he carried. And while it’s entirely possible that had they all stayed home, Candidate Trump would have still carried the day, David Duke seems to think otherwise; and the President just might agree with him. In which case, he may not enjoy their support at all, but he needs it. Coming out and laying the blame for the events of Charlottesville, Virginia squarely at the feet of White hate groups, or even simply publicly labelling them as hate groups, may mollify some of the President’s critics for a time, but runs the risk of alienating the people who form the spine of much of whatever leverage that the President has left at this point.

Given that President Trump has embarked upon a policy of being the President mainly for people who will directly support him, his fortunes are, at least in the short term, tied to the strength of that support. Accordingly, there’s little benefit for him to undermine that support by agreeing with not only his critics, but his supporters’ critics.

Friday, August 11, 2017

Tolerably Intolerant

With the induction of "intolerant" into the American arsenal of low-grade pejoratives, the term is bandied about with a fair amount of regularity, having settled into an ironic definition of "a person or group of people, who due to their unjustified closed-mindedness, can safely/should be ignored." And when another random argument/shouting match about who qualifies as genuinely intolerant pops up, I'm reminded of this David Horsey cartoon from his days in Seattle.

Political/social arguments about tolerance tend to seem like challenges to a game of Russian Roulette, with each side claiming that the other is intolerant for not cheerfully accepting ideas that can be generally considered as directly aimed at undermining their worldviews, legitimacy and leadership. It's worth noting that this isn't always intentional bad faith. Political rhetoric can be remarkably layered and nuanced, and to those who aren't interested in peeling all of the layers of the onion, the fact that a topic is apparently off-limits may seem arbitrary, rather than serving a purpose. Bad faith abounds in politics and society, however, and so is the basis of many a sneering critique of the opposition's tolerance.

Generally speaking, in American politics, the Left holds to tolerance, which may perhaps be described as a mix of social, political and religious laissez-faire ("If it harms none, do as you will."), as an affirmative virtue. It comes across as less valued on the American Right, except as a defense against the pejorative description of the Right as intolerant. And what appears to drive many arguments about tolerance is a basic disconnect between what the sides themselves understand as harmful, and what the other side is willing to accept as harmful.

To use Mr. Horsey's cartoon as our example again, a stereotype of the American Right is that they find abortion, alternative sexuality, non-Christianity and Socialism to all be active harms of one sort or another. And while they may concede that these things exist and are unlikely to go away, the stereotypical Right-leaning echo chamber holds that support for these marks a person as perverse to one degree or another, and that if allowed free rein, their agenda will eventually erode the foundations of civil society. Likewise, a stereotype of the American Left is that they find homophobia, militarism and religious fundamentalism/zealotry to all be active harms in their own right. And therefore, the stereotypical Left-leaning echo chamber holds that these items are the marks of a perverse and harmful agenda.

Of course, since, in the end, each seeks to supplant the other, each stereotype sees its own viewpoints through a lens that carries very stringent and narrow understandings of "harm," and doesn't allow simply undermining the other side to count. If it's not making the streets run red with blood, then it does no real harm to anyone. By the same token, as each stereotype sees itself as an affirmative good for society, work and ideas designed to undermine it do come off as harmful. The stereotype of the American Right sees religious pluralism as a path to a world lacking in any moral constraints; while the stereotype of the American Left sees the hegemony of one faith as a prelude to the end of free thought.

It's unlikely that either side will refrain from casting tolerance as the other showing a lack of commitment to what they consider "Truth," anytime soon. Nor will they concede the risk that they demand the other take. Because these things are all part of the script now. And more than anything else, the Stereotypes always stick to the script.

Wednesday, August 9, 2017

Time Marches Slowly

So the firing of Google engineer James Damore for his lengthy treatise challenging Google's diversity programs and the echo-chamber/code of silence that's built up around them has been all over the place this week. And in one (non-public) social media post I was reading on the topic, one of Damore's defenders claimed that "since the seventies" we'd effectively traded one form of discrimination for another, and asked when would it stop?

So - in the 1970s, my mother applied for a teaching position in the town I grew up in - a distant suburb of Chicago. She was rejected, because at this point in the early 1970s, they weren't hiring Black teachers. I hadn't quite started school at this point. I didn't have any Black teachers until college. Because what were the schools going to do? Fire some of the teachers they already had to make room for the people they hadn't been hiring previously? Unlikely.

But it's interesting. Because many of the kids I went to school with looked at the world in the same way, even if it was over a much shorter timeframe. When I was a freshman in high school, the senior class were mostly of an age where they were born in 1964. As in the Civil Rights Act of 1964. The number of my classmates who felt that in less than 20 years, all vestiges of racial and ethnic discrimination in the United States had been wiped away was astonishing. And this idea, that things outside of our own living memory don't matter, takes some maturity to get rid of.

Because would you expect that an organization that started off with no non-White employees would now have an employee makeup that matches the community around it without some continuing measures? Especially given that numerical quotas were not allowed? A new teacher who started working in the public schools of my hometown when I was in first grade could conceivably still be working there. I know that some of my high-school instructors are still with the school (although they've moved up the hierarchy in the meantime). When would you think the last school administrator who was an active participant in "no Black teachers" finally left the district? 1975? 1985? 1995? And let's not forget the last teachers hired under that rule. And in that timeframe, how many people do you think they influenced?

There is an idea that in order to harbor negative stereotypes about one or another group of people, you have to be a snarling bigot, with a pointy hood folded up in the bottom dresser drawer. And that to be influenced by such people, one has to live in a place that modernization took a pass on. It's a convenient stereotype, but not an accurate one. There is a tendency to see people as immune from being influenced by bad ideas unless they somehow show themselves to be monsters. And to the degree that the monsters are viewed as relics of the past, the bad ideas are often assumed to have died with them. Even though it's understood, for example, that there are still people who believe that the Earth is flat... And in my own life, I have been much more likely to encounter people willing to share (and attempt to sell me on) the idea that people's life outcomes are shaped, wholly or mostly, by nature than the idea that the South Pole is a hoax.

In the end, the problem with diversity programs may be that they're not attempting to solve the right problem, because they take, as a starting point, the intractability of scarcity. If you view various "isms" as responses to scarcity that take on lives of their own after being allowed to take root deeply enough, then it seems that the best way to combat them is to do something about the scarcity. And that's a difficult thing to tackle, because so much of our culture is designed to run on scarcity.

Sunday, August 6, 2017

Mislabelling

But Mark Zandi, Moody's chief economist, who has advised John McCain and donated to McCain and Hillary Clinton's campaigns, told Politico the plan is a "mistake" that will cause the labor force to come to a "standstill" in the next decade. "It is hard to imagine a policy that would do more damage to long-term economic growth," he said.

As NPR's Brian Naylor noted, economists believe the country's low unemployment rate (4.4 percent) coupled with retiring baby boomers will result in a labor shortage in the coming years.
FACT CHECK: Have Immigrants Lowered Wages For Blue-Collar American Workers?
And if you're an American worker, how exactly is it bad to have demand for labor high enough that employers have to compete with higher wages and benefits, that you can be selective about which jobs to take, and that people who have been traditionally locked out of the labor market now have an in? After all, the internet bubble caused a serious labor shortage until a) companies started failing and b) internet infrastructure became robust enough that overseas remote work became widely viable. I don't remember that as being damaging to long-term growth. Not to say that it wasn't, because it could have been. But I have yet to hear an explanation of "The Great Recession" that names the dot-com era's ultra-low unemployment rate (and high participation rate) as the culprit.

There's a saying that many people don't come to understand how something works by reading about it in its idealized form in a book somewhere - they learn about it by watching it in practice on a day-to-day basis. And this sort of thinking, that what's good for people who work is bad for "the economy" as a whole, appears to be coloring many people's understanding of the purpose and goals of Capitalism. Personally, I would tend to call this Capital Primacy instead - the idea that whenever there is a conflict between the interests of Capital and Labor, Capital must always be privileged. And therefore a situation like a labor shortage, which makes Labor more dear than Capital, and therefore able to command a higher share of the resulting profits, is bad, because what's bad for Capital is always bad for everyone. And while this doesn't result in a wholesale failure to consider the problems of Labor, no solution that results in greater benefits for Labor than Capital will receive serious consideration.

And I think that, despite the idea that the Average American isn't very sophisticated about economic topics, there is something of a realization that Capital must always benefit, and that this is what's labelled "Capitalism." And therefore people come to perceive Capitalism as a system that is stacked against them. And if they're not going to benefit from that being the way things are done, why support it? Why back a system in which the best-case scenario is that a general status quo is maintained? Why back a system in which making up any of the ground lost to the people at the top of the pyramid is considered damaging, and isn't seriously questioned? NPR is largely considered to be left of center, yet Mr. Zandi's comment is allowed to go completely unchallenged. And this isn't to say that Mr. Zandi is incorrect. It may be precisely true that a labor shortage makes it more difficult to grow the economy over the long term. But that's something that deserves some level of explanation, rather than simply being asserted as truth and allowed to stand.

Perhaps the most corrosive piece of what we commonly term "élitism" is the idea that the non-élites don't know their own best interests and/or the value of short-term pain for longer-term gain. And therefore, there's no value in attempting to bring them on board with a plan that impacts them. I'm going to digress for a moment here, and blame a lot of this on the common perception of King George. I know, I know, bear with me for a moment. Because the American Revolution is commonly taught in grade-school social studies/history, it tends to come off as something of a morality play, with clear-cut good guys and bad guys. And since King George is the Bad Guy in Chief, he's seldom given rational reasons for not allowing the American colonies to have seats in Parliament. And because terms like "tyrant" and "dictator" have exclusively negative connotations in modern parlance, people who believe that they know better than everyone else don't see themselves in those terms, since King George's crime became one of being an abusive ruler, rather than one of shutting the colonies out of the legislative process. And so "élites" (a term which has become a pejorative itself) can see themselves as benevolent parents; the "adults in the room," as it were. And just like one doesn't let the kids vote on what's for dinner until they don't say "cake" or "ice cream" anymore, "élites" (or "élitists") may feel that the citizenry is too childish (and I've seen it put just that way) to participate in their own governance.

Of course, the citizenry may become resentful of that, in the same way that anyone might become resentful of being spoken down to. The whole point behind a representative republic is to allow people to participate (albeit indirectly) in making decisions the outcomes of which they're going to have to live with. A presentation of Capitalism that seems to be stacked against them by design is going to lead to people choosing to back something else instead; something that they feel is more "fair" to them. They might wind up throwing the baby out with the bathwater, but the way to prevent that is to be more precise in the use of language, and to explain the counterintuitive, rather than simply state it.

Different Understandings

Abigail Cooke: There's some evidence that native-born workers who don't finish high school do compete with low-skill immigrants for the same jobs - in some cases. And when that happens, some of those people lose their jobs, their wages don't go up over time if they're having to switch jobs. But the size of this effect is really small. And one of the other things that happens, at least as often, is that those native-born workers are prompted to find new jobs doing maybe slightly different things. And often, those end up being slightly higher-skill jobs that come with higher wages.

Stacey Vanek Smith: Oh, like what?

Abigail Cooke: Back of the house versus front of the house in restaurant service, even sometimes, you know, bumping up to a sort of lower management level, or something where you're required to have a bit stronger English skills.
Does Data Back Trump Administration Plan To Cut Legal Immigration In Half?
I'm not sure that this is the sum total of the effect in play here. Ms. Cooke implies that the very same workers who are displaced from low-skill jobs find themselves in higher-paying, higher-skill jobs, and elsewhere in the piece implies that those new jobs exist partially because of the influx of immigrant consumers. But then at the end of the piece, she makes this point:
But I also think it's easier to point the finger at people who seem different than it is to really think about how the structures of our economy are really not helping a lot of low-skilled, low-wage people in this country. So I think there's a little bit of scapegoating going on.
So... the presence of immigrant workers in the United States does nothing to most workers, and for every worker for whom there's a direct negative or neutral impact, there's another worker for whom there's a direct positive impact. But the economy isn't helping low-wage workers, and so they're taking it out on the very immigrant communities who are the rising tide that's lifting their boats. Hmm.

Part of this is the simple fact that you can't really get into the nitty-gritty of the economics of mass migration, and the human psychology that accompanies it, in four minutes. And so you wind up only briefly touching on topics that turn out to be very important - such as: do low-skilled immigrants to a country, in their role as new consumers, spark enough demand for goods and services that they themselves cannot provide that they take more slack out of the labor market than they put into it? Do any front-of-the-house restaurant jobs or low-level management jobs that are created genuinely go to the same people who were ousted from the back-of-the-house or individual contributor jobs that the immigrants now hold, or do they go to other people whose skills are a more immediate match?

One of the recurring factors of the immigration debate is the disconnect between personal anecdote and aggregate data. And this allows both sides to be right, and to talk past the opposition, due to the different levels of granularity. It does seem strange that not-quite-lowest-skilled workers who found themselves pushed up the employment hierarchy by immigrant labor would be complaining about that fact. And so I suspect that the new "native-born" waiters in a restaurant that uses immigrant labor in the back of the house aren't the same people who would have been hired for the back of the house had the immigrants not been there. And it's a safe bet that the people who are displaced from the sorts of jobs that lower-skilled Americans often complain have been lost to immigration - things like food processing and low-skill manufacturing jobs - haven't found themselves in the role of supervising those new hires, or being liaison to English-speaking upper management - a job for which a knowledge of the immigrants' language is just as important as "a bit stronger English skills."

If you're a person who has never studied economics - especially if you're only a high-school graduate (or dropout, for that matter) - it's likely that a simple understanding of "supply and demand" is all you've ever been taught. And Ms. Cooke is right that the actual economy is more complicated than "when the supply of labor goes up, the price of labor goes down." But it's not helpful to presume that this knowledge is widely available, yet being ignored, when it's somewhat difficult to find explanations of what's actually going on that are accessible to laypersons.
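To illustrate why the one-line version is incomplete, here's a minimal sketch of a toy linear labor market in which immigration shifts labor supply outward but, because immigrants are also consumers, shifts labor demand outward too. Every number in it is invented for illustration; the point is only that the net wage effect depends on the relative sizes of the two shifts, which is an empirical question that the one-line model can't answer.

```python
# A toy linear labor market. All numbers are invented for illustration.
# Demand curve:  wage = a - b * quantity   (employers hire less as wages rise)
# Supply curve:  wage = c + d * quantity   (more people work as wages rise)

def equilibrium_wage(a: float, b: float, c: float, d: float) -> float:
    """Solve a - b*q = c + d*q for the market-clearing wage."""
    q = (a - c) / (b + d)
    return a - b * q

baseline = equilibrium_wage(a=30.0, b=0.002, c=5.0, d=0.003)

# Naive story: immigration only shifts supply outward (lower c) -> wages fall.
supply_only = equilibrium_wage(a=30.0, b=0.002, c=3.0, d=0.003)

# Fuller story: immigrants also consume, shifting demand outward (higher a).
both_shifts = equilibrium_wage(a=33.0, b=0.002, c=3.0, d=0.003)

print(f"baseline wage:          {baseline:.2f}")     # 20.00
print(f"supply shift only:      {supply_only:.2f}")  # 19.20 - wage falls
print(f"supply + demand shifts: {both_shifts:.2f}")  # 21.00 - wage rises
```

Depending on the numbers chosen, the fuller story can come out either way - which is exactly why the anecdote and the aggregate data can both feel true to the people citing them.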

Whether or not low-skilled workers born in the United States are competing with low-skilled immigrant labor for the same jobs is a matter of individual's experience and perception of the world. To the person who was turned down for a job at particular employer, who sees people who look like immigrants going to and from that workplace on a daily basis, and has had to settle for a job less to their liking or less closely-matched to their skills, an academic saying that someone like them might be supervising those workers is cold comfort. And telling them they should blame distant and impersonal "structures of our economy" doesn't give them any actions they can take to better their circumstances in the here and now.

This isn't to say that the aggregate data about how immigration impacts the country overall isn't important. But in a representative republic, studies that tell the people who are suffering that they're a small enough segment of the population that they can be safely ignored (which while likely never the intent, is often the result) simply reinforce the idea that they've been forgotten - or are being deliberately sacrificed.
Immigration undoubtedly has economic costs as well, particularly for Americans in certain industries and Americans with lower levels of educational attainment. But the benefits that immigration brings to society far outweigh their costs, and smart immigration policy could better maximize the benefits of immigration while reducing the costs.
An Open Letter from 1,470 Economists on Immigration
And that leads them to vote for people who will confirm their understanding of the world and offer to help them with it. If that's a problem, it's time to start doing something about the problems that are creating it.

Friday, August 4, 2017

Sunset Comes Early

The sky has not been quite blue this week. And the Sun fades to an ember before it reaches the treetops, although there is still some distance to the true horizon.

British Columbia is burning, and the smoke has blanketed the Puget Sound region in a haze that has prompted warnings about the air quality for the young, the ill and the elderly. But it has had another effect, one less talked about, but perhaps no less important. It's been very warm in Seattle for the past few days, but the haze has tamed the heat somewhat. Records were met, but all-time highs didn't materialize, and the evenings cooled more rapidly than is normal. The heat is fading, but not gone, and the haze should have a few more days. And then things will return to normal. For a time, anyway.

Wednesday, August 2, 2017

Grapes

I think I may try to teach myself how to shoot still life this summer/autumn. Not because I want to be a food photographer, but just because it seems interesting.