Thursday, April 4, 2019

Show Me

Even if an ad is targeted broadly, Facebook will serve it to the audiences most likely to click on it, generalizing from information from their profile and previous behavior. The system builds correlations to find this ideal audience: if techno fans are particularly likely to click on a specific ad for headphones, that ad might be served more to other techno fans in the future, even if it wasn’t an explicit targeting parameter.
[...]
Housing ads with a photograph of a white family, for instance, were apparently served to more white users than the same ad with a black family.
[...]
An ad for lumber industry jobs was shown to an audience that was 90 percent male, while ads for supermarket cashiers reached an 85 percent female audience.
"Facebook’s ad delivery could be inherently discriminatory, researchers say"
Based on the title of the article from The Verge, you can see where this is going. And it raises an interesting question: are advertising platforms required to place advertising in front of people who are less likely to respond to it, when that likelihood correlates with some or another protected category?

It's easy for people to go after Facebook because of (in my opinion) the general suspicion that Facebook has some untoward level of control over people's lives. But the question is bigger than that, because it would technically apply to any form of advertising. If I run an ad in The Atlantic, for instance, I suspect that it's going to reach a Whiter, more affluent audience than the same advertisement in Ebony would. Likewise, if I buy space on a website devoted to tabletop role-playing games, I'm going to have the same effect.

And so that does raise the question of whether targeting some or another specific audience should always be viewed as suspect. To the degree that the factors that make an advertisement more effective with specific audiences are not neutral in regard to race, sex, income level, political affiliation, et cetera, selecting the people most likely to respond to an ad as the ones who see it will likely always be somewhat "inherently discriminatory." If people are more likely to click on ads that show people who resemble them, or whom they otherwise identify with, it's to be expected that an algorithm that understands demographics will wind up reinforcing that tendency.
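The reinforcing loop I'm describing is simple enough to sketch. Here's a hypothetical, deliberately simplified version of how a delivery system might rank audience segments: score each segment by its observed click-through rate for an ad, then serve the ad preferentially to the highest-scoring segments. The segment names and numbers are invented for illustration; real ad-delivery systems are far more complex.

```python
# Hypothetical sketch of click-through-rate-based ad delivery.
# Segments that clicked more in the past get shown the ad more in the
# future -- no demographic targeting is specified, but any demographic
# pattern in past clicks gets amplified anyway.

def serve_order(click_stats):
    """Rank audience segments by observed click-through rate.

    click_stats maps a segment name to (clicks, impressions).
    Returns segment names, highest CTR first.
    """
    rates = {
        segment: clicks / impressions
        for segment, (clicks, impressions) in click_stats.items()
    }
    return sorted(rates, key=rates.get, reverse=True)

# Invented numbers: techno fans happened to click this headphone ad most.
stats = {
    "techno_fans": (90, 1000),        # 9.0% CTR
    "jazz_fans": (20, 1000),          # 2.0% CTR
    "podcast_listeners": (35, 1000),  # 3.5% CTR
}
print(serve_order(stats))  # → ['techno_fans', 'podcast_listeners', 'jazz_fans']
```

The point of the toy example is that nothing in the code mentions race, sex, or any other protected category; the skew comes entirely from feeding observed behavior back into delivery.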

The internet, or internet advertising, to be more precise, is not going to undo the sorts of racial and sexual stereotyping that's become part and parcel of modern society. That job is going to have to belong to the public as a whole, and they're going to have to be actively invested in it. Forcing advertising platforms to show people advertisements that they're statistically unlikely to be interested in, effectively ignoring the data that comes back to them, isn't going to create an egalitarian society. People, changing their preferences, might. But I understand the impulse to go for the easier lift.
