Tuesday, June 11, 2019

A Matter of Trust

At one point in our interview, [the Guardian senior technology reporter Julia] Wong told me she thinks the tech platforms are in some ways damned if they do and damned if they don’t regulate content. It’s tough because it’s a slippery slope. The shootings and the pedophilia, those are the easy calls. After that it starts to get really messy really fast, and then you’re talking about huge powerful companies with huge influence over literally what we see and read and think.

However, after I read this Buzzfeed story last month about the 14-year-old girl with a million followers who tells gay people and Muslims to kill themselves and promotes so-called red pill beliefs that women are inferior, and the story said YouTube gave her channel a “strike” (that meant she couldn’t upload for a whole entire week) I did feel like maybe I would appreciate a bias against that content.
Molly Wood, “Are YouTube’s excuses for terrible content finally wearing thin?”
If there's one thing that irks me about "dystopian" fiction, it's that the Evil Overlords™ are so often just that: Evil. Oppression of the innocent (and morally upright) masses is an end in itself, apparently simply so black-hearted government apparatchiks can enjoy living the high life to the sound of the public groaning under an unjust system. As I understand it, however, governments aren't bastards for the sake of being bastards; they're attempting to make a better world, and, as I've heard many people put it, some things are too important to worry about preserving people's rights. (After all, the dead have no use for them...) Of course, portraying an oppressive regime as responsive to what some segment of the population wants from it paints the public as at least partially complicit in the whole enterprise, and so I can understand why that portrayal doesn't happen all that often. But in the real world, sometimes, it is some or another segment of the public that drives the dystopia.

I understand the impulse to allow "huge powerful companies" to wield "huge influence over literally what we see and read and think," if the result is that a reactionary teenager "who tells gay people and Muslims to kill themselves and promotes so-called red pill beliefs that women are inferior," and others like them, are kept out of the broader marketplace of ideas. But that presupposes a faith that this "huge influence" will always and only be deployed in ways that are likewise acceptable. A formal government program of censorship, or a corporate initiative to keep the public (or advertisers and investors) happy, will always claim to be focused on the "terrible content." It's deciding what makes something terrible that's going to be the stumbling block. The idea that hate speech and bigotry are self-evident can lead to an understanding that once the people with the red pens have shown themselves to be public-minded, they can then be trusted to do the public's work.

It can be said that it's the responsibility of each and every one of us, as individuals, to simply ignore the haters and go on about our business. But that is, effectively, an impossible request. And the consequences of people failing to live up to that assumed responsibility can be grievous. Someone spends a bit too much time taking "Soph's" videos to heart, and then there is violence. And for many people, the choice between shutting down Soph's access to a broad audience and recriminations over injuries or deaths is an easy one; pull the plug on her and be done with it. But freeing people from a responsibility that they're potentially unequipped to handle has its costs: ceding power and relaxing accountability. Which is perfectly legitimate; it's simply risky. It's possible to take power back from a party it's been given to, but doing so requires caution, and history tells us that people are often incautious in that regard.
