Nov. 30th, 2016

tim: "System Status: Degraded" (degraded)
[CW: suicide]

Elizabeth Waite was a trans woman who committed suicide last week. I did not know Elizabeth, but several of my friends did. In an article for the Daily Beast, Ben Collins described what happened after she died (CW if you follow the link to the article: it quotes extremely transmisogynistic and violent comments and images, including some that incite suicide.)


The night the article describes, I sat in my office after work with Elizabeth's profile open in a tab, watching the stream of hateful comments pour in almost faster than I could report them to Facebook. My friends had mentioned that members of an online forum known for terrorizing autistic trans women were flooding her profile (particularly her last post, in which she stated her intention to commit suicide) with hateful comments. Since I didn't know Elizabeth and wasn't emotionally affected by reading these comments in the way I would have been if I had known her, I felt that bearing witness and reporting the comments as abuse was work that I could usefully do. Since many of the comments were obviously from fake accounts, and Facebook is well-known for its desire for good data (read: monetizable data), specifically accounts attached to the names people use in everyday life, I reported those accounts as fake as well.

And later that night, I watched dozens and dozens of automated responses from Facebook's abuse reporting system fill my inbox. Most of them said this:


Thank you for taking the time to report something that you feel may violate our Community Standards. Reports like yours are an important part of making Facebook a safe and welcoming environment. We reviewed the comment you reported for displaying hate speech and found it doesn't violate our Community Standards.
Please let us know if you see anything else that concerns you. We want to keep Facebook safe and welcoming for everyone.


[screenshot of the quoted text]

Because the posts in question were eventually made private, I can't quote the comments about which a Facebook content reviewer said "it doesn't violate our Community Standards", and in fairness to the person or people reviewing the comments, some of the comments weren't obviously hate speech without the context that they were in a thread of people piling on a dead trans woman. Facebook lacks a way to report abuse that goes beyond "the text of this individual comment, in the absence of context, violates Facebook's Community Standards." That's part of the problem. If trans people were in positions of power at Facebook, you can bet that there would be a "report transmisogynist hate mob" button that would call attention to an entire thread in which an individual was being targeted by a coordinated harassment campaign.

Likewise, even though Facebook is notorious for harassing trans people for using the names we use in everyday life as our account names, when I reported an account with the name "Donny J. Trump" for impersonation, I got an email back saying that the account would not be suspended because it wasn't impersonating anybody:

[screenshot of the aforementioned text]

Facebook's tools don't address this problem. Imagine you're the family member of a trans woman who just died and whose profile is receiving a flood of hateful comments. Dozens of users are posting these comments -- too many to block, and anyway, what good would blocking do if you don't have access to the deceased person's account password? The comments would still be there, defacing that person's memory. Reporting individual comments has no effect if the harassment is conducted by posting a series of memes that aren't necessarily offensive on their own, but have the effect of demeaning and belittling a person's death when posted as comments in response to a suicide note. And getting an account converted to a "memorial account" -- which allows someone else to administer it -- can take days, which doesn't help when the harassment is happening right now. Again: you can look at Facebook and know that it's a company in which the voices of people who worry about questions like, "when I die, will people on an Internet forum organize a hate mob to post harmful comments all over my public posts?" are not represented.

But Facebook doesn't even do what they promise to do: delete individual comments that clearly violate their community standards:

Facebook removes hate speech, which includes content that directly attacks people based on their:

Race,
Ethnicity,
National origin,
Religious affiliation,
Sexual orientation,
Sex, gender, or gender identity, or
Serious disabilities or diseases.


Out of the many comments in the threads on Elizabeth Waite's profile that clearly attacked people based on their gender identity or disability, Facebook responded to most of my reports with "it doesn't violate our Community Standards."

At this point, Facebook ought to just stop pretending to have an abuse reporting system, because what they promise to do has nothing to do with what they will actually do. Facebook's customers are advertisers -- people like you and me who produce content that helps Facebook deliver an audience for advertisers (you might think of us as "users") are the raw material, not the customers. Even so, it's strange that companies that pay for advertising on Facebook don't care that Facebook actively enables this kind of harassment.

If you read the Daily Beast article, you'll also notice that Facebook was completely unhelpful and unwilling to stop the abuse other than in a comment-by-comment way until one of the family members found a laptop that still had a login cookie for Elizabeth's account -- they wouldn't memorialize it or do anything else to stop the abuse wholesale in a timely fashion. What would have happened if the cookie had already expired?

Like anybody else, trans people die for all kinds of reasons. In an environment where hate speech is being encouraged from the highest levels of power, this is just going to keep happening more and more. Facebook will continue to refuse to do anything to stop it, because hate speech doesn't curtail their advertising revenue. In fact, as I wrote about in "The Democratization of Defamation", the economic incentives that exist encourage companies like Facebook to potentiate harassment, because more harassment means more impressions.

Although it's clearly crude economics that make Facebook unwilling to invest resources in abuse prevention, a public relations person at Facebook would probably tell you that they are reluctant to remove hate speech because of concern for free speech. Facebook is not a common carrier and has no legal (or moral) obligation to spend money to disseminate content that isn't consistent with its values as a business. Nevertheless, think about this for a moment: in your lifetime, you will probably have to see a loved one's profile get defaced like this and know that Facebook will do nothing about it. Imagine a graveyard that let people spray paint on tombstones and then stopped you from washing the paint off because of free speech.

What responsibilities do social media companies -- large ones like Facebook that operate as completely unregulated public utilities -- have to their users? If you'd like, you can call Facebook's billions of account holders "content creators"; what responsibilities do they have to those of us who create the content that Facebook uses for delivering an audience to advertisers?

Facebook would like you to think that they give us access to their site for free because they're nice people and like us, but corporations aren't nice people and don't like you. The other viewpoint you may have heard is: "If you're not paying for the product, then you are the product." Both of these stories are too simplistic. If you use Facebook, you do pay for it: with the labor you put into writing status updates and comments (without your labor, Facebook would have nothing to sell to advertisers) and with the attention you give to ads (even if you never click on an ad).

If you're using something that's being given away for free, then the person giving it away has no contractual obligations to you. Likewise, if you are raw material, then the people turning you into gold have no contractual obligations to you. But if you're paying to use Facebook -- and you are, with your attention -- that creates a buyer/seller relationship. Because this relationship is not formalized, you as the buyer assume all the risks in the transaction while the seller reaps all of the economic benefit.


Do you like this post? Support me on Patreon and help me write more like it. In December 2016, I'll be donating all of my Patreon earnings to the National Network of Abortion Funds, so if you'd like to show your support, you can also make a one-time or recurring donation to them directly.

Tim Chevalier