An article in the Israeli newspaper Haaretz reported that last month Tali Coral, the operator of the page When He Pays, was blocked by Facebook again, only two weeks after the social network had admitted that its previous sanctions against her and the page, which is used to combat the Israeli prostitution industry and its customers, had been imposed by mistake.
Image taken from When He Pays Facebook-page campaign
According to Coral, she was blocked from publishing on the campaign page for three days because a customer reported one of her posts: she had published a screenshot of an email he sent her under a pseudonym, in which he described in detail his encounters with a “service provider” while, paradoxically, reprimanding her for accusing him of being a client of the sex industry. Two weeks earlier, Coral’s personal account had been blocked for 24 hours, and some of the posts on the campaign page had been deleted for “violation of Facebook’s community standards”. Since it was her third warning, the social network threatened to delete the entire page.
When He Pays is part of a larger campaign created by Coral in 2014 while she was part of the Israeli non-profit NGO Machon Toda’a (Awareness Centre). The NGO assists victims of the sex industry and combats prostitution primarily by raising public awareness, in order to change social and legal perceptions of prostitution and of the people (mainly, but not only, women) who engage in it. The main campaign is conducted through a blog on Tumblr, but, according to Coral, to achieve mass exposure for an awareness-raising campaign, Facebook is the solution, followed by Twitter. Thus, three interconnected pages were created to turn the spotlight on the customers of the Israeli prostitution industry. Coral anonymously posts authentic quotes taken from online forums of paid-sex consumers on the largest Israeli sex portal, Sex Adir (Great Sex), in which clients of the sex industry compare their experiences and provide “recommendations” about the women they had encounters with. The Facebook page, specifically, serves as a platform to engage the public in discussion and to publish related material as well as local and global news about prostitution in general.
Two issues in this article are especially interesting. First, Coral practices precisely what Shirky (2010) argues for: she uses social media (specifically the Facebook page) both as a platform for spreading information (posting the quotes and additional prostitution-related news) and as a platform for public conversation, in order to support civil society in the Israeli public sphere around the issue of prostitution. Her practice is grounded in the understanding that a legal change (or any kind of social change) cannot occur without a change in public perception, and that a perception change cannot come from communicating information alone, but only in combination with active conversation, especially on an issue such as prostitution that is steeped in false perceptions, stereotypes, and prejudice, as she states in an interview for the S-Emek podcast. This also follows Katz and Lazarsfeld’s (1955) well-known two-step flow of communication: people form political opinions not solely by being exposed to information (the first step), but only after the second, social step in which opinions are echoed by friends, family members, colleagues, and so on. That is the step, Shirky (2010) argues, where social media can make a difference in the long run, because it is where people nowadays articulate and debate different opinions (p. 4); hence Coral’s use of one of the largest social platforms in the world today, Facebook.
However, what happens when one of the largest social platforms in the world, which has become a place for public debate (i.e. a public sphere), is first and foremost a commercial company that must answer to multiple stakeholders? For Facebook, it has led to the practice of censorship, which is the second interesting issue in the article. As Meikle (2016) reveals through discourse analysis, users, readers, and shareholders are only a few of the stakeholders the network answers to, and these “answers” conflict with one another: on the one hand, controversial content is financially beneficial because it attracts users and goes viral; on the other hand, it is financially harmful because the same content may discourage shareholders and commercial companies from investing in the network. In an attempt to bridge these interests, the network established “community standards” in which it simultaneously encourages its users to raise awareness about issues that matter to them, but only through “respectful behaviour”, that is, as long as raising awareness does not include nudity, hate speech, violent language, or graphic content. Deleting posts and blocking pages, though, still seems to be done by automatic algorithms rather than manually, as the network claims, because there are still cases of blocking and deletion that appear arbitrary rather than systematic. So what happens when raising awareness requires the use of violent language, as in Coral’s campaign? The innovative component of her campaign is the publication of the brutal language of the prostitution industry’s customers, language that is hidden from the public eye, which is exactly what horrifies the public and, hopefully, what will make people look at this industry from a different perspective (i.e. change their perception).
Without these quotes, the campaign would have been just another information-dissemination tactic, which arguably is not as effective for such tabooed and controversial issues, hence the ongoing false public perceptions of the prostitution industry.
Even though Coral shows empathy for the network’s situation, understanding that it is a private commercial business and not a philanthropic one, she admits that the situation remains problematic for her campaign (the network responded to her case only after it was published in the press). On that note, should we hope that Facebook builds a better mechanism, one able to differentiate between those who publish violent content for “sadistic pleasure or to celebrate or glorify violence” and those who are condemning it or raising awareness about it? Or should we hope for no censorship at all? That, too, may be problematic, as in the case of Reddit, where Massanari (2015) revealed that the platform’s deliberate lack of censorship is precisely what brought it a different kind of critique, specifically in the cases the researcher investigated, #Gamergate and The Fappening. What, then, should we hope for in a social platform that can be a place for public conversation that could, hopefully, lead to public mobilisation and social change?
P.S. Here is a short and interesting TED talk by Prof. Shulamit Almog about raising awareness of prostitution by changing the narrative around it: [youtube]https://www.youtube.com/watch?v=FDp4MbfzwRk[/youtube]