All around the walls were the signs reflecting the fast-paced culture of Facebook: "Proceed and Be Bold," "Done is Better than Perfect," "What Would You Do If You Weren't Afraid?" and the classic Facebook philosophy "Move Fast and Break Things." But above the speakers at last Wednesday's discussion there was a different message: "Imagine a World Without Hate." That's the theme of a campaign launched by the Anti-Defamation League to celebrate its centennial of fighting bigotry and hatred.
The event was "Free Speech, Civility, and the Challenge of Cyberhate," and the tension between the fast and loose culture of the internet and the desire to build cultures of respect and dignity was evident throughout the conversation. Deborah Lauter, ADL National Director of Civil Rights, moderated a discussion that included Susan Benesch of Harvard University, Monika Bickert of Facebook, and Christopher Wolf, co-author with ADL National Director Abraham H. Foxman of the book Viral Hate.
Bickert, a Global Policy Manager at Facebook, described the scope of the challenge. Facebook has over 1.1 billion users, of whom 80% live outside the U.S. and Canada. Finding a balance between freedom of speech and maintaining a safe environment for users is an ongoing concern. Benesch noted that although the internet has been active for some twenty years, there still has not been much research into how to deal with inappropriate comments or activities. The old models leaned on punishment or censorship: removing objectionable material or shutting down offending users. Whenever such actions are taken in cyberspace, she said, there is a tendency for others to respond negatively to the censorship and to repost or pass along the material that was removed, so that it gets even wider distribution.
What seems to work more effectively, Benesch noted, is working with members of the online audience to improve behavior. Eighty percent of people online are not active haters, but they sometimes pass along posts or pictures without really thinking about how those might be offensive or denigrating. Before the internet, she said, it was easier for people who enjoyed certain kinds of humor to remain in their own group. Those inside the group shared common sensibilities, and those who might be offended were not likely to be exposed to objectionable statements. The bad news is that now those conversations are more public and shared with a wider audience. However, that is also the good news: there is the possibility of someone challenging those statements, pictures, or attitudes.
Wolf noted that studies have shown that someone is five times more likely to post objectionable material if able to remain anonymous. This is why many online magazines and news sites have adopted commenting systems that require users to be identifiable, and it is the reason for Facebook's "real name" policy, said Bickert. When users have their own name attached to something, they are more likely to be careful. A few areas of Facebook, known as "pages," allow anonymous posting. There have been instances when someone posted objectionable material anonymously, and when Facebook told them they could keep the post online but would have to attach their name to it, they voluntarily withdrew it.
Wolf described the long history of ADL efforts to combat hate speech. Before the internet, groups that advocated hatred "met in dark alleys and sent out their materials in plain brown wrappers." Now, however, they are able to set up sophisticated websites to broadcast their message. Still, these hate groups represent a very small segment of the internet community. A somewhat larger number of people are active haters, while still others are "inadvertent" haters, unintentionally parroting or passing along abusive messages. Against these people it is now possible to muster large numbers who will stand up and oppose hate speech online.
"People say 'that's just the way it is'" online, said Benesch, but the reality is that the internet is constantly evolving, and people can change it. All of the speakers agreed that it is vital to help people understand that they have the power to challenge hate and to change the online atmosphere. Facebook policies are aimed at making sure that people can exercise control over what they see, that they are empowered to report offensive materials, sometimes even with the opportunity to engage in conversation, mediated through Facebook, with the offending poster, and that they learn ways to engage in counter-speech.
The panelists agreed that more education in "cyber-literacy" and "cyber-civility" is needed, especially for younger people who may be new to the online world, often accessing the internet on their own devices without adult guidance. They need to learn that they are in control of what they access, that they can make decisions about the online world, that they can report abusive material and that they can speak out against it.
The internet offers some unique opportunities for studying hate speech and how it spreads. Benesch is the director of "The Dangerous Speech Project," which began as a project of the UN Special Advisor for the Prevention of Genocide and now receives funding from the MacArthur Foundation. She noted that it is now possible to study not only the nature and content of hate speech, but also to trace how it spreads, who is talking to whom, and what factors might lead something to go "viral." With greater understanding of the dynamics of hate speech comes greater opportunity to know when and how to intervene, and to develop tools for people to use in their online world to counter hatred.
The ADL has a "Cyber-Safety Action Guide" on its website to provide information about how to report hate speech at a number of online venues, including Facebook, Google+, LinkedIn, Pinterest, Twitter, Tumblr, and YouTube.