A New York Times article describes a complex rulebook to help employees decide whether to address political speech on the site. The decision process is daunting, and guidelines include extensive examples of content—text and visuals—that may indicate hate or inspire violence.
The author questions whether Facebook employees are making rational, consistent decisions:
The guidelines for identifying hate speech, a problem that has bedeviled Facebook, run to 200 jargon-filled, head-spinning pages. Moderators must sort a post into one of three “tiers” of severity. They must bear in mind lists like the six “designated dehumanizing comparisons,” among them comparing Jews to rats.
Others say Facebook has too much power because the company controls speech in international political situations. For example, before an election in Pakistan, during a 24-hour media blackout, Facebook may have been the go-to source. Before this time, Facebook distributed 40 pages of “political parties, expected trends, and guidelines” to its employees. But guidelines sometimes contradict each other, and Facebook relies on Google Translate, which may not be accurate or precise enough.