Facebook and Human Rights
When Mark Zuckerberg sat in his Harvard dorm room conjuring up a way to connect networks of college friends, it is doubtful he ever imagined what Facebook would become. Facebook's role in helping Donald Trump win the 2016 presidential election is well documented, and the platform grows more powerful daily, shaping privacy and even human rights. In a recent op-ed, Chris Hughes, a Facebook co-founder, argues that Facebook should be broken up: the social media network has become too powerful, and power concentrated in so few hands corrupts.
Technology serves as a net positive for many people across the globe, but humans have darker tendencies, and some exploit these technologies for evil. Even as users migrate from Facebook (and now Instagram) to other platforms, Facebook remains the most powerful social media network. The Interfor team will now explore recent news about the platform.
Facebook is rewriting history
We will look back at this period as a treasure trove of data and history. Historians and digital archaeologists will be able to reconstruct how we lived our lives, unless Facebook's algorithms erase those digital memories first. A recent Atlantic article argues that this is exactly what is happening to digitally documented human rights violations, a throwback to the rewriting of history George Orwell depicted in his classic 1984.
In defense of Facebook and other tech companies, their algorithms are doing their intended job by taking down "questionable" content, such as videos by extremist organizations like ISIS broadcasting their killings. The challenge is that these images and videos, however gruesome, often serve as evidence for building cases against terrorist organizations and brutal regimes like Bashar al-Assad's in Syria. As this unfolds, tech companies must make tough decisions that will change history.
The new marketplace for stolen antiquities
Had Facebook existed after World War II, it might have served as a marketplace for the sale of artwork looted by the Nazis. Today, the social media platform has become a marketplace for antiquities looted during the recent wars in the Middle East: almost 100 Facebook groups are peddling these goods via WhatsApp (owned by Facebook). It is not surprising that terrorist organizations such as ISIS, which mastered social media to recruit and terrorize their foes, are now using these platforms to monetize their plunder.
Looting in war zones has been going on for centuries, but Facebook has enabled buyers and sellers to connect across the globe. While Facebook has taken a few steps to shut down some groups, more needs to be done.
Despite talk about combating extremists, Facebook is helping them
In a political move, Facebook banned some right-wing extremists. Despite the congratulatory back-slapping, its algorithms are auto-generating videos that promote extremist groups. As this CBS News article states, “Facebook likes to give the impression that it’s staying ahead of extremists by taking down their posts, often before users even see them. But a confidential whistleblower’s complaint to the Securities and Exchange Commission obtained by The Associated Press alleges the social media company has exaggerated its success. Even worse, it shows that the company is inadvertently making use of propaganda by militant groups to auto-generate videos and pages that could be used for networking by extremists.”
Facebook will not grant access to its data, so for human rights groups following these trends it has been a black box. As with other controversies involving the massive social network, the public's only means of influencing Facebook's decisions is to stop using its platform (or Instagram).
Some vocal opponents say Facebook should be dismantled, arguing that the social media platform is too pervasive and has grown too powerful. The antitrust allegations against tech giant Microsoft in the 1990s were about fair competition in the marketplace; today's scenario is not only about capitalist competition but about an organization with the power to alter history. We are dealing with a challenge we may not be prepared for, or even fully understand.