Probing the Morality and Responsibility of Social Media and Search Engines

Another in a series of reports from Common Cause New York's 2017 summer interns.

Editor’s note: Each summer, Common Cause New York is fortunate to be infused with the talents and energy of a group of interns. They help us with research on our issues, organizing our activists, and pretty much everything else that needs doing. As they headed back to their campuses, we asked them to reflect on their time with Common Cause and the challenges facing our democracy.

By Erica Hobby, Research and Policy Intern

In the wake of last year’s presidential election, companies such as Facebook and Google came under fire for their perceived role in deepening partisanship and spreading fake news. Our office decided to explore these issues, and how far they extend, to gain a fuller understanding of the problem and potential solutions.

Acknowledging that these companies need to be held responsible for their actions, particularly given the pervasive nature of social media, we first needed to grasp what problems they produce and perpetuate. Drawing on recent writing on these issues, another Research and Policy Intern here at Common Cause New York and I compiled data, analysis, and anecdotes into a working memo, which helped us map the issues, and how they are discussed, for Facebook and Google in particular. Using polarization and increased partisanship as our jumping-off point, our research led us to other issues: fake news, violent and offensive content, the human hand behind the sites’ algorithms, and the mental health of content moderators.

We discovered a surprising breadth of issues related to these companies. Not only do we need to worry about placing ourselves in “echo chambers” via our Facebook accounts, but also about the fact that these companies’ algorithms, the backbone of their services, are ultimately vulnerable to manipulation that can promote certain content over others. This manipulation becomes problematic when the content pushed to the top of your Facebook News Feed or your Google search results is violent, fake, and/or extreme.

Researching these issues, and how they play out in our day-to-day lives on the internet, makes clear that we cannot assume these companies can or will make decisions in our best interests. It lays the groundwork for deeper questions and, ultimately, for guiding principles we should all ask these companies to adopt so that they take responsibility for their sites’ content and consequences.

###
