Moral obligation to leave Facebook and other social media platforms

Over the past few years, Facebook has unintentionally been involved in several controversies, ranging from data breaches to the spread of hate speech, propaganda, and fake news. Facebook and other social media platforms have also been linked to worsening levels of depression and anxiety. In the article “Do You Have a Moral Duty to Leave Facebook?,” philosopher S. Matthew Liao asks whether individuals have a moral duty to leave Facebook because of the platform’s controversies. The article is available at https://www.nytimes.com/2018/11/24/opinion/sunday/facebook-immoral.html, and its full text is included below.


Read the article carefully, considering the arguments presented and how they might relate to you and/or society. Then, in a college-level argument essay, respond to the following question:

Do people have a moral obligation to leave Facebook and other social media platforms?

Provide reasons supported by examples from your own experience and/or observations. Summarize and cite evidence from the text and explain how it supports your argument.

 

You must summarize and give credit to the assigned article. You cannot rely on any other sources.

 

Consider how personal experiences and/or observations support your response.

You cannot get help from the tutoring center or your teacher on this assignment. You can get help from your classmates.

 

Your essay should be word-processed, double-spaced, and in a standard 10-12-point font. There is no required length. Please include a heading in the upper right-hand corner of your paper that includes your Banner ID#, the date, CCDE 110, and “Common Exit Essay.”

Do You Have a Moral Duty to Leave Facebook?

The platform has been used to disrupt elections, disseminate propaganda and promote hate. Regular users should ask if they are implicated in these failings.

By S. Matthew Liao
Dr. Liao is a philosopher.
Nov. 24, 2018

I joined Facebook in 2008, and for the most part, I have benefited from being on it. Lately, however, I have wondered whether I should delete my Facebook account. As a philosopher with a special interest in ethics, I am using “should” in the moral sense. That is, in light of recent events implicating Facebook in objectionable behavior, is there a duty to leave it?

In moral philosophy, it is common to draw a distinction between duties to oneself and duties to others. From a self-regarding perspective, there are numerous reasons one might have a duty to leave Facebook. For one thing, Facebook can be time-consuming and addictive, to no fruitful end. In addition, as researchers have demonstrated, Facebook use can worsen depression and anxiety. Someone who finds himself mindlessly and compulsively scrolling through Facebook, or who is constantly comparing himself unfavorably with his Facebook friends, might therefore have a duty of self-care to get off Facebook.

From the perspective of one’s duties to others, the possibility of a duty to leave Facebook arises once one recognizes that Facebook has played a significant role in undermining democratic values around the world. For example, Facebook has been used to spread white supremacist propaganda and anti-Semitic messages in and outside the United States. The United Nations has blamed Facebook for the dissemination of hate speech against Rohingya Muslims in Myanmar that resulted in their ethnic cleansing. Facebook also enabled the political data firm Cambridge Analytica to harvest the personal information of millions of voters in the United States so they could be targeted with personalized political advertisements. A significant amount of fake news can be found on Facebook, and for many users, Facebook has become a large echo chamber, where people merely seek out information that reinforces their views.

Some people might think that because they mostly share photos of their cats on Facebook, such concerns do not apply to them. But this is not so, for three reasons.

First, even if one does not contribute directly to the dissemination of fake news or hang out in echo chambers, simply being on Facebook encourages one’s friends to stay on Facebook, and some of those friends might engage in such activities. This influence on others is known as a (positive) network effect, where increased numbers of people improve the value of a product.

Second, by being on Facebook one serves as a data point for Facebook’s social media experiment, even if one encounters none of Facebook’s experimental manipulations. In doing so, one could be helping Facebook to refine its algorithms so that it can better single out specific individuals for certain purposes, some of which could be as nefarious as those of Cambridge Analytica. Consider an analogy. When testing the safety and efficacy of new drugs, subjects are randomly assigned either to an experimental group or a control group, and only subjects in the experimental group receive the new drug. Nevertheless, the subjects in the control group are essential to the experiment.

Third, using Facebook is not just an individual action but also a collective one that may be akin to failing to pay taxes. A few people failing to pay taxes might not make much of a difference to a government’s budget, but such an action may nevertheless be wrong because it is a failure to participate in a collective action that achieves a certain good end. In a similar vein, choosing to remain on Facebook might not directly undermine democratic values. But such an action could also be wrong because we might be failing to participate in a collective action (that is, leaving Facebook) that would prevent the deterioration of democracy.

So do we have an obligation to leave Facebook for others’ sake? The answer is a resounding yes for those who are intentionally spreading hate speech and fake news on Facebook. For those of us who do not engage in such objectionable behavior, it is helpful to consider whether Facebook has crossed certain moral “red lines,” entering the realm of outright wickedness.

For me at least, Facebook would have crossed a moral red line if it had, for example, intentionally sold the data of its users to Cambridge Analytica with the full knowledge that the company would use the data subversively to influence a democratic election. Likewise, Facebook would have crossed a red line if it had intentionally assisted in the dissemination of hate speech in Myanmar. But the evidence indicates that Facebook did not intend for those things to occur on its platform.

The fact that those things did occur, however, means that Facebook needs to be much more proactive in fixing such problems. Will it? The recent worrisome revelation that Facebook hired an opposition-research firm that attempted to discredit protesters by claiming that they were agents of the financier George Soros is not encouraging. While there still appears to be some daylight between Facebook and what is being done on its platform or in its name, darkness is crowding in.

That said, we should not place the responsibility to uphold democratic values entirely on Facebook. As moral agents, we should also hold ourselves responsible for our conduct, and we should be reflective about what we say, react to and share when we are on social media. Among Twitter users, a common refrain is “retweets are not endorsements.” In a similar manner, one might also think that “sharing” or “reacting to” are not “endorsements.” This is a mistake. By sharing or reacting to a post, even if one explicitly criticizes the post, one is amplifying the message of that post and signaling that the post warrants further attention.

For now I’m going to stay on Facebook. But if new information suggests that Facebook has crossed a moral red line, we will all have an obligation to opt out.

S. Matthew Liao (@smatthewliao) teaches philosophy and directs the Center for Bioethics at New York University.
