Facebook Isn’t Toxic, We Are.

Rocio Flores
12 min read · Nov 9, 2020

This is written for the person “triggered” by this title.

We are toxic.
I am toxic.
You are toxic.

You may not agree and you will probably read this defensively, and you should.

Photo by visuals on Unsplash

Lately, there has been a lot of talk about the role social media has played in shaping human nature and the course our society is taking as a result of this technology. We are constantly “discussing” these effects in books, documentaries, commentary channels, and by screaming at each other ON social media. From documentaries like The Social Dilemma to your uncle’s post about the liberal media, we all seem to be pretty clear on one thing: Facebook is toxic.

To be completely fair, there are SO MANY issues to tackle in this conversation that it’s almost too much to try to list them all: censorship, rigged elections, extremist content, algorithm bubbles, fake news, targeting, advertising, privacy, parasocial relationships, grooming… it’s enough to make you feel justified in blaming this giant technological corporation!

We are toxic, yet the vast majority of the conversation ignores our human fallibility and pins the blame on technology. To be fair, technology has been the driving factor in speeding up the consequences of this issue, but the technology was created by humans, for humans.

While all these problems are occurring ON Facebook… they are HUMAN problems that can happen anywhere there are humans. While we are collectively witnessing the horrors of life ON social media, very little of what is happening is new. Social media is a telescope: it lets us observe what has already BEEN taking place, and it is a tool that has led us to question our assumptions about how we interpreted the information in the first place. What was once just a tiny dot in our dark sky is now a debate about the composition of gas giants we can’t seem to agree exist.

Social media is our human behavior and thought extended to technology, and its exploitation is an extension of the same power struggle that continues even after we log off or uninstall Facebook.

(Don’t get me started on the mass exodus to other platforms after the 2020 Election that is currently taking place.)

Here’s my disclaimer: This post isn’t aimed at the victims of social media. While the sin of humanity is collective in the sense that none of us exist in a bubble separate from the responsibility of humanity, our individual experiences are VALID. If individuals were not harmed by social media then the harms of social media would not be an issue to discuss and any conversations about relying solely on personal responsibility vs. recognizing our collective responsibility are moot when the collective damage done can harm an individual who doesn’t even own a computer. The reality is that even people who are not on Facebook or any social media can be affected by those of us that ARE. YOUR experience of being harmed BY the human or the technological EFFECTS of social media matters. This post isn’t aimed at shaming any of us on an individual scale for our individual actions.

The conversation about social media has mostly focused on the features of the technology and the VERY REAL, still-unaddressed issues of privacy, data collection, algorithms, AI, advertising, addiction, and the use of human psychology to create more technology that INTENTIONALLY preys on human behavior. Facebook employees and big social media creators have often admitted to intentionally creating features and products that purposely affect human behavior and increase the toxic nature of the websites. This isn’t a secret anymore. It isn’t shocking either. We have known Facebook and other social media sites are “toxic” for a while now, and yet billions of people still use them because, at the end of the day, it’s not the site that is the issue… it’s us.

Facebook requires human behavior.

We are not afraid of artificial intelligence and robots because of their thoughts, emotions, or behavior… the terror comes from ours. There are real fears, outside of sci-fi movies, about the expansion of artificial intelligence because we know that human bias plays a role in its development, and that the data these programs continue to learn from is also biased. The realistic fear that AI will deepen a racial or class divide is far more human than the fear of robots taking over the planet. We are not born into voids. We can delete our Facebook accounts, shut down the internet, or retreat into a world devoid of any technology, but we are still human. We are born into worlds that use technology, worlds where technology is becoming more and more incorporated into our human behavior, but Facebook is no more the cause of human downfall than the advent of agriculture or gunpowder.

We are toxic and our lack of humanity has always been the poison.

So what do we do?

Some of us may jump into the extremes of eugenics or uninstalling Facebook. Others have already spent a significant amount of time delineating very specific evidence-based steps we can take ranging from personal decisions to authoritarian restrictions. Some of us have chosen to capitalize on what we know about social media and some of us have become complacent and apathetic.

These are all incredibly human responses. We can replace the word Facebook, Social Media, or Technology with guns, farms, meat, war, murder, crime, etc…

Why do all the answers deny our humanity?

Our human response always ranges between complete human annihilation and dismissal of our human tendencies. In between, there are always the people doing the work: figuring out the patterns, adding to collective knowledge, and seeking small but powerful changes to keep us all going. It is usually these people who are ignored while we keep fighting about how to best move forward.

For a species that differentiates itself by its conscious awareness, we tend to throw that out the window when we are challenged by its effects.

We are aware of a problem. Facebook is toxic.

The first step in fixing a problem is acknowledging that it exists, and we have acknowledged that Facebook is a tool that contributes to more and more societal problems. The work of understanding how this tool has been hurting instead of helping us has been done; now we have to do the work of acknowledging how we are the main drivers of this issue.

Facebook and other social media websites are not self-contained entities. The main attraction of a social media website is the user-generated content that keeps it running without any one individual being solely responsible for its continuation. Facebook is just a very, very well-curated hub of human activity. So we have to be willing to take a step back, look at human activity, understand it, learn from it, and then use that information to make changes that make sense for HUMANS.

If we don’t change human systems, we can’t change technological systems.

Many of our attempts to rein in websites like Facebook amount to subjecting them to authoritarian rule: demanding that our government enact rules about how these websites, and the technology they pioneered, can or can’t be used. Whether or not you agree with this approach is really just another symptom of the issue itself. We can’t agree on how the government should control Facebook if we still don’t agree on how a government should control anything. One look at any history book will show that, as humans, we have not worked out our opinions about authority. (Or we can just look at last week.)

It is incredibly human to not agree about how to be ruled by other humans.

Another way we have tried to tackle the social media issue is by logging off, or by creating public pressure for the sites to change in the ways WE deem appropriate. Some of us have stayed logged on and some of us have sworn off the internet. Some changes have actually been put into place; Facebook’s founders have made changes specifically with this in mind. They know their websites don’t work if people aren’t on them, and we know that Facebook works the way we use it. The reality is… we are affected by how people use social media even if we are logged off, and social media is affected by human behavior outside of the site itself. The fact that Facebook has played a big role in U.S. elections, and that so much of the content on Facebook comes from non-social-media websites, is proof that Facebook is just a hub of human behavior.

It is incredibly human to be affected by other humans’ behavior.

At the end of the day, whether Facebook keeps ruling the world or other sites take over… the concept of a hub of human activity, where humans engage and share their humanity with each other, will never stop existing. We are social creatures, and both our natural and supernatural lenses on this world give credence to humans needing interaction with other humans for physical, mental, and spiritual health. We will find new ways to connect and engage with each other, even if it means going to other websites or creating our own.

It is incredibly human to need to connect socially.

There’s a good chance that social media is not going anywhere. The internet, at least, is not going anywhere. We really only leave things in the past when we have new technology to take their place, and it’s hard to imagine what a new world of technology could even mean beyond our current understanding and experience. Until we can fathom a realistic way of connecting that goes beyond tangible or technological means, “Facebook” is here to stay. The problems that sites like Facebook have given us, or made us aware of, still need to be addressed.

1. As long as we have human interaction, we will have systems to categorize humans. When we have systems of categorization, we end up with systems of supremacy. It is human to categorize, and this categorization leads to assumptions and biases, which can then lead to abusive and harmful ideologies about the categories of people we begin to deem inferior. We have to tackle supremacist ideologies and how they affect human behavior. We have to talk about the systems we allow to continue, on and offline. As long as we are unwilling to admit that we ALL categorize, we will keep allowing our implicit biases to affect our interactions, on and offline.

Our collective data is worth money because it is valuable. If collecting information about the categories we fit into meant nothing, data would not be commodified. We would not see eerily relevant ads, or fear that the information collected from us could lead to privacy and security threats, if our data weren’t deeply tied to real assumptions and real biases that we as humans hold.

By learning how we categorize ourselves and others, and how these categories translate into information, we can then put that information to use ourselves.

We have to be honest about our human tendency to categorize.

2. As long as we have information, that information will be affected by our human understanding and will be used by humans with human intentions. We have to change how we interact with information and recognize the ways that we, as humans, use it. Information falls into several categories; for the purposes of this article, I will put them into 3: Facts, Opinions, and Questions. We either know and all agree on the truth of a piece of information and call it a fact, or we disagree and therefore call it an opinion. When we are in the process of theorizing about it, we are asking questions. By acknowledging that any new information we are presented with can fall into one of these categories… we can understand how information is then used against us.

One of the biggest complaints against social media is the use of algorithms. Algorithms choose what information is presented to us. When we interact with that information, that interaction becomes a category of information that is then used to continue to customize the information we continue to be presented with. This cycle is vicious, accurate, and powerful but it’s not new. Propaganda has been used by governments and interest groups for ages. The understanding of human psychology has been used against humans for ages. Indoctrination has been occurring to individuals and groups for ages.
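The cycle described above — algorithms show us content, our reactions become new data, and that data narrows what we see next — can be sketched in a few lines of deliberately toy Python. This is a hypothetical illustration, not Facebook’s actual ranking logic: it just shows how a small initial bias toward one topic, fed back into an engagement-based ranker, quickly crowds out everything else.

```python
import random

def recommend(catalog, engagement_counts, k=3):
    """Rank items by how often the user engaged with each topic; return the top-k feed."""
    ranked = sorted(catalog,
                    key=lambda item: engagement_counts.get(item["topic"], 0),
                    reverse=True)
    return ranked[:k]

def simulate(rounds=10):
    """Toy feedback loop: each click updates the counts that drive the next feed."""
    random.seed(0)  # deterministic for illustration
    topics = ["politics", "sports", "cooking"]
    catalog = [{"id": i, "topic": topics[i % 3]} for i in range(15)]
    engagement = {t: 1 for t in topics}  # start with equal engagement
    for _ in range(rounds):
        feed = recommend(catalog, engagement)
        for item in feed:
            # The user has a slight preference for "politics" — the only bias we inject.
            click_prob = 0.9 if item["topic"] == "politics" else 0.3
            if random.random() < click_prob:
                engagement[item["topic"]] += 1
    return engagement
```

Running `simulate()` shows the “politics” count pulling far ahead of the others: once it edges into the lead, the ranker fills the whole feed with it, so only it can be clicked. The vicious part is that the loop is working exactly as designed.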

Information is powerful and it is an incredibly human experience to have it used against us.

3. We have a choice. The reason we throw ourselves at extreme solutions is that we recognize we have a choice to make… we just don’t always feel like we have many choices, or we don’t like the ones we have. We either feel like this problem is too big or that humanity is just too far gone. We are rocked into complacency by the sheer magnitude these issues tend to claim. If Facebook is this all-encompassing virus that is ultimately leading us toward human destruction, then… does it really matter what I do online? Or if the ONLY way to ever save humanity is to not engage with anything technological, then how can I force the rest of humanity to go along with my off-grid fantasy? How frustrating it is to know that we have these very real problems happening and yet the majority of Facebook is just regular people sharing memes, not seeing the toxicity of the situation… not seeing the toxicity of their own actions.

It is incredibly human to be frustrated by a lack of control. Fortunately, it is also incredibly human to seek out choices and take control.

Our choices when it comes to Facebook and social media can range from the individual to the collective but MUST stem from first understanding and placing humanity at the CENTER of any decision.

  • We can set limits on our online usage and how and who we interact with online while understanding that others probably won’t take these steps.
  • We can change our alert and privacy settings while understanding that the settings have been created with the specific purpose of making our apps more addictive and personalized, which makes sticking with these boundaries difficult.
  • We can hijack the algorithm by purposely seeking out contradicting information while recognizing that polarized positions are not always equally valid or indicative of the true nature of an issue as it is presented online.
  • We can purposely follow and unfollow accounts to tailor the information we are presented with, in order to take control of the effect the messaging can have on our personal and relational health, while recognizing that others are still affected by the messaging we are no longer being presented.
  • We can recognize that we can look outside of social media for information and seek out facts while recognizing that we all have different levels of privilege and access to informational and educational resources.
  • We can take charge and actively seek to share our message and present information to others while understanding that not everyone will be open to the information or even see it.
  • We can be an active force to make a change outside of social media to advocate for those affected by messaging that began online while taking into consideration that this advocacy may inspire or be viewed completely differently online.
  • We can create strong relationships both online and offline while recognizing that supremacist ideologies run deep in human behavior and need to be addressed both online and offline.
  • We can advocate for regulations on evidence-based changes while recognizing the deep mistrust humans have with authority and how many will not easily trust the changes.
  • We can be critical of changes that are not based on evidence, that ignore human behavior, and that are performative and don’t lead to substantial changes while recognizing that people may not be aware of this and may genuinely feel that they are making a difference.
  • We can hold people accountable while recognizing that people can change and we can create systems to praise change while not forgoing accountability.
  • We can see past our personal experiences and recognize the collective experience and address those patterns before seeking authoritarian means of change while recognizing that our personal experiences still matter and will hold a high level of implicit bias that may affect our willingness to change these patterns.
  • We can admit when individual responsibility is not enough and we need to advocate for collective change while still taking on personal responsibility and not expecting collective changes.

The possibilities are endless.

Whether we log on to Facebook today and look at the screens differently or we choose not to log on and just look at each other differently…we have a choice. We are human and at the end of the day, with or without Facebook, we still have to figure out how to engage with each other in ways that are not toxic and that first require that we admit when we are toxic. Facebook doesn’t matter, but we do. Our friends, family, co-workers, citizens…matter.

Humanity matters.

--


Rocio Flores

I’m a self-development coach at rociioflores.com. I like to share my thoughts on Facebook, where I annoy family members with my rants. So they are here now 🎉