You may have heard about the recent reports of a whistleblower who left Facebook and secretly copied 10,000+ internal documents on her way out.
Her name is Frances Haugen, and she was a data scientist assigned to Facebook’s Civic Integrity unit, which was responsible for monitoring issues surrounding the 2020 election.
In the 60 Minutes interview, she revealed that the company dissolved the unit after the election. She described it as basically going like this:
“Well, the election is over and nothing major happened, so we don’t need this department anymore.” (Not her words.)
The biggest bombshell she revealed was the extent to which Facebook knew that what it was doing was harming people.
New data has come out showing that Instagram use is directly tied to suicidal thoughts, and even actual suicides, among teenagers.
Yes, there has been talk about this in the past, but the new data sharpens the sword and plunges it deeper.
Children are dying because of social media.
This is not OK.
NYU business professor, entrepreneur, and podcast host Scott Galloway argues that Instagram needs to be regulated immediately.
His reasoning: we already regulate other products that are harmful to teenagers, like cigarettes and vaping products, so why don't we do the same for Instagram?
I wholeheartedly agree.
In 2017 and 2018, I worked at a pediatrics practice during my second clinical placement while completing my Master of Social Work degree.
One of the most common issues across the entire practice was the mental health challenges teenage girls were facing. It came up weekly.
The most common challenges these teenagers were reporting were depression and anxiety.
And one of the reasons almost always provided when these girls were asked about their depression and anxiety?
Instagram, the content they saw there, and the bullying they dealt with on the platform.
You may still not be convinced that this is a major issue worth government regulation.
I would argue that any product that drives large numbers of people toward suicide is worth addressing, but let's explore this a bit further.
Bad content doesn’t just affect teenagers.
Facebook has known that they have a content problem for years.
They know how much hateful content is on their platform, and, according to the 60 Minutes interview with the courageous whistleblower, they know they act on only a small percentage of it.
And they’re OK with that.
But hateful content isn’t just inconvenient. It actually costs lives.
Read this if you can stomach it:
They posed as fans of pop stars and national heroes as they flooded Facebook with their hatred. One said Islam was a global threat to Buddhism. Another shared a false story about the rape of a Buddhist woman by a Muslim man.
The Facebook posts were not from everyday internet users. Instead, they were from Myanmar military personnel who turned the social network into a tool for ethnic cleansing, according to former military officials, researchers and civilian officials in the country.
Members of the Myanmar military were the prime operatives behind a systematic campaign on Facebook that stretched back half a decade and that targeted the country’s mostly Muslim Rohingya minority group, the people said. The military exploited Facebook’s wide reach in Myanmar, where it is so broadly used that many of the country’s 18 million internet users confuse the Silicon Valley social media platform with the internet. Human rights groups blame the anti-Rohingya propaganda for inciting murders and the largest forced human migration in recent history.
It is disgusting that Facebook knows that this happens on their platform and does next to nothing to deal with it.
We’re not talking about fake news. We’re talking about genocide.
Unfortunately, Facebook has a history of only acting when forced to do so.
If you use Facebook’s behavior to determine their intent, their goal is to grow as fast as possible so that they can make as much money as possible.
But growth at all costs is not progress.
This is why we need to do something about this.
And this is why you are not safe on Facebook, or on Instagram, the other platform Facebook owns.