When whistleblower Frances Haugen pulled back the curtain on Facebook last fall, thousands of pages of internal documents showed troubling signs that the social media giant knew its platforms could be negatively impacting youth and was doing little to change that. With around 21 million American adolescents on social media, parents took note.
Today, there are more than 1,200 families pursuing lawsuits against social media companies including TikTok, Snapchat, YouTube, Roblox and Meta, the parent company to Instagram and Facebook.
More than 150 lawsuits will be moving forward next year. Tonight, you’ll hear from some of the families suing social media. We want to warn you that some of the content in this story is alarming, but we thought it was important to include because parents say the posts impacted their kids’ mental health and, in some cases, helped lead to the death of their children.
Kathleen Spence: They’re holding our children hostage and they’re seeking and preying on them.
Sharyn Alfonsi: Preying on them?
Kathleen Spence: Yes.
The Spence family is suing social media giant Meta. Kathleen and Jeff Spence say Instagram led their daughter Alexis into depression and to an eating disorder at the age of 12.
Kathleen Spence: We realized that we were slowly losing her. We really had no comprehension to how severe social media had affected our daughter. She was being drawn into this hidden space and this dark world.
It began after the Spences, both middle school teachers from Long Island, New York, gave 11-year-old Alexis a cell phone to keep in touch with them after school.
Kathleen Spence: We had very strict rules from the moment she had the phone. The phone was never allowed in the room at night. We would keep the phone in the hall.
Jeff Spence: We checked the phone. We put restrictions on the phone.
Alexis Spence: I would wait for my parents to fall asleep, and then I would just sit in the hallway or I would sneak my phone in my room. I wasn’t allowed to use a lot of apps and they had a lot of the parental controls on.
Sharyn Alfonsi: And so how quickly did you figure out a way around the restrictions?
Alexis Spence: Pretty quickly.
Hoping to connect and keep up with friends, Alexis joined Instagram. Instagram policy requires users to be at least 13 years old. Alexis was 11.
Sharyn Alfonsi: I thought you had to be 13?
Alexis Spence: It asks you, “Are you 13 years or older?” I checked the box “yes” and then just kept going.
Sharyn Alfonsi: And there was never any checks?
Alexis Spence: No. No verification or anything like that.
Sharyn Alfonsi: If I had picked up your phone would I have seen the Instagram app on there?
Alexis Spence: No. There were apps that you could use to disguise it as another app. So, you could download what looks like a calculator, but it's really Instagram.
Jeff Spence: There was always some work-around.
Sharyn Alfonsi: She was outwitting you.
Jeff Spence: Right, she was outwitting us.
Kathleen Spence: She was addicted to social media. We couldn’t stop it. It was much bigger than us.
Now 20, Alexis says an innocent search on Instagram for fitness routines led her into a dark world.
Alexis Spence: It started as, like, fitness stuff. And then I guess that would spark the algorithm to show me diets, it then started to shift into eating disorders.
Sharyn Alfonsi: What were you seeing?
Alexis Spence: People would post photos of themselves who are very sickly or just very thin, and using them to promote eating disorders.
These are some of the images that were sent to Alexis through Instagram’s algorithms – which process the user’s browsing history and personal data, then push content to them they never directly asked for.
Sharyn Alfonsi: What did you learn from looking at these pro-anorexic websites?
Alexis Spence: A lot. Learning about diet pills and how to lose weight when you’re 11 and going through puberty and, like, your body is supposed to be changing. It’s hard.
Sharyn Alfonsi: When did that stop being something that you looked at and start being something that you were doing to yourself?
Alexis Spence: Within months.
Sharyn Alfonsi: Did it normalize it for you? Did you think, “Oh, well, other people are doing this”?
Alexis Spence: Yeah. Definitely. Like, they needed help. I needed help. And instead of getting help, I was getting advice on how to continue.
By the time she was 12, Alexis had developed an eating disorder. She had multiple Instagram accounts and says she would spend five hours a day scrolling through the app, even though it often made her feel depressed.
She drew this picture of herself in her diary crying, surrounded by her phone and laptop, with thoughts reading, ‘stupid, fat…kill yourself.’
Alexis Spence: I was struggling with my mental health. I was struggling with my depression and my body image. And social media did not help with my confidence. And, if anything, it made me, like, hate myself.
It all came to a head her sophomore year when Alexis posted on Instagram that she didn’t deserve to exist. A friend alerted a school counselor.
Kathleen Spence: That was the scariest day of our lives. I got a call to come to the school. And I went there and they were just showing me all of these Instagram posts of how Alexis wanted to kill herself and hurt herself. And if Instagram is really — has all the software to protect them, why was that not flagged? Why was that not identified?
This previously unpublished internal document reveals Facebook knew Instagram was pushing girls to dangerous content.
It says that in 2021, an Instagram employee ran an internal investigation on eating disorders by opening up a false account as a 13-year-old girl looking for diet tips. She was led to content and recommendations to follow ‘skinny binge’ and ‘apple core anorexic.’
Other memos show Facebook employees raising concerns about company research that revealed Instagram made 1-in-3 teen girls feel worse about their bodies and that teens who used the app felt higher rates of anxiety and depression.
Sharyn Alfonsi: What was it like when you saw those Facebook papers for the first time?
Kathleen Spence: Sickening. The fact that I was sitting there, struggling and hoping to save my daughter’s life. And they had all these documents behind closed doors that they could’ve protected her. And they chose to ignore that research.
Attorney Matt Bergman represents the Spence family. He started the Social Media Victims Law Center after reading the Facebook papers and is now working with more than 1,200 families who are pursuing lawsuits against social media companies like Meta.
Matt Bergman: Time and time again, when they have an opportunity to choose between safety of our kids and profits, they always choose profits.
Next year, Bergman and his team will start the discovery process for the federal case against Meta and other social media companies, a multi-million dollar suit that he says is more about changing policy than financial compensation.
Bergman spent 25 years as a product liability attorney specializing in asbestos and mesothelioma cases. He argues the design of social media platforms is ultimately hurting kids.
Matt Bergman: They have intentionally designed a product that is addictive. They understand that if children stay online, they make more money. It doesn’t matter how harmful the material is.
Sharyn Alfonsi: So the fact that these kids ended up seeing the things that they saw, that were so disturbing, wasn’t by accident; it was by design?
Matt Bergman: Absolutely. This is not a coincidence.
Sharyn Alfonsi: Isn’t it the parents’ job to monitor this stuff?
Matt Bergman: Well, of course it is. I’m all for parental responsibility. But these products are explicitly designed to evade parental authority.
Sharyn Alfonsi: So what needs to be done?
Matt Bergman: Number one is age and identity verification. You know, that technology exists. You know, if people are trying to hook up on Tinder there’s technology to make sure that the people are who they say they are. Number two would be turn off the algorithms. You know, there’s no reason why Alexis Spence, who was interested in exercise, should have been directed toward anorexic content. Number three would be warnings so that parents know what’s going on. Let’s be realistic, you’re never gonna have social media platforms be 100% safe. But, you know, these changes would make them safer.
Right now, the Roberts family says social media is not safe for kids. Englyn Roberts was the baby in a large family, the center of her parents Toney and Brandy’s world.
Toney Roberts: She made every day…
Brandy Roberts: Special.
Toney Roberts: Every day felt like Christmas here.
But Englyn’s life online told a different story. As the pandemic played out, Englyn wrote about struggles with self-worth, relationships and mental health.
One August night in 2020, just a few hours after Toney and Brandy kissed their 14-year-old smiling daughter goodnight, Brandy received a text from a parent of one of Englyn’s friends who was worried about Englyn and suggested they check on her.
Toney Roberts: We went upstairs, and we checked, and her door was locked. That was kinda odd, so I took the key from the top and we opened the door and no Englyn. And when I turned around that’s when I found her. When you find your child hanging, and you are in that moment in disbelief. It’s just no way. Not our baby. Not our child. And then ultimately, I fault myself.
Sharyn Alfonsi: Why do you fault yourself?
Toney Roberts: Because I’m dad. I’m supposed to know.
Sharyn Alfonsi: Prior to that night you had no idea that she was depressed?
Toney Roberts: No. Not even close.
Like the Spence family, Toney Roberts started connecting the dots after the Facebook papers came out and began digging through his daughter’s phone for answers. He found an Instagram post sent to Englyn from a friend.
Toney Roberts: There was a video. And that video was a lady on Instagram pretending to hang herself, and that’s ultimately what our child did. Cause, you ask yourself, how did she come up with this idea? And then when I did the research, there it was. She saw it on Instagram. It was on her phone.
Brandy Roberts: If that video wasn’t sent to her, because she copied it, she wouldn’t have had a way of knowing how to do that certain way of hanging yourself.
Nearly a year and a half after Englyn’s death, that hanging video was still circulating on Instagram, with at least 1,500 views. Toney Roberts says it was taken down in December 2021. The Roberts are suing Meta, the parent company to Instagram.
Toney Roberts: If they so call monitor and do things, how could it stay on that site? Because part of their policies says they don’t allow for self-harm photos, videos, things of that nature. So, who’s holding them accountable?
Meta declined our request for an interview, but its global head of safety gave us this statement, telling us, "we want teens to be safe online," that Instagram doesn't "allow content promoting self-harm or eating disorders," and that Meta has improved Instagram's "age verification technology."
But when 60 Minutes ran this test two months ago, our colleague was able to lie about her age and sign up for Instagram as a 13-year-old with no verification. We were also able to search for "skinny" and other harmful content. And while a prompt came up asking if we wanted help, we instead clicked "see posts" and easily found content promoting anorexia and self-harm, showing more rigorous change is needed, a challenge the Spence and Roberts families are ready for.
Kathleen Spence: We’re being gaslighted by the big tech companies that it’s our fault. When really what we should be doing as parents is banding together and say, “No. You need to do better. I’m doing everything I can. You need to do better.”
Brandy Roberts: We’ve lost, we’ve learned, but what’s gonna stop these companies from continuing to let things happen if they don’t change or be forced to make a change?
Toney Roberts: Social media is the silent killer for our children's generation. That's the conclusion I've come to. Why is everyone in power that can help change this, why is it not changing quickly enough? If our children are truly our future, what's the wait?
RESPONSES FROM META, SNAPCHAT, AND TIKTOK
Statement from Meta
“We want teens to be safe online. We’ve developed more than 30 tools to support teens and families, including supervision tools that let parents limit the amount of time their teens spend on Instagram, and age verification technology that helps teens have age-appropriate experiences. We automatically set teens’ accounts to private when they join Instagram, and we send notifications encouraging them to take regular breaks. We don’t allow content that promotes suicide, self-harm or eating disorders, and of the content we remove or take action on, we identify over 99% of it before it’s reported to us. We’ll continue to work closely with experts, policymakers and parents on these important issues.” – Antigone Davis, Vice President, Global Head of Safety, Meta
Statement from Snapchat Global Head of Platform Safety Jacqueline Beauchere
“The loss of a family member is devastating, and our hearts go out to people facing these tragedies, no matter the circumstances. We designed Snapchat to be different from traditional social media, built around visual messaging between real friends and avoiding the most toxic features that encourage social comparison and can take a toll on mental health. We know that friendships are a critical source of support for young people, especially when dealing with mental health challenges, and we continue to work with leading experts on in-app tools and resources to support our community – especially those who may be struggling.”
TikTok only provided background information and declined to provide a statement in response to our story.
Produced by Ashley Velie. Associate producers, Jennifer Dozor and Elizabeth Germino. Edited by April Wilson.