Amid pressure, Meta says it's rolling out protections for teenage users

ARI SHAPIRO, HOST:

Meta, the parent company of Facebook and Instagram, says it's making those apps safer for kids. This comes after growing pressure from all sides - parents, lawmakers, former employees, you name it. And the company is also fending off lawsuits. NPR tech correspondent Dara Kerr has the details. Hi, Dara.

DARA KERR, BYLINE: Hi, Ari.

SHAPIRO: What is Meta changing about how these apps work?

KERR: So this change is one of the biggest moves Meta has made to try and make Instagram and Facebook safer for kids. The company says that over the next few months, it'll start automatically restricting various types of content on teenagers' accounts. That means posts about suicide, self-harm and eating disorders. And this is for all people under the age of 18. So say you're a teen, and one of your friends posts something about self-harm on Instagram. Meta says its filters will automatically block you from seeing that post. And teens are also not going to be able to search for that type of content. And if they do, Meta says they'll be directed to resources for help.

SHAPIRO: A lot of questions about these announcements. I guess the biggest is, will this actually make things safer for kids online?

KERR: Yeah, it is really hard to solve things like this. On the one hand, it may restrict teens from seeing certain posts. But on the other hand, it could prevent them from knowing when to reach out to a friend in trouble. And today I've heard from a bunch of child safety advocates who think these changes still don't go far enough. There's a lot of toxic content on social media, so teens may still be vulnerable to bullying and seeing posts that aren't being captured by Meta's filters.

SHAPIRO: Does Meta even know who is a teen and who's not?

KERR: Well, you're supposed to put in your birth date when you sign up, but it's pretty easy for kids to lie about their age on Facebook and Instagram. Here's psychologist Jean Twenge.

JEAN TWENGE: Their parents might have no idea just the way it's set up because you don't need parental permission. You just check a box. Check a box saying that you're 13, or, you know, you choose a different birth year, and boom. You're on.

KERR: I talked to a Meta spokeswoman about this, and she acknowledged people can misrepresent their ages on the apps. And she said Meta is investing in age verification tools and technology to try and detect when people lie about their ages.

SHAPIRO: These apps have been around for years, and Meta has been criticized for these sorts of things almost as long. Why did it take until now for them to put these policies in place?

KERR: Yeah. Well, this year has been a particular doozy for Meta. We'd need a lot more time, Ari, to get into all of it, but it's fair to say Meta has been attacked on all fronts. Parent groups have rallied on Capitol Hill, and this is even an issue that's united conservatives and liberals in Congress. A bipartisan group of senators are pushing to pass legislation called the Kids Online Safety Act, which would hold social media companies accountable for feeding teens toxic content. And also, a new Meta whistleblower came forward with more information about what goes on inside the company.

SHAPIRO: A new whistleblower. But a couple years ago, we heard from another whistleblower that Facebook was aware its products harmed kids, right?

KERR: Yeah, yeah. In 2021, an initial whistleblower came forward. And then this past November, Arturo Bejar, whose job involved protecting Meta's users, went public with new internal documents. Those showed Meta hasn't stopped its algorithms from pushing harmful content to teens. And that's led to a massive lawsuit by 40 states alleging Meta's social media products are addictive, and that has fueled a mental health crisis for teens. So Meta's announcement today, which was in a blog post, may be a way to try and reckon with all of this pressure.

SHAPIRO: Thank you. That's NPR's Dara Kerr. And if you or someone you know is in an emotional crisis, reach out to the 988 Suicide & Crisis Lifeline by dialing 988. Transcript provided by NPR, Copyright NPR.

NPR transcripts are created on a rush deadline by an NPR contractor. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.

Dara Kerr
Dara Kerr is a tech reporter for NPR. She examines the choices tech companies make and the influence they wield over our lives and society.