Plans for Instagram Kids Put on Hold as Controversy Erupts over Instagram’s Harmful Effects

Earlier this spring, Instagram announced plans to develop a social media platform for children under thirteen. Although Instagram does not allow children under thirteen to use its app, many young children lie about their age to create accounts.

Instagram has been criticized in the past over the risks its app poses to children; critics have cited the potential for bullying, predation, and body-image issues. In response, the company acknowledged a need to protect its youngest users and developed plans for this new app, Instagram Kids.

However, last month a bombshell brought Instagram Kids’ development to a halt. A former Facebook employee, Frances Haugen, released documents showing that Instagram’s parent company, Facebook, had conducted internal studies and determined that Instagram harms its users’ mental health, especially that of teenage girls, to a degree unmatched by any other social media platform. The released documents further allege that Facebook’s leadership was aware of these effects and did nothing to reduce or acknowledge the harm; the company has denied the allegations and says it has had to balance the competing desires of its users while improving its services. In response to public outcry over these findings, Facebook hastily put its plans for Instagram Kids on hold.

These revelations sparked an investigation on Capitol Hill, where furious lawmakers are searching for ways to protect Instagram users. Proposed solutions range from regulating Instagram’s algorithms to repealing Section 230, which shields Instagram from liability for harm caused by posts that its algorithms promote. Those algorithms are a large part of the trouble. Instead of showing users content in chronological order, they promote content that is more likely to elicit a reaction, such as taking a closer look at an image (a practice called “engagement-based ranking”). As a result, users can become caught in a pattern of being shown images that engage them, even if the engagement is negative. This has led to problems such as body-image issues in teenagers, who can be bombarded with edited images of extremely fit people simply because those images elicit reactions at a high rate.
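For readers who want a concrete picture of the distinction, here is a minimal sketch of a chronological feed versus an engagement-ranked one. The Post class, the predicted_engagement score, and the function names are hypothetical illustrations of the general technique, not Instagram’s actual code.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import List

@dataclass
class Post:
    author: str
    created_at: datetime
    predicted_engagement: float  # hypothetical score: how likely this viewer is to react

def chronological_feed(posts: List[Post]) -> List[Post]:
    # Newest first, with no regard for how "engaging" each post is predicted to be.
    return sorted(posts, key=lambda p: p.created_at, reverse=True)

def engagement_ranked_feed(posts: List[Post]) -> List[Post]:
    # Posts the model predicts the viewer will react to most are shown first,
    # even if the predicted reaction is negative (e.g. lingering on an
    # upsetting or heavily edited image).
    return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)

posts = [
    Post("friend", datetime(2021, 10, 1), predicted_engagement=0.2),
    Post("fitness_influencer", datetime(2021, 9, 1), predicted_engagement=0.9),
]
# Chronological: the friend's newer post comes first.
# Engagement-ranked: the older but more "engaging" fitness post comes first.
```

The point of the sketch is simply that ranking by a reaction score, rather than by recency, is what lets certain kinds of content dominate a user’s feed.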

Instagram’s business model has made great use of these algorithms to make the app one of the most popular social media platforms. Because the app is free to use, Instagram makes money by monetizing its users’ data, chiefly through targeted advertising. Thus, maximizing the number of interactions users have with content gives Instagram more valuable data to work with.

The current problem is that any proposed action runs afoul of one party or the other’s broader agenda for social media platforms. This scandal comes at a time when those platforms are under intense pressure from both sides: the left wants a crackdown on misinformation, while the right wants to protect free speech and limit Big Tech’s power to censor. Both sides can agree that protecting children from mental health crises is a worthy goal, but any proposal will favor one side’s other goals for social media. For example, eliminating Section 230 would favor the Democrats’ plans, as it would give platforms an incentive to remove any content that might be harmful, including the kind of content that Republicans want to protect.

So what does this mean for Instagram Kids? The same algorithms that can cause mental health problems also make Instagram fun and addictive. Despite strong opposition from parents and lawmakers, Instagram has every incentive to use similar algorithms to collect more valuable data from younger users.

Unless public backlash or Congressional action severely limits Instagram’s ability to profit off of children’s use of social media, it is likely that Instagram Kids will move forward. This most recent revelation should alert parents to the potential harm the platform poses to children and teenagers.

The Children’s Online Privacy Protection Act imposes additional requirements on online platforms that collect personal information from users under thirteen; one of those requirements is that they obtain verifiable parental consent before collecting any personal information. Parents therefore still have the power to protect their children from some of the harmful effects of social media, regardless of how Congress reacts to this most recent scandal. No matter what regulations or laws are in place, Instagram will always have an incentive to profit off its users, regardless of its service’s effects on their mental health. Parents, with full knowledge of the risks, will have to decide to what extent they are willing to let their children use Instagram’s service.

John Gray

John Gray was born in Boston, Massachusetts. He came to UNC Chapel Hill for college and liked it so much he stuck around for law school. In his spare time, he likes to research the history of Roman Law.