Nearly every congressional hearing on Big Tech—whether about data privacy, monopolies, or, in the case of last week’s TikTok hearing, national security—eventually features one or more lawmakers bemoaning something along the lines of, “But think of the children!”

At the recent hearing, several representatives, including Democrat Frank Pallone of New Jersey, cited research showing that TikTok pushes content harmful to children and teens. A recent research paper from the Center for Countering Digital Hate found that the platform serves content about self-harm to children and teens as often as every 2.6 minutes, and content about eating disorders every eight minutes. And the concern makes sense: TikTok is the platform of choice for many young users. A 2022 Pew Research Center study found that 67 percent of teens surveyed said they used the app, second only to YouTube.

“Without legally mandated safety by design, transparency, and accountability, the algorithm will continue to put vulnerable users at risk,” Callum Hood, head of research at the Center for Countering Digital Hate, said in a press statement. “Congress owes it to America’s parents today to get answers.”

But as TikTok CEO Shou Zi Chew noted, these are issues that nearly every major social media platform has faced in recent years. Many of the concerns echo criticisms leveled at Meta in its previous hearings, particularly around Instagram. In 2021, following the leak of the Facebook Papers by whistleblower Frances Haugen, Democratic senator Richard Blumenthal read aloud a text message from a constituent that described his 15-year-old daughter’s struggle with body image and placed the blame on Instagram. A 2022 report from Fairplay found that Instagram was rife with “pro-eating disorder bubbles,” or connected groups of accounts that promote eating disorders. The report estimated that one out of every 75 users followed at least one of these accounts. Nor are viral challenges unique to TikTok: in 2018, children across the country were posting videos of themselves eating Tide Pods on Facebook, YouTube, and other platforms.

Talking about the harms platforms can have on children often feels less like genuine concern and more like an attempt to capture attention by focusing on some of the most salient fears for American parents. Focusing on young users also provides one of the only clear avenues for bipartisan cooperation: what monster doesn’t want to make sure children are protected from exploitation and harmful content?

Yet just 24 hours before Chew sat under congressional questioning, students at Denver’s East High School fled their classrooms during yet another school shooting. Earlier this year, a pandemic-era program offering free school lunches for all children expired, reverting to an income-based system that will put up more barriers for the children who need it most. Roughly one in six children in the US lives in poverty, largely thanks to deeply entrenched economic inequality and an eroding social safety net.

A lack of gun safety laws, an unwillingness to fund education or social programs—these things affect children, yet in many cases legislation and discussion around these issues end in gridlock. And imploring legislators to “think of the children” rarely moves the needle. Where Big Tech is concerned, the emphasis on “the children” often simplifies and distracts from the thornier issues of data privacy, rampant data collection, the outsized power of certain companies to dominate smaller competitors, and the cross-border nature of extremist content and misinformation. Instead, we need to be asking deeper questions: How long should companies be able to keep data? What should it be used for? Can private companies seeking to cultivate the next generation of consumers ever be incentivized to set time limits for young users or limit access to content? How are our systems at large enabling harms?

There are ways to emphasize children’s welfare that would actually protect them, but these rarely gain traction in Congress. Representatives may worry that the experience TikTok offers young users in the US differs greatly from that of Douyin, its Chinese counterpart, yet in the five years since the Tide Pod challenge, and even in the 18 months since Frances Haugen first testified before Congress, there has been little movement on legislation to address the online harms US children face, even as children regularly feature in televised hearings. A 2021 bipartisan bill introduced by senators Edward J. Markey and Bill Cassidy would prevent tech companies from collecting the data of users between the ages of 13 and 15 and would establish a Youth Marketing and Privacy Division at the Federal Trade Commission. That bill has yet to see a vote on the Senate floor.

Every social problem—tech-based or otherwise—has adverse consequences for children. The question is, how dedicated are lawmakers to solving those issues? And how much are they simply using young people (many of whom don’t want a TikTok ban) to simplify the conversation around much more complex problems?


