
US social media laws to protect children create challenges for platforms


Social media platforms are struggling to navigate a patchwork of US state laws that require them to verify the age of users and give parents more control over their children’s accounts.

States including Utah and Arkansas have approved legislation governing children’s use of social media in recent weeks, and similar proposals have been introduced in other states, such as Louisiana, Texas and Ohio. The legislative efforts are designed to address fears, heightened by the rise in teen suicides in the United States, that online platforms are harming the mental health and well-being of children and adolescents.

But critics, including the platforms themselves, as well as some child advocacy groups, argue that the measures are poorly drafted and fragmented, potentially leading to a host of unintended consequences.

A senior staffer at a major tech company who drives its state legislative policy described the patchwork of proposals as “nightmarish [and] senseless, if not Kafkaesque”.

“Being able to prepare for this with confidence is a Herculean task,” said the person, describing it as an “engineering lift”. The person added that the company’s legal teams were studying how to interpret the various rules and the associated risks.

There is a growing body of research linking heavy use of social media by children and adolescents to mental health issues, prompting calls to better protect children from toxic content.

Republican Utah State Representative Jordan Teuscher, who was the state bill’s House sponsor, said it was created in response to a series of studies showing “some really devastating effects of social media on teenagers.”

“We strongly believe that parents know best how to take care of their children. It was parents who came to us saying ‘I need help,’” he said of the decision to introduce the legislation, which is expected to take effect in March 2024.

Utah’s law requires social media platforms to verify the age of all users in the state and then obtain parental consent before allowing anyone under the age of 18 to open an account. Additionally, platforms must grant parents access to those accounts and cannot serve advertising or targeted content to minors.

Governments and regulators around the world are racing to introduce legislation, with the UK’s Online Safety Bill and the EU’s Digital Services Act obliging social media companies to protect children from harmful content.

In the US, a new federal proposal, the Kids Online Safety Act, was introduced by US Senators Marsha Blackburn, a Republican, and Richard Blumenthal, a Democrat, which would impose a duty of care on platforms to keep children safe. Earlier this year, Republican Sen. Josh Hawley also introduced a bill that would impose a minimum age requirement of 16 for social media users.

Social media platforms and pundits agree that federal laws would be most effective in enforcing a uniform standard nationwide. But in the meantime, the smattering of emerging state laws has forced platforms to scramble to adapt.

States that have taken action on the issue have diverged in “two lanes,” said Zamaan Qureshi, co-chair of a youth coalition advocating safer social media for young people. In one, several Democratic-led states, such as California, have focused on regulation that aims to “force tech companies to make design changes to their products to better protect minors,” he said. In the other, more Republican states have focused on the role of parents.

A common theme among Republican state legislative efforts is the requirement for platforms to perform age verification for all users. This also paves the way for a second requirement in some states for platforms to obtain consent from a parent or guardian before allowing children under 18 to access their apps and, in some cases, to allow those parents to access their children’s accounts.

Given the lack of specificity in how the provisions are drafted, platforms have been puzzling over how to collect parental consent, according to multiple people familiar with the matter, weighing whether this could be a simple check-box exercise or would require companies to collect a copy of a birth certificate, for example.

Academics and advocacy groups have also raised free speech concerns, as well as questions about the privacy of the very children the laws are designed to protect. And some state rules could leave LGBT+ children whose families don’t support them particularly vulnerable, Qureshi warned.

“What an active parent means is very different for every child or every young person,” he said.

The age verification mandate poses some big challenges for companies. Age checks, which typically involve asking for ID or estimating age via face-scanning technology, could result in underage users being removed from platforms, in turn affecting ad revenue. While ID checks are the primary method of verification, critics warn that not all minors have access to official identification, and age estimation remains an inexact science.

For example, Arkansas, whose legislation goes into effect in September, has ordered platforms to use third parties to verify age, raising questions about whether there are enough tools to handle demand.

Yoti, a small UK provider of age-verification tech, is already being used by Meta’s Instagram and Facebook Dating, the company said. TikTok is also considering using the technology, according to two people familiar with the matter. One of the largest companies offering age-verification technology is MindGeek, which owns pornography sites Pornhub and RedTube, according to two tech industry staffers.

Meanwhile, social media platforms including Meta and Snap have begun touting the idea that age verification should be managed by the app stores where they’re downloaded or at the device level, such as on an Apple iPhone.

Meta said it has already developed more than 30 tools for teens and families, including parental supervision tools. “We will continue to evaluate the proposed legislation and work with policymakers on these important issues,” a spokesperson said.

Snap, which has also developed parental controls, said it is in discussions with industry peers, regulators and third parties about how to address the age verification challenge. TikTok said it believed “industry-wide collaboration” was needed to address the issue.

However, some children’s advocacy groups argue that the focus of the legislation is misplaced. “The theme is to put it on parents and give more rights to more parents... It’s saying platforms don’t have to change,” said Josh Golin, executive director of the nonprofit Fairplay. “Actually, what we think we should be focusing on is making platforms safer and less exploitative of children.”
