Friday, March 24, 2023

NOTHING TO DO WITH NATIONAL SECURITY
Utah governor signs bill to curb children’s social media use

The sweeping restrictions aim to curtail kid and teen use of social media apps such as Instagram and TikTok


By Naomi Nix, Cat Zakrzewski and Heather Kelly
WASHINGTON POST
Updated March 24, 2023 

Utah Gov. Spencer Cox (R) signed two bills into law Thursday that would impose sweeping restrictions on kid and teen use of social media apps such as Instagram and TikTok — a move proponents say will protect youth from the detrimental effects of internet platforms.

One law aims to force social media companies to verify the ages of users who are Utah residents. The law also requires platforms to obtain parental consent before letting minors use their services, and guardians must be given access to their child’s account. A default curfew must also be set.

The Utah regulations are some of the most aggressive laws passed by any state to curb the use of social media by young people, at a time when experts have been raising alarm bells about worsening mental health among American adolescents. Congress has struggled to pass stricter bills on online child safety despite bipartisan concern about the effects social media has on kids.

The two bills previously passed the Utah Legislature.

“We’re no longer willing to let social media companies continue to harm the mental health of our youth,” Cox tweeted Thursday. “Utah’s leading the way in holding social media companies accountable — and we’re not slowing down anytime soon.”

The signing coincided with TikTok CEO Shou Zi Chew’s first appearance before Congress, during which he faced extensive grilling by lawmakers who say they are worried that the extraordinarily popular video app is hurting the welfare of children. They also said the company represented a national security threat because it is owned by Beijing-based ByteDance.

Tech companies have been facing increasing scrutiny by lawmakers and advocates over the effect of their services on adolescents. Last year, California state lawmakers passed the California Age-Appropriate Design Code Act, which requires digital platforms to vet whether new products may pose harm to minors and to offer privacy guardrails to younger users by default. But the tech industry group NetChoice sued to block the law, arguing that it violates the First Amendment and that tech companies have the right under the Constitution to make “editorial decisions” about what content they publish or remove.

Efforts to bolster federal rules governing how tech companies handle minors’ data and protect their mental and physical safety have stalled. Late last year, Senate lawmakers pushed to attach new online privacy and safety protections for children to an omnibus spending package, but the provisions did not make it into the final bill.

Under the new Utah measures, tech companies must block children’s access to social media apps between 10:30 p.m. and 6:30 a.m., although parents would be allowed to adjust those limits. The platforms also must prohibit direct messaging by anyone the child hasn’t followed or friended, and they must block underage accounts from search results.

The Utah restrictions additionally bar companies from collecting children’s data and targeting their accounts with advertising. The effort also attempts to prohibit tech companies from designing features in their services that would lead to social media addiction among kids.

Privacy advocates say the bills go too far, and could put LGBTQ children or kids living in abusive homes at risk.

“These bills radically undermine the constitutional and human rights of young people in Utah, but they also just don’t really make any sense,” said Evan Greer, director of digital advocacy group Fight for the Future. “I’m not sure anyone has actually thought about how any of this will work in practice. How will a tech company determine whether someone is someone else’s parent or legal guardian? What about in situations where there is a custody battle or allegations of abuse, and an abusive parent is attempting to obtain access to a child’s social media messages?”

Common Sense Media, a media advocacy group for families, had a mixed reaction to Thursday’s news. In a statement on its site, the group said it supports one of the laws Utah passed, HB 311, which requires design changes to protect minors. It does not support the second law, SB 152, which gives parents monitoring capabilities and requires parental consent to create social media accounts.

“Unfortunately, Governor Cox also signed SB 152 into law, which would give parents access to their minor children’s posts and all the messages they send and receive. This would deprive kids of the online privacy protections we advocate for.”

Industry groups have signaled that they have First Amendment concerns about the rules. NetChoice vice president and general counsel Carl Szabo said the group was evaluating next steps on the Utah law and was talking to other allies in the tech industry.

“This law violates the First Amendment by infringing on adults’ lawful access to constitutionally protected speech while mandating massive data collection and tracking of all Utahns,” Szabo said. In the past, NetChoice has teamed up with industry groups to challenge social media laws in Florida and Texas.

Social media platforms have been increasingly facing scrutiny for exposing young people to toxic content and dangerous predators. Earlier this year, the Centers for Disease Control and Prevention found that nearly 1 in 3 high school girls reported in 2021 that they seriously considered suicide — up nearly 60 percent from a decade ago. And some experts and schools argue that social media is contributing to a mental health crisis among young people.

It’s unclear how tech companies would be able to enforce the age restrictions on their apps. The social media companies already bar children under the age of 13 from using most of their services, but advocates, parents and experts say kids can easily bypass those rules by lying about their age or using an older person’s account.

Tech companies such as Meta, TikTok and Snapchat have also increasingly been tailoring their services to offer more parental control and moderation for minors.

Meta global head of safety Antigone Davis said in a statement that the company has already invested in “age verification technology” to ensure “teens have age-appropriate experiences” on its social networks. On Instagram, the company automatically sets teens’ accounts to private when they join and sends notifications encouraging them to take regular breaks.

“We don’t allow content that promotes suicide, self-harm or eating disorders, and of the content we remove or take action on, we identify over 99% of it before it’s reported to us,” Davis said. “We’ll continue to work closely with experts, policymakers and parents on these important issues.”

Snap declined to comment.

Utah Law Could Curb Use of TikTok and Instagram by Children and Teens

The law, which prohibits social networks from allowing minors to have accounts without parental consent, may come as welcome news to many families even as it raises privacy concerns.


In January, Lucy Loewen, 13, testified in a hearing in Salt Lake City about social media use among youths. (Photo: Jeffrey D. Allred/Deseret News)

By Natasha Singer
NEW YORK TIMES
Natasha Singer, a technology reporter, has covered children’s privacy since 2012.
March 23, 2023

Gov. Spencer J. Cox of Utah signed a sweeping social media bill on Thursday afternoon that could dramatically limit youth access to apps like TikTok and Instagram, potentially upending how many minors in the state use the internet.

The Utah Legislature passed the measure this month, despite opposition from tech industry groups and civil liberties experts. It is the first state law in the nation that will prohibit social media services from allowing users under 18 to have accounts without the explicit consent of a parent or guardian.

The new measure will also require social networks to give Utah parents access to their children’s posts, messages and responses. And it will require social media services to block Utah minors from accessing their accounts from 10:30 p.m. to 6:30 a.m., a default setting that only a parent or guardian will be able to modify.

Michael K. McKell, a Republican member of the Utah Senate who sponsored the bill, said the statute was intended to address a “mental health crisis” among American teenagers as well as protect younger users from bullying and child sexual exploitation.

“We think social media is a contributing factor,” Senator McKell said in a phone interview on Thursday. “We want to tackle that issue.”

While the measure may come as welcome news for many parents, civil liberties experts and tech industry groups said it raised significant privacy and free speech concerns. Some warned that the new law, which will require social networks to verify users’ ages and obtain parental consent for those under 18, could cut off young people in Utah from major online platforms and infringe on parental rights to decide how their children used the internet.

Governor Cox also signed a second bill on Thursday that will prohibit social media companies from employing features or design techniques that could cause a minor to form an “addiction” to their online platforms.

The Utah measures come at a moment of heightened public concern and political action over powerful social media algorithms that may entice young people to spend hours online.

Over the last few years, popular social networking services have come under scrutiny for recommending content on self-harm to young people and exposing children to predators. Instagram, TikTok and other companies have responded by increasing controls for parents, including time limits and messaging restrictions.

Efforts to minimize online risks to young people have attracted widespread, bipartisan support. In his State of the Union address last month, President Joe Biden called on Congress to pass legislation restricting how tech companies may track teenagers and children online.

State legislatures have already introduced a number of bills intended to limit mental health and safety risks that social networks, multiplayer video games and other online services may pose to some children and teenagers. Last year, California enacted a sweeping online safety law that will require many social networks, video games and other services to install the equivalent of seatbelts and airbags for younger users.

Among other things, the California measure will require such services to turn on the highest privacy settings by default for users under 18. It also requires social networks and other services to turn off features by default that could pose risks to younger people, like “friend finders” that allow adult strangers to contact children.

But the Utah law far outstrips the California online safety effort, imposing broad constraints and enabling parental surveillance that could alter how many teenagers in Utah use the internet. Sarah Coyne, a professor of child development at Brigham Young University, in Provo, Utah, warned that the measure could inadvertently boomerang, exacerbating youth mental health issues by cutting off vulnerable young people from important sources of information and support.

“We know that marginalized youth, such as L.G.B.T.Q. kids, use social media in some really important ways to find belonging and support, especially when they don’t have family support,” said Dr. Coyne, who has studied how time spent on social media affects adolescents.

“So if you’ve got a 17-year-old who is really struggling with mental health turning to social media to find a place to belong, and their parents are cutting it off or looking at their messages, that can have a really significant negative impact,” she said.

Senator McKell said that the bill was intended to help parents protect their children online and that potential benefits far outweighed potential drawbacks. In addition to requiring parental consent, the bill will prohibit social networks from allowing strangers to message young people, ban targeted advertising and limit companies’ collection and use of young people’s personal data.

“If a parent wants to give their kids free rein online, under our bill they are going to have the ability to do that,” Senator McKell said. “But we want parents to be involved in the process, and we’re not going to apologize for that.”

The Utah measure, which applies to social networks with at least five million account holders worldwide, is scheduled to take effect on March 1, 2024.

The Arkansas Legislature has introduced a similar bill that would require social network platforms to verify users’ ages and obtain explicit parental consent for people under 18.

 A bill introduced in Texas is even more stringent: It would ban social media accounts for minors.


Natasha Singer writes about technology, business and society. She is currently reporting on the far-reaching ways that tech companies and their tools are reshaping public schools, higher education and job opportunities. @natashanyt

A version of this article appears in print on March 24, 2023, Section B, Page 1 of the New York edition with the headline: Utah Limits Social Media For Youths.


