Protection of Children on Social Media Platforms
Nov 17, 2023
By Kyaw Thein Lwin
In today’s digital world, screen-based media devices and social media have become integral parts of children’s everyday lives. Children under the age of 18 use social media platforms such as Facebook (Meta), Messenger, Twitter, Instagram, TikTok, Snapchat and YouTube through screen-based media devices, and their daily screen time on social media has increased significantly.
How does social media affect children?
Social media can have both positive and negative effects on children. On the one hand, it functions as a digital space for socialization, giving children opportunities to connect and communicate with friends and family members. It also serves as a platform for learning new ideas and information through educational resources, and it allows children to express their ideas and talents through posts, videos and other forms of content creation.
On the other hand, children have limited life experience, which can make it difficult for them to discern what is good and bad online and leaves them more vulnerable to harm. These harms include cyberbullying, online enticement, sextortion, identity theft and screen addiction, among others.
What is happening regarding children’s use of social media?
A 2022 survey by the Pew Research Center found that 46 per cent of children aged 13 to 17 have been bullied or harassed online, and Facebook was identified as the source of cyberbullying for 75 per cent of online harassment victims in the United States in 2021. Cyberbullying has a significant impact on children’s emotional well-being and can also affect their academic performance.
An estimated 500,000 online predators are active each day. They contact children through social media platforms and gaming websites, coercing them into producing self-generated child abuse images. Children aged 12 to 15 are at risk of being victimized through grooming or manipulation by adults they meet online, and 82 per cent of online predatory incidents occur through communication on social media platforms.
Social media platforms have also become a commercial landscape, with 77 per cent of small businesses using them to connect with customers, and the market is expected to reach a value of $207 billion in 2023. Consequently, children can now order products with a single click, without consulting their parents, which raises concerns about the authenticity and legitimacy of the advertised items.
Do we need a law to regulate children’s use of social media platforms?
All major social media platforms have implemented their own self-regulatory measures to govern content and user behaviour, including community guidelines, advertising standards, violation detection, age restrictions, privacy settings, and mechanisms for reporting abuse (such as nudity, hate speech and threats) and taking appropriate action. Nevertheless, the question remains: can these self-regulatory measures effectively protect children on their platforms?
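To make the reporting-and-enforcement idea concrete, the following minimal sketch in Python shows how one report-triage step might work. The violation categories, severity weights and actions here are hypothetical simplifications for illustration only; they do not reflect any platform’s actual moderation pipeline or API.

```python
from enum import Enum

class Violation(Enum):
    NUDITY = "nudity"
    HATE_SPEECH = "hate speech"
    THREAT = "threat"

# Hypothetical severity weights; real platforms combine many more
# signals (automated classifiers, account history, context).
SEVERITY = {Violation.NUDITY: 1, Violation.HATE_SPEECH: 2, Violation.THREAT: 3}

def handle_report(violation: Violation, prior_strikes: int) -> str:
    """Map a confirmed user report to an enforcement action,
    escalating with severity and the poster's history."""
    score = SEVERITY[violation] + prior_strikes
    if score >= 4:
        return "suspend account"
    if score >= 2:
        return "remove content and warn user"
    return "remove content"

# Example: a first-time hate-speech report leads to removal plus a warning.
print(handle_report(Violation.HATE_SPEECH, prior_strikes=0))
```

Even in this toy form, the sketch shows why self-regulation is contested: the thresholds and escalation rules are set entirely by the company itself.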
On 4 October 2021, the whistleblower Frances Haugen, a former product manager at tech companies including Facebook, testified before the United States Senate Committee on Commerce, Science, and Transportation. She revealed that although social media platforms implement self-regulatory measures, these efforts have several weaknesses and have drawn substantial criticism.
According to a report by the UK Parliament’s Science and Technology Committee, released on 31 January 2019, social media companies should owe a formal legal duty of care to their users under the age of 18. The existing patchwork of regulations is seen as inadequate for creating a safe online environment for children. The report also recommends that the government enact new legislation to establish a statutory code of practice for social media companies, thereby providing consistent guidelines for content reporting and moderation mechanisms.
What specific provisions are included in the new social media laws?
On 23 March 2023, the governor of Utah signed two new bills, SB152 and HB311, to protect children on social media platforms. Under SB152, social media companies are obligated to verify the age of a Utah resident who seeks to open or maintain a social media account, and to obtain the consent of a parent or guardian before allowing a Utah resident under the age of 18 to do so. The bill also prohibits social media companies from opening an account for any person who does not meet age requirements under state or federal law.

SB152 further imposes specific obligations on certain social media companies regarding accounts held by Utah minors. These include prohibiting direct messaging with certain accounts, excluding the minor’s account from search results, refraining from displaying advertising, abstaining from collecting, sharing or using personal information from the account (with certain exceptions), and avoiding the targeting or suggestion of ads, accounts or content. The bill also imposes a default curfew, requiring companies to restrict a Utah minor’s account access between 10:30 pm and 6:30 am unless a parent or guardian directs otherwise, and it requires companies to grant parents or guardians access to the content and interactions of accounts held by Utah residents under the age of 18.
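To illustrate the mechanics these provisions demand of platforms, here is a minimal sketch in Python of SB152-style access gating: age verification, parental consent for minors, and the default overnight curfew. The data model, function names and time-zone handling are hypothetical illustrations of the statute’s requirements, not Utah’s statutory text or any company’s real implementation.

```python
from dataclasses import dataclass
from datetime import datetime, time
from zoneinfo import ZoneInfo

# Hypothetical account record; real verification flows and records differ.
@dataclass
class Account:
    age: int                 # verified age, not self-reported
    age_verified: bool       # passed an age-verification check
    parental_consent: bool   # a parent or guardian approved the account
    curfew_override: bool    # guardian lifted the default curfew

CURFEW_START = time(22, 30)  # 10:30 pm
CURFEW_END = time(6, 30)     # 6:30 am
UTAH_TZ = ZoneInfo("America/Denver")

def in_curfew_window(now: datetime) -> bool:
    """True if local Utah time falls in the 10:30 pm to 6:30 am window."""
    t = now.astimezone(UTAH_TZ).time()
    return t >= CURFEW_START or t < CURFEW_END  # window spans midnight

def may_access(account: Account, now: datetime) -> bool:
    """Apply SB152-style gating: verified age for everyone, plus
    parental consent and a default overnight curfew for minors."""
    if not account.age_verified:
        return False  # age must be verified before the account is usable
    if account.age < 18:
        if not account.parental_consent:
            return False  # minors need guardian consent
        if in_curfew_window(now) and not account.curfew_override:
            return False  # default 10:30 pm to 6:30 am lockout
    return True
```

In this sketch, a guardian’s direction simply flips curfew_override; a real system would also have to verify the guardian relationship itself, which the law leaves to implementing rules.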
HB311, meanwhile, prohibits a social media company from using a design or feature that causes a minor to become addicted to the company’s platform. The bill also authorizes a private right of action to collect attorney fees and damages from a social media company for harm incurred through a minor’s use of its platform.
On 12 April 2023, the governor of Arkansas signed the Social Media Safety Act (SB396). Similar to Utah’s SB152, the Act imposes certain requirements on social media companies operating within the state: companies may not permit an Arkansas minor to hold an account on their platform without the express consent of a parent or legal guardian, and they must implement age verification for the use of social media.
Conclusion
The self-regulatory measures implemented by social media companies are insufficient to fully safeguard children. It is crucial for us to comprehend the benefits and risks associated with online opportunities for children. Therefore, the author suggests that social media companies, policymakers and parents need to collaborate to protect our children from the negative effects of social media.
References:
(1) SB152 Social Media Regulation Amendments, Utah State Legislature
(2) The Social Media Safety Act (Senate Bill 396), Arkansas General Assembly
(3) Online Predators, Child Crime Prevention & Safety Center