Child Safety on Social Media – Why should we be concerned?

ISTD
28 Apr 2023

During a recent TikTok hearing in the United States Congress, several members expressed concern about the platform’s promotion of harmful content to children and teens. Frank Pallone, a member of Congress from New Jersey, cited research indicating that TikTok has been pushing content related to self-harm and eating disorders to young users. However, TikTok is not the only social media platform facing such allegations. Government agencies and NGOs have also called on other platforms, including Facebook, YouTube, and Instagram, to take more proactive measures to safeguard children using their services.

Online Harm Against Children

Virality gone wrong – TikTok is a social media platform that has exploded in popularity in recent years, particularly among young people. While the app offers an outlet for creative expression and entertainment, it has also been associated with a number of dangerous challenges that have gone viral. These challenges often involve risky behavior that can lead to serious injuries or even fatalities.

One of the most notorious challenges popularized on TikTok is the “skull breaker challenge,” in which a person is tripped in the middle of a jump, causing them to fall and potentially suffer head injuries. Other challenges involve the consumption of toxic substances, such as the “Tide Pod challenge,” which dares participants to ingest laundry detergent pods, and the “Benadryl challenge,” which encourages taking excessive amounts of allergy medication.

Toll on mental wellness – Children may also be exposed to inappropriate content that can be damaging to their psychological well-being. Recently, the state of Arkansas (United States) sued TikTok and Facebook parent company Meta over claims that the social media companies had failed to take proper steps to protect minors on their platforms from inappropriate content, including sexual content and content depicting drug and alcohol use.

Cyberbullying – One of the most significant child safety problems on social media platforms is cyberbullying, a form of online harassment in which children and adolescents are targeted and tormented by others. A study conducted by the Sunlight Alliance for Action (AfA) found that nearly half of its 1,000 respondents – Singaporeans and permanent residents aged 15 and above – had experienced some form of online harm, such as online stalking or cyberbullying.

Collective Measures From Government and Social Media

In response to these concerns, many social media platforms have introduced measures to promote responsible usage among their users, including children. For example, Facebook has introduced a “Youth Portal” designed to provide young users with information on how to use the platform safely and responsibly. Instagram has added features like “Restrict” to combat bullying, along with a “Parent’s Guide” that educates parents on how to help their children use the app safely. TikTok has introduced “Family Safety Mode,” which allows parents to limit the amount of time their children spend on the app and to restrict certain types of content. Social media companies have also developed technology to detect and remove harmful content from their platforms. These measures have been successful in some cases, but there is still room for improvement.

Governments around the world are also taking steps to advance regulation on child safety. For instance, the UK government has introduced the Online Safety Bill, which seeks to regulate social media platforms and hold them accountable for harmful content hosted on their services. Similarly, lawmakers in the US have proposed the Kids Internet Design and Safety (KIDS) Act, which seeks to protect young users from manipulative design features and the amplification of harmful content.

Children’s Online Safety in Singapore

In Singapore, the government has likewise taken measures to safeguard children in the online space, developing a comprehensive framework to promote the safe and responsible use of the internet. For instance, the Personal Data Protection Act regulates the collection, use, and disclosure of personal data, including that of children. The government has also established the Media Literacy Council, which promotes media literacy and digital citizenship.

However, there is still much work to be done to address child safety problems on social media. Social media companies need to be more proactive in detecting and removing harmful content, and governments need to do more to hold these companies accountable for the content hosted on their platforms.

The potential child safety problems on social media platforms are significant and require urgent attention. Social media companies and governments need to work together to promote a safe and responsible online environment for children. Singapore is already taking steps to address the issue, but much remains to be done. It is crucial that all stakeholders take a proactive approach to ensure that the online space is a safe place for children to explore and learn.


About the Author

Roy Ka-Wei Lee is an Assistant Professor at the Information Systems Technology and Design Pillar (ISTD) and the Design and Artificial Intelligence (DAI) programme at the Singapore University of Technology and Design (SUTD). He leads the Social AI Studio at SUTD, which aims to design next-generation socially-enabled AI systems.