Children urged to speak out against harmful online content


More children need to speak out against harmful content online, Ofcom says in its first campaign as the new regulator of social media companies.

Its call for children and young people to put pressure on social media companies to remove harmful material follows research showing that only a sixth have filed complaints about such content.

This is despite the fact that more than two-thirds of people aged 13 to 24 have encountered potentially harmful content, ranging from bullying, abusive behavior and threats to scams, misinformation and phishing.

This is Ofcom’s first move to get social media firms to clean up their act before the government’s Online Safety Bill gives it the power to investigate all social media companies for possible violations. Its regulatory scope currently extends only to video-sharing sites.

It will have powers to fine social media companies up to 10% of their worldwide revenue and to shut down services if they fail to remove harmful content. It may also prosecute social media executives who fail to provide information or cooperate with investigations.

Evidence of failure to remove content even after being alerted by users is likely to strengthen the case for Ofcom to take strong action.

Campaign designed to empower young people

Anna-Sophie Harling, Head of Online Safety at Ofcom, said: “As we prepare to take on our new role as online safety regulator, we are already working with video sites and apps to ensure that they take steps to protect their users from harmful content.

“Our campaign is designed to empower young people to report harmful content when they see it, and we’re ready to hold tech companies accountable for how effectively they respond.”

To help galvanize more young internet users to report potentially harmful content, Ofcom has teamed up with social media influencer Lewis Leigh and behavioral psychologist Jo Hemmings to launch the ‘Only Nans’ campaign.

The campaign aims to reach young people on the sites and apps they use regularly to highlight the importance of reporting posts they may find harmful.

While 67% of young people said they had encountered harmful content online, only 17% took action to report it.

More than one in five (21%) said they didn’t think reporting it would make a difference, while 29% said they saw no need to do anything. Around one in eight (12%) said they did not know what to do or who to tell.

Exposure to harmful content can desensitize children

Ms Hemmings said: “With young people spending much of their time online, exposure to harmful content can unknowingly desensitize them to its damaging impact.

“People react very differently when they see something dangerous in real life – reporting it to the police or seeking help from a friend, relative or guardian – but they often take very little action when they see the same thing in the virtual world.

“What is clear from the research is that while potentially harmful content experienced once may have little negative impact, when experienced repeatedly it can cause significant harm.

“Worryingly, almost a third of 13-17 year olds didn’t report potentially harmful content because they didn’t consider it serious enough to do anything about. This risks a potentially serious issue going unchallenged.”

Mr Leigh rose to national prominence as a social influencer during lockdown with his viral TikTok videos showing him teaching his grandmother, Phyllis, dance moves.

“My generation grew up using social media and that’s how I make a living. So while it’s mostly a positive experience and a place to bring people together and build communities, harmful content is also something I encounter all the time,” he said.

“That’s why it was important to team up with my lovely Nan for this campaign to raise awareness of what we can do to protect ourselves online.”
