UK starts online checks to stop children accessing harmful content

This combination of photos shows logos of Twitter, top left; Snapchat, top right; Facebook, bottom left; and TikTok. A bipartisan group of senators on Wednesday, April 26, 2023, introduced legislation that would prohibit all children under the age of 13 from using social media and require permission from a guardian for users under 18 to create an account. (Photo: AP/File)

LONDON: New UK age verification rules aimed at preventing children from accessing harmful online content came into effect on Friday (July 25), with campaigners calling the move a long-overdue breakthrough in regulating the internet.

The rules, part of the 2023 Online Safety Act, require websites and apps to verify users’ ages using tools such as facial imagery or credit cards. Britain's media regulator Ofcom will oversee enforcement.

About 6,000 pornography websites have agreed to apply the measures, according to Ofcom chief executive Melanie Dawes, who said other platforms must also ensure children are protected from illegal content, including pornography, hate speech and graphic violence.

“We’ve done the work that no other regulator has done,” Dawes told BBC Radio. “These systems can work. We’ve researched that.”

Ofcom estimates that 500,000 children aged between eight and 14 viewed pornography online last month alone.

STRICTER RESPONSIBILITIES FOR TECH FIRMS

The new measures target content related not just to pornography but also suicide, self-harm, eating disorders, and other risks. Tech companies now have legal duties to safeguard minors and adults online or face sanctions.

Firms that violate the rules could be fined up to £18 million (US$23 million) or 10 per cent of their global revenue, whichever is higher, according to the UK government. Senior executives could also face criminal charges for failing to comply with Ofcom’s data requests.

After a preparatory period for the industry and regulator, the rules now take full effect.

Children will “experience a different internet for the first time,” said Technology Secretary Peter Kyle. Speaking to Sky News, he said he had “very high expectations” for the changes.

In a separate interview with parenting website Mumsnet, Kyle apologised to young people who have already been exposed to harmful content.

“I want to apologise to any kid who’s over 13 who has not had any of these protections,” he said.

FURTHER PLANS UNDERWAY

Rani Govender of child protection charity NSPCC called the changes “a really important milestone,” saying it was right for tech companies to be held accountable.

“Children are frequently stumbling across this harmful and dangerous content,” she told BBC News. “There will be loopholes, but it’s still right that we’re introducing much stronger rules to make sure that that can’t continue to happen.”

Prime Minister Keir Starmer’s government is also weighing additional rules, including a proposed daily two-hour limit on social media for children under 16.

Kyle said more details about regulations for younger users would be announced “in the near future.”

Source: AFP/fs