UK starts online checks to stop children accessing harmful content

BSS
Published On: 25 Jul 2025, 19:14

LONDON, July 25, 2025 (BSS/AFP) - New UK age verification measures to prevent children accessing harmful online content came into force on Friday, with campaigners hailing them as a "milestone" in their years-long battle for stronger regulations.

Under the new rules, to be enforced by Britain's media watchdog, websites and apps hosting potentially harmful content will be held responsible for age checks using measures such as facial imagery and credit cards.

Around 6,000 pornography sites have agreed to implement the curbs, according to Melanie Dawes, chief executive of British regulator Ofcom.
Other platforms such as X, which is facing a dispute over similar restrictions in Ireland, must also protect children from illegal pornographic, hateful and violent content, she noted.

"We've done the work that no other regulator has done," Dawes told BBC Radio.

"These systems can work. We've researched that," she said.
Around 500,000 youngsters aged eight to 14 encountered pornography online last month, according to Ofcom.

The long-awaited new rules, which aim to prevent minors from encountering content relating to suicide, self-harm, eating disorders as well as porn, stem from a 2023 Online Safety Act.

It imposes legal responsibilities on tech companies to better safeguard children and adults online and mandates sanctions for those who fall short. 

Rule-breakers face fines of up to £18 million ($23 million) or 10 percent of their worldwide revenue, "whichever is greater", according to the government.

Criminal action can also be taken against senior managers who fail to ensure companies follow Ofcom information requests.
The measures are coming into force now after the sector and the regulator were given time to prepare.

- 'Different internet' -
Children will "experience a different internet for the first time," technology secretary Peter Kyle told Sky News, adding he had "very high expectations" for the changes.

In an interview with parenting forum Mumsnet, he also said sorry to youngsters who had been exposed to harmful content.

"I want to apologise to any kid who's over 13 who has not had any of these protections," Kyle said. 

Rani Govender, of the child protection charity NSPCC, said it was "a really important milestone that we're finally seeing tech companies having to take responsibility for making their services safe for children". 

Children are frequently "stumbling across this harmful and dangerous content," she told BBC News.

"There will be loopholes," Govender noted, insisting it was still "right that we're introducing much stronger rules to make sure that that can't continue to happen". 

Prime Minister Keir Starmer's government is also considering introducing a daily two-hour limit for children on social media apps. 

Kyle said he would announce more plans for regulating the sector for under-16s "in the near future". 
