Summary
The UK communications regulator Ofcom is investigating the messaging app Telegram over concerns that child sexual abuse material (CSAM) is being shared on its platform. Telegram denies the accusations, saying that since 2018 it has worked to eliminate such content using advanced detection technology and cooperation with child-protection organizations.
Key Facts
- Ofcom is probing Telegram after evidence suggested CSAM was present on the app.
- UK law requires online services to prevent and tackle illegal content like CSAM or face large fines.
- Telegram claims it has nearly eliminated public sharing of CSAM since 2018 using detection algorithms.
- Ofcom's investigation follows contact from the Canadian Centre for Child Protection.
- Ofcom is also investigating other chat services over risks like child grooming.
- The Online Safety Act, whose illegal-content duties took effect in March 2025, requires platforms to address illegal content.
- Ofcom can fine companies up to £18 million or 10% of global revenue, whichever is greater, for breaking the rules.
- Child-protection groups such as the NSPCC support Ofcom's actions against online abuse.
This is a fact-based summary from The Actual News.