Ofcom Launches Investigation into Telegram Over Child Safety Concerns
The UK media regulator has launched an investigation into Telegram over concerns it may be failing to prevent the sharing of child sexual abuse material (CSAM).
Ofcom said on Tuesday it was probing the popular messaging service after gathering evidence suggesting CSAM was present and being shared on the platform.
Under current law, user-to-user services operating in the UK must have systems in place to prevent people from encountering CSAM and other illegal content or risk enormous fines for breaches.
Telegram categorically denied Ofcom's accusations, saying that since 2018 it has virtually eliminated the public spread of CSAM through advanced detection algorithms and collaboration with non-governmental organisations.
The company expressed surprise over the investigation, linking it to a broader offensive against online platforms that prioritize free speech and privacy.
The probe is part of Ofcom's wider crackdown on services suspected of violating the UK's stringent online safety rules, which place strict duties on tech firms to tackle illegal content such as CSAM.
Ofcom's Director of Enforcement, Suzanne Cater, emphasized the devastating impact of child sexual exploitation and abuse on victims and underscored the need for sites and apps to combat such issues as a top priority.
Cater noted that while smaller services such as file-hosting platforms have made progress, larger platforms like Telegram must also strengthen their safety measures.
The investigation was partially prompted by the Canadian Centre for Child Protection's report of alleged CSAM sharing on Telegram. Similar concerns have led Ofcom to investigate Teen Chat and Chat Avenue over potential grooming risks.
Rani Govender from the NSPCC welcomed Ofcom's actions, pointing to the alarming rate of child sexual abuse image offences recorded by police and advocating robust measures to tackle the problem.
Ofcom has the authority to impose fines of up to £18 million or 10% of a company's global revenue, whichever is greater, for non-compliance with safety regulations, reinforcing its commitment to protecting children online.
As debates continue regarding the balance between online safety and freedom of expression, the outcome of this investigation might influence future tech regulations in the UK.