
Give Ofcom emergency powers to demand answers from Facebook, says children’s charity chief

Ofcom should be given emergency powers to investigate Facebook before it brings in its encryption plans as new Duty of Care laws will be too late, the NSPCC has warned.

Peter Wanless, the chief executive of the children’s charity, said the tech giant’s proposals could “wash away” children’s protections on its apps, as encryption means not even the company will be able to see what users are sending each other.

In an exclusive article for the Telegraph, below, Mr Wanless said Ofcom, which is due to become the online regulator under incoming Duty of Care laws, needs its investigation powers early to assess the scale of abuse happening on Facebook.

The tech company, now called Meta, responded to the call saying it has developed a “clear approach” to maintaining child safeguarding measures after encryption.

The row comes after Mark Zuckerberg, Meta’s 37-year-old chief executive, announced plans in 2019 to encrypt the company’s Messenger service as well as Instagram’s Direct Messages - both messaging apps attached to its Facebook and Instagram social networks.

The move provoked an outcry from police chiefs and children’s charities who warned it could hinder Meta’s ability to detect groomers, who often find children on social media before moving conversations to encrypted messaging apps where they are harder to detect.

They warned it could also threaten Facebook’s scanning software, which blocks known child abuse images from being uploaded to its apps.

Priti Patel, the Home Secretary, has previously condemned the plans, calling them “morally wrong and dangerous”.

Following months of mounting pressure, Meta announced last week it will not implement its encryption plans until 2023 at the earliest.

In the article, Mr Wanless said he feared the company was just “playing for time” and said Ofcom needed to be granted powers so an independent assessment could be made of its encryption plans.

He said: “Let’s give the regulator the powers to start asking necessary questions and the ability to look at the inner workings of Meta without delay.

“The ongoing encryption debate and whistle-blower revelations highlight that Meta can no longer be judge and jury over their own conduct while children’s safety sits on a cliff edge.”

Ofcom is due to get muscular powers to investigate tech companies, as well as potentially levy fines running into the billions if users are found to come to harm on their apps, under the incoming Duty of Care laws that The Telegraph has campaigned for since 2018. Ministers are set to bring a bill before Parliament next year.

However, campaigners warn that Ofcom may not have its powers until 2024, by which time Meta’s encryption plans will likely be in place.

Following the NSPCC’s call, Antigone Davis, Global Head of Safety at Meta, said: “We have no tolerance for child exploitation on our platforms. We agree on the need for strong safety measures that work with end-to-end encryption, and we have developed a clear approach for building these into our plans for end-to-end encryption.

“We’re focused on preventing harm from happening in the first place by restricting adults on Facebook and Instagram from messaging children and defaulting under-18s’ accounts to private or ‘friends only’.

“We also offer more controls for people to protect themselves from harm and respond swiftly to user reports and valid requests from the police.

“The overwhelming majority of Brits already rely on encryption to keep them safe from hackers, fraudsters and criminals, and any solutions we develop need to ensure those protections remain intact.  

“We’ll continue to work with outside experts to develop effective solutions for combating such abuse because our work in this area is never done.”

‘Meta can no longer be judge and jury over its own conduct while children’s safety sits on a cliff edge’

By Sir Peter Wanless, chief executive of NSPCC

It’s been almost two years since the NSPCC led a global coalition of 130 child protection organisations to write to Mark Zuckerberg.

We asked him to pause plans to roll out end-to-end encryption on Facebook and Instagram’s messaging services until they recognise that direct messaging is the frontline of child sexual abuse and prove they have systems in place to disrupt it.

Since we wrote to them, Facebook, now Meta, have been batting away a conveyor belt of safety scandals with obfuscation and denial, with our questions and concerns being met with unsatisfactory answers.

What is clear is the scale of abuse children face on their sites. 

Every year, Instagram alone is used in around a third of reported grooming crimes on social media. Crimes that would go undetected under Meta’s blanket end-to-end encryption plans.

It was encouraging to read in The Telegraph that the company is pausing the rollout until 2023 to consider the child protection implications.

As we’ve always said, Meta should only go ahead with these measures when they can demonstrate they have built technical mitigations that can ensure children will be at no greater risk of abuse.

But read closely and Antigone Davis offered nothing new. 

It was strong on rhetoric but light on detail, and it made it difficult to conclude anything other than this being a move to play for time while the tech giant weathers difficult headlines.

Ms Davis cited WhatsApp as an example of action taken against abuse in end-to-end encrypted environments, but this isn’t the silver bullet that Meta likes to suggest.

The figures speak for themselves.

In 2020, the National Crime Agency received around 24,000 child abuse tip-offs from Facebook and Instagram but just 308 from WhatsApp.

WhatsApp data show that less than 15 per cent of accounts they suspend for child abuse lead to actionable reports to police. Meta knows abuse is taking place, but they can’t see it and can’t act on it.

Meta could have announced that they would follow Apple’s lead in developing child safety measures that can work in end-to-end encrypted environments.

However, Will Cathcart, head of WhatsApp, previously labelled Apple’s plans “concerning” and categorically refused to take a similar approach.

By sticking with their own status quo and continuing to promote, at best, sticking plaster solutions, Meta still doesn’t have a clear plan to protect children. It is disingenuous to suggest otherwise.

Mark Zuckerberg could take steps today to restore confidence. In May, Facebook’s board successfully blocked a shareholder proposal to risk-assess the impacts of end-to-end encryption on child abuse.

They should admit they got that wrong and commit to a full, independent risk assessment. 

Actions speak louder than words.

As whistle-blower Frances Haugen’s revelations show, transparency is key. 

Meta’s latest community standards report, covering the past six months, revealed a record number of child abuse takedowns.

Almost 50 million items of child abuse material were removed from Facebook and Instagram, more than triple the amount in the previous six months.

Meta attributed the dramatic increase to improvements in its “detection capability”, but it is still not clear whether the company is playing catch-up following apparent technical problems last year, or whether the child abuse risk is ballooning.

It’s in this context that end-to-end encryption sits. We know it could wash away children’s safety and have a substantial impact on identifying grooming and child abuse material. 

But because agencies have no power to ask questions, we have no idea how bad the tsunami will be.

Meta often cites how they will welcome regulation to help guide their response to abuse. But we can’t wait another two years before we can even start to demand answers.

That’s why we are urgently calling on the Government to fast-track Ofcom’s investigatory powers in the Online Safety Bill. Let’s give the regulator the powers to start asking necessary questions and the ability to look at the inner workings of Meta without delay.

The ongoing encryption debate and whistle-blower revelations highlight that Meta can no longer be judge and jury over its own conduct while children’s safety sits on a cliff edge.

We cannot be left wondering whether Meta’s announcement sets in motion a substantial reset of their plans or is just another tactic from their PR machine. 

The Government can take the lead by giving Ofcom the power to demand answers.