Coronavirus has accelerated the need for Boris Johnson to clamp down on tech giants exposing children to paedophiles and harmful content online, the NSPCC has warned.
The charity has warned that the demand for child sexual abuse images online has been increasing during the pandemic, as it set Mr Johnson six tests for his plans to tackle harm online.
The NSPCC has said that the details of the planned Online Harms Bill, expected in the coming weeks, must create a “world-leading” system to protect children or risk letting companies continue to profit from abuse.
It comes as analysis revealed that the number of online sex crimes against children recorded by police reached the equivalent of 101 a day in England and Wales between January and March this year.
The demands include tough criminal and financial penalties for named managers who fail to act to stop abuse, and a responsibility for taking down content that fuels self-harm and suicide, which is often legal but extremely harmful.
It also calls for private messages on social media sites to be screened for known child abuse images to catch people distributing the criminal material.
NSPCC CEO Peter Wanless warned that social media giants have put “growth ahead of children” and said that failing to pass any of the six tests will mean children pay the price with “serious harm and sexual abuse that could have been stopped”.
Mr Wanless added: “Industry inaction is fuelling this staggering number of sex crimes against children and the fallout from coronavirus has heightened the risks of abuse now and in the future.”
He also called for rules that require companies to eradicate high-risk design features, such as the ability to chat to child users privately, that facilitate abuse.
But Mr Wanless said that the last six months have raised the stakes, warning that Covid has “systematically changed the threat that children face online” with more of them reliant on live streaming to socialise and learn.
Even more troublingly he suggested that “the demand for sexual abuse images has been increasing” during lockdown with paedophiles potentially working from home and able to engage in abusive behaviour more often and unsupervised.
Ian Russell, whose daughter Molly took her life after she was targeted with self-harm posts on social media, said “tech self-regulation has failed” as he backed the charity’s call.
He said: “Today, I can’t help but wonder why it’s taking so long to introduce effective regulation to prevent the type of harmful social media posts we now know Molly saw, and liked, and saved in the months prior to her death.”
And one anonymous mum, introducing herself by the pseudonym Jane, revealed the heartbreaking story of how her daughter had fallen victim to the wild west of social media.
Jane’s nine-year-old girl was targeted by a user of the social media app Likee, who threatened to take her away from her mum if she didn’t send pictures of her body.
After speaking with her daughter, Jane accessed her profile, and she was given a disturbing insight into the kind of messages her daughter had been receiving.
Within 30 minutes of using the app, the mum received messages from strangers calling her “babe” and “sexy” and asking for sexualised pictures.
Her daughter “broke down” when she revealed the messages to her mum, Jane said.
But after the shock had passed Jane said she became furious and began to demand action.
She said: “How on earth could this happen? How is it possible for a paedophile to easily be able to contact my daughter on an app that’s designed for children?”
For months after the incident, Jane said her daughter lost her “glorious smile” and would burst into tears.
A couple of years later, Jane said her daughter still struggles.
“She told me recently that she had a dream about a man, like a man from Likee, who pushed her to the ground and grabbed her ‘butt’.”
She added: “If these six measures are put in place, it might stop other families having to go through all the terrible pain that we have gone through in the last year.”
Ministers have said they plan to introduce an independent regulator with its own powers to pursue companies who endanger children.
Last night a government spokesperson said: “Protecting children will be at the heart of our new laws to make the UK the safest place in the world to be online.
“Social media companies will need robust systems in place to keep their users, and particularly children, safe and there will be tough sanctions for those that do not fulfil their duty of care towards them.
“We have been working closely with NSPCC colleagues in developing our plans, and thank them for their contribution.”
The NSPCC’s six demands for any system designed to keep children safe online
- Create an expansive, principles-based duty of care
- Comprehensively tackle online sexual abuse
- Put legal but harmful content on an equal footing with illegal material
- Have robust transparency and investigatory powers
- Hold industry to account with criminal and financial sanctions
- Give civil society a legal voice for children with user advocacy arrangements