Constituency matters… a weekly column by the Member of Parliament for Folkestone and Hythe, Damian Collins. 6 December 2022
This week the Online Safety Bill returned to Parliament for the first time since the government changes of the summer and autumn. As readers of this column will know, this is an issue I have been campaigning on for years, and for a time I was also the Minister responsible for this Bill at the Department for Digital, Culture, Media and Sport.
Truth is drowned out by disinformation
Online platforms like Facebook, Twitter, TikTok and YouTube are increasingly the public square of our society, the place where people go for news and to exchange ideas. They allow people to speak out and reach new audiences with greater ease than ever before. However, they have also become places where truth is drowned out by disinformation, public health is undermined by conspiracy theories, people face daily intimidation, and children are bullied and encouraged to self-harm.
Hold platforms to account
The laws we have established to protect people in the offline world need to apply online as well. To achieve this, we need to hold platforms to account for enforcing the community standards they promise their users, and to prevent their services from being used to break the law. This is what the Online Safety Bill has been designed to achieve.
Dymchurch Primary School
Media literacy plays an important role in giving people, particularly children, the tools to keep themselves safe online. I recently attended an assembly at Dymchurch Primary School where Google delivered a presentation to the students on what they should do if they encounter harmful or suspicious content online. However, for most users the experience of social media is not one of searching for content, but of content finding them. Recommendation tools profile users' data in order to serve them the content they are most likely to engage with. This process is designed to hold people's attention for as long as possible, and to prompt them to return as often as possible. These systems were not always central to the user experience on social media, but have been developed by the platforms for business reasons.
‘Blackout challenge’
I have been extremely concerned to read of cases like the death of twelve-year-old Archie Battersbee after taking part in a deadly 'blackout challenge' which was circulating on TikTok.
In September this year a coroner’s court also determined that a fourteen-year-old girl, Molly Russell, ‘died from an act of self-harm whilst suffering from depression and the negative effects of on-line content.’ The truth is that if you are a vulnerable person, those vulnerabilities will be detected by social media platforms and will influence what you see. Someone who is already self-harming is more likely to see content that will encourage them to continue and do worse.
Bullying online
Recent data published by NHS Digital has shown that one in eight of all 11 to 16-year-old users of social media reported that they had been bullied online. Among all 17 to 24-year-old social media users, one in five young women had experienced online bullying, compared with half that proportion among young men. There was a time when a child could escape a bully and find a safe place at home, but there is no sanctuary when the bully is in the palm of your hand, alongside the other apps you use daily to work, socialise and stay connected.
Social media platforms used by people-trafficking gangs
I have previously raised at Prime Minister's Questions the fact that social media platforms are also being used by people-trafficking gangs to recruit people on the promise of smuggling them into the UK on small boats crossing the Channel. This illegal activity will also be included among the offences covered by the Online Safety Bill, meaning that if companies do not do all they can to stop this activity on their platforms and remove such content, they could face large fines, which for some could run into billions of pounds.