Social media platforms aren’t doing enough to protect human beings, let alone locally elected officials. Regulation is now a necessary step, starting with an ombudsman.

Back in 2007 I was a social media pioneer for the Conservative Party. I set up a Page for David Cameron shortly after the Pages feature launched – it is now hard to imagine a time when the platform consisted only of private profiles. At the time, the senior figures at Conservative HQ did not know what to make of the platform, let alone how to use it effectively, so when I took the initiative and ran the Page for them on an unofficial basis, they trusted me and took a small risk. You would never get away with that today, but attitudes towards social media and the people who used it were very different back then. It felt like a low risk because trust in people was high.

Fast-forward ten years to 2017, and the Welsh Local Government Association was raising concerns about the safety of local councillors in the face of online abuse – one councillor was told they would be “torn apart” by fox hounds, and another reported their children being bullied at school over decisions made in office. Not only has trust in people to behave themselves evaporated; simple civility appears to have gone too.

I’m no stranger to abusive online behaviour myself. In the summer of 2019 I decided to withdraw a musical act from a council-run festival after pre-event publicity issued by the band indicated they intended to use the stage to promote political messaging that could have put community cohesion at risk at a family-focused event. As you might expect, some did not agree with the decision.

My face was superimposed onto images of Kim Jong Un on social media, but I could handle that. It was the comments that got me. The band in question made several posts attacking me, or directing attacks of a personal nature towards me, and of course one young man joined in, making comments designed to threaten, intimidate, abuse and cause distress.

I duly reported the comments to Facebook, which deemed them ‘not in breach’ of its Community Standards, even though they clearly were. I appealed and got the same answer almost instantly. It was clear that nobody had looked properly at the content and its context. It felt like nobody was listening.

My last resort was to contact the police, who confirmed there was enough material for them to take an interest from a criminal perspective. But how can it be right to send precious police officers after comments on social media when there are greater real-world harms to focus on? I am not downplaying the damage caused by online harms; it just seems like a disproportionate response. As Conservatives we believe in improving the life chances of young people, not stifling them because of their youthful online behaviour.

The solution that would have worked for me – and many others – would have been for Facebook to be obliged to follow its own Community Standards and enforce them correctly. In my case that was the initial point of failure, and I attempted to use Facebook’s own appeals process without success. It cannot be right that Facebook gets to be both the first-hearing judge and the appeal judge – in my case only seconds apart.

There is no further route of appeal – there is no Social Media Ombudsman, but there should be. How can it be right that we have an ombudsman who can hear complaints about billing errors on your gas and electricity accounts, but no ombudsman who is able at least to review whether social media companies comply with their own policies?

Social media giants know how to use AI effectively to police their platforms. They do it for child exploitation material, for example – they dare not take as laid-back an approach as ‘flagging’ for that kind of disgusting content, because they know agencies like the FBI or the NCA would shut them down. It is easy for AI to analyse and identify such grossly indecent material, but it is not so easy for AI to understand the context of threats and insults left in the comments. For that we need compassionate, well-trained human beings who take an active interest in understanding the online harm that has been alleged. Where they fail, there should be a human-operated process, including elements external to the social media company, to help ensure confidence and faith in the whole system.

For too long the industry has been able to mark its own homework, and for too long there has been a dependence on ‘flagging’ content – a system that was originally deployed so that the owners of content on the web could shrug their shoulders over materials posted on their servers. Even today they shrug when it all goes wrong and simply let the police pick up their mess.

Social media companies pay lip-service to victims and regulators and are experts at pulling the wool over the eyes of scrutineers. They’re also able to exert extreme pressure on those who challenge them with their extensive reach into companies and public life, and sometimes they don’t even need to try – simple and persistent intransigence is often enough, as it was in my case.

Where I was once an early champion of social media, a pioneer and a great advocate, I now find myself looking back at what I helped to build with a sense of regret, and with no real optimism about any of the solutions I have heard so far. Without intervention I fear for the future of local government. Colleagues tell me about abuse regularly, and when I attend conferences it is a hot topic – and one that is putting people off becoming councillors. Who would be a councillor knowing they will face daily abuse online? Who will come forward to be an MP knowing that deranged people, spurred on by social media content, might brutally murder them in the street?

One internet giant started life with the slogan “don’t be evil”. It is time for us as a society to enforce it, and as Conservatives it is within our gift to do so in a way that does not stifle innovation or restrict freedom of speech. Let us not confuse freedom of speech with freedom of platform. Let us not pretend this is not our problem to solve.