Admins and moderators are the wrong target for tackling social media abuse

Social media community expert Richard Millington suggests Lucy Powell’s proposed bill to clean up private Facebook communities is flawed.

Lawmakers inevitably lag behind innovation. This has always been the case: it's impossible to draft laws that anticipate the future development of technology.

Lucy Powell, Labour MP for Manchester Central, has introduced a Private Member's Bill in Parliament, the Online Forums Bill 2017-19, aimed at moderating private online communities in which hateful views go unchallenged among large groups.

“Instead of small meetings in pubs or obscure websites in the darkest corners of the internet, our favourite social media site is increasingly where hate is cultivated,” she said.

Powell is seeking to make administrators and moderators responsible for content published on those forums; to require such administrators and moderators to remove certain content; and to require platforms to publish information about such forums.

There's no denying the scale or influence of Facebook communities. They're a powerful form of media used by families, friends and interest groups, but as Powell suggests, the conversations aren't always positive. You can read her full speech via Hansard.

As if to prove her point, Powell has become the target of abuse and criticism on Facebook and Twitter, accused of seeking to censor the internet, although she has broad ideological support.

I caught up with Richard Millington, founder of online community consultancy FeverBee and author of Buzzing Communities. He fears that the legislation could do more harm than good and that, in singling out moderators, Powell has picked the wrong target.

“Secret online groups are a powerful form of protection for minorities and those dealing with difficult issues such as domestic abuse, grooming or mental health. In my experience, as soon as groups become public they are a target for antagonists.”

“Most moderators in groups are volunteers helping their community deal with issues in their spare time. If they are expected to have legal liability for the content in the community, incidents of abuse will rise as moderators vanish.”

“In my experience, the creators of online hate groups are anonymous and using Telegram, the encrypted messaging network, or other anonymous tools. It's impossible to find them. It would be like asking terrorist groups to register themselves when they enter the UK.”

“Powell sought to start a discussion but the conversation has quickly become focussed on hate and ignores the majority of people building secret groups for noble purposes that make a significant contribution to society. We need to support the latter while tackling the former.”

Millington suggests that a more practical way to tackle abuse on social networks would be to make it easier to report inappropriate content in secret groups to the police. The participants in a secret group are often identifiable to one another, and a screengrab is sufficient evidence for a crime report.

