UK Parliamentary Select Committee report calls for social media regulation
Media standards must be applied to social media to prevent democratic abuse. This is among the recommendations from a UK Parliamentary Select Committee report on fake news.
The new Select Committee report tackles fake news, foreign interference in elections, and the EU Referendum. It’s a must-read for anyone with an interest in the impact of social media on public discourse and democracy itself.
“We are faced with a crisis concerning the use of data, the manipulation of our data, and the targeting of pernicious views.”
“In particular, we heard evidence of Russian state-sponsored attempts to influence elections in the US and the UK through social media, of the efforts of private companies to do the same, and of law-breaking by certain Leave campaign groups in the UK’s EU Referendum in their use of social media.”
The report, Disinformation and ‘fake news’: Interim Report, is an important piece of work because it shows how public discourse has been polluted by the weaponised use of social media and networks. Along the way you hear from the 60 witnesses that the committee called.
I’d urge everyone to read the report and in particular its conclusions. A final report is expected to be published later this year. In the meantime, here’s what you need to know.
#1 Definition: fake news is fake news
The term fake news is applied to everything from blatant lies to any subject with which an individual or organisation disagrees. In an ironic but predictable twist, the report has itself been called out as fake news by critics. It calls for the Government to reject the term fake news and put forward definitions of the terms misinformation and disinformation.
With a shared definition, and clear guidelines for companies, organisations, and the Government to follow, there will be consistency of meaning across the platforms, which can be used as the basis of regulation and enforcement.
The report goes further and calls on the Government to support research into the methods by which misinformation and disinformation are created and spread across the internet; a core part of this is fact checking. The report recommends that the Government initiate a working group to create a credible annotation of standards, so that people can see, at a glance, the level of verification of a site.
#2 Social media governance
Legislation lags behind the development of technology. However, the Communications Act 2003, which provides the means to enforce content standards for television and radio broadcasters, including rules relating to accuracy and impartiality, could serve as a basis for setting standards for online content. Ofcom is set to propose plans for greater regulation of social media this autumn.
Electoral law needs to be updated to reflect changes in campaigning techniques, and the move from physical leaflets and billboards to online, micro-targeted political campaigning, as well as the many digital subcategories covered by paid and organic campaigning.
#3 Advertising transparency: Facebook criticised
As well as requiring digital imprints, the Government should consider the feasibility of clear, persistent banners on all paid-for political adverts and videos, indicating the source and making it easy for users to identify what is in the adverts, and who the advertiser is.
The globalised nature of social media creates challenges for regulators. In evidence, Facebook did not accept its responsibility to identify or prevent illegal election campaign activity from overseas jurisdictions. In the context of outside interference in elections, the committee says that this position is unsustainable. Facebook, and other platforms, must begin to take responsibility for the way in which their platforms are used.
#4 Platform versus publisher, and call for a third category
The report recommends that a new category of tech company be formulated: one that tightens tech companies’ liabilities, and that is not necessarily either a platform or a publisher.
Social media companies cannot hide behind the claim of being merely a ‘platform’, claiming that they are tech companies and have no role themselves in regulating the content of their sites. They are also significantly different from the traditional model of a ‘publisher’, which commissions, pays for, edits and takes responsibility for the content it disseminates.
#5 Call for a Code of Ethics for social media
A professional global Code of Ethics should be developed by tech companies, in collaboration with this and other governments, academics, and interested parties, including the World Summit on Information Society, to set down in writing what is and what is not acceptable by users on social media, with possible liabilities for companies and for individuals working for those companies, including those technical engineers involved in creating the software for the companies.
#6 Digital literacy needs tackling in schools
The report recommends that the Government put forward proposals for an educational levy to be raised by social media companies, to finance a comprehensive educational framework. Digital literacy should be the fourth pillar of education, alongside reading, writing and maths.
The DCMS Department should co-ordinate with the Department for Education on proposals to include digital literacy as part of the Personal, Social, Health and Economic (PSHE) curriculum. The social media educational levy should be used, in part, by the Government, to finance this additional part of the curriculum.