Make it don’t fake it

It’s incredibly easy to fake followers and engagement on social media, and that ease creates a significant governance issue for the platforms and for public discourse.

This blog post starts with a hefty disclaimer. I do not condone any activity that fakes or manipulates social media discourse.

The activities described in this post were undertaken with the sole purpose of demonstrating how easy it is to fake followers and engagement on a popular social networking site.

If you work for a public relations agency or team that adheres to the ethical codes set out by the CIPR or PRCA, repeating this activity will likely result in disciplinary action and your membership being terminated.

Social networks are based on human-to-human relationships. Faking or manipulating a relationship will fail, resulting in a breakdown in trust and reputational damage.

Former President Barack Obama went further in an interview with Prince Harry on the BBC Radio 4 Today programme in December. He said that the abuse and misuse of social media is a danger to democracy and society.

Influence is in the eye of the beholder

Accepted wisdom suggests that the more followers an account has, the greater its influence. Likewise, a greater number of likes, shares or comments indicates a greater level of engagement.

However, in the social media arena it is easy to game follower counts and fake apparent engagement.

Follow back is a commonly used approach: you follow other accounts in the expectation of a follow back and, if one isn’t forthcoming, unfollow within 48 hours. You can spot these accounts because their follower and following numbers will be broadly the same.
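As a rough illustration of that check, here is a minimal Python sketch. The tolerance threshold and the example counts are illustrative assumptions rather than values from any platform.

```python
# Minimal sketch of the follower/following ratio check described above.
# The counts would come from an account's public profile; the figures here
# are hypothetical.

def looks_like_follow_back_account(followers: int, following: int,
                                   tolerance: float = 0.1) -> bool:
    """Flag accounts whose follower and following counts are broadly the same."""
    if max(followers, following) == 0:
        return False
    return abs(followers - following) / max(followers, following) <= tolerance

# Example: 10,100 followers against 9,800 accounts followed
print(looks_like_follow_back_account(10_100, 9_800))  # True
```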

Engagement is trickier to fake, but it can also be achieved using bots that automatically post colloquial comments such as “that’s interesting” or “nice post”. It’s a commonplace tactic on Instagram. The goal is to attract new followers and replies.

There’s another game played by individuals who have created armies of fake accounts.

A study of conversations on Twitter by the University of Southern California and Indiana University (opens as a PDF), published in March 2017, suggested that up to 15% of active Twitter accounts are bots. That equates to as many as 48 million accounts.

These armies, or so-called botnets, are available for hire to increase follower numbers or to fake engagement through likes and shares.

Researchers at University College London reported the discovery of a Star Wars botnet in January 2017 (opens as a PDF) with more than 350,000 bots. The study showed how these bots were generated and centrally controlled by a botmaster.

Twitter is a target for manipulation because of the ease of creating an account but it is by no means the only platform. Facebook, Instagram, LinkedIn and YouTube all face issues with fake accounts.

Building a fake social media account

I’ve run an experiment over the past week in a bid to shine a light on the issue.

I created a PR bot Twitter account called @RbotP. Granted, I get zero points for imagination, but it’s increasingly hard to create original Twitter names.

I completed the profile, adding stock photos and disclosure information in the biography, and then published a tweet from the account.

These basic steps ensure that an account passes Twitter’s initial scrutiny and is listed as a legitimate account.

Next I signed up to a service that recruits Twitter followers for an account. Such services are easily discovered via a search engine. The one I used offered 10,000 followers for $25. I paid my dues and sat back.

Identifying bots and fake accounts

Various academic studies have proposed signals for identifying bots:

  1. Frequency of tweeting
  2. Spam content
  3. Ratio of mobile to desktop posting

Indeed, Indiana University has developed Botometer, a free tool that scores Twitter accounts on how likely they are to be bots. It’s worth checking out.
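Botometer draws on a far richer feature set, but a crude score over the three signals listed above might look something like the Python sketch below. Every threshold and weight here is an illustrative assumption, not part of the Botometer model.

```python
from dataclasses import dataclass

@dataclass
class AccountActivity:
    tweets_per_day: float      # frequency of tweeting
    spam_tweet_ratio: float    # share of tweets that look like spam (0-1)
    mobile_tweet_ratio: float  # share of tweets posted from a mobile client (0-1)

def crude_bot_score(a: AccountActivity) -> float:
    """Return a 0-1 score; higher means more bot-like. Thresholds are illustrative."""
    score = 0.0
    if a.tweets_per_day > 50:          # unusually high posting frequency
        score += 0.4
    score += 0.4 * a.spam_tweet_ratio  # heavy spam content
    if a.mobile_tweet_ratio < 0.1:     # almost no mobile activity suggests automation
        score += 0.2
    return min(score, 1.0)

print(crude_bot_score(AccountActivity(tweets_per_day=120,
                                      spam_tweet_ratio=0.8,
                                      mobile_tweet_ratio=0.0)))  # 0.92
```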

Another tool called Twitter Audit takes a sample of 5,000 followers and reports the percentage of fake accounts. It’s not unusual for 5% to 10% to be fake. The University of Southern California and Indiana University study suggested the figure could be as high as 15%.

You can proactively tackle this issue by blocking and reporting fake accounts that follow your account.

"The Twitter Audit tool reported that 98% of 3,300 accounts that follow my personal @blowndes account are real. I've proactively blocked fake followers for a while but I was still surprised that it was that high," said Ben Lowndes, director, Social Comms.

If you want to investigate this issue further, network mapping tools will quickly identify fake networks and echo chambers that exist among bots.
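To give a flavour of what that mapping involves, the sketch below builds a toy follower graph with the networkx library and measures the density of each detected community; densely interlinked clusters are one signal of a coordinated botnet or echo chamber. The graph data is entirely hypothetical.

```python
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Hypothetical follower graph: an edge means one account follows the other.
G = nx.Graph()
G.add_edges_from([
    ("bot1", "bot2"), ("bot2", "bot3"), ("bot1", "bot3"),    # tight bot cluster
    ("bot1", "target"), ("bot2", "target"), ("bot3", "target"),
    ("alice", "bob"), ("bob", "target"),                     # sparser human activity
])

# Community detection surfaces clusters; unusually dense ones merit a closer look.
for community in greedy_modularity_communities(G):
    subgraph = G.subgraph(community)
    print(sorted(community), f"density={nx.density(subgraph):.2f}")
```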

10,000 fake followers

Back to my experiment. Within 24 hours the fake @RbotP account had more than 10,000 followers. These followers share a set of related characteristics (a rough screening sketch follows the list):

  1. Low number of followers
  2. Profile image that doesn’t identify as an individual
  3. Incoherent biographies
  4. Extreme political, porn and spam content
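As a rough screen over those characteristics, the Python sketch below flags followers that match the profile of purchased accounts. The field names, spam markers and thresholds are illustrative assumptions, not Twitter API attributes.

```python
# Hypothetical follower records; in practice these fields would be pulled
# from each follower's public profile.
SPAM_MARKERS = ("free followers", "click here", "xxx")

def looks_like_fake_follower(follower_count: int, has_personal_photo: bool,
                             bio: str) -> bool:
    """Flag followers that fit the profile of purchased bot accounts."""
    few_followers = follower_count < 10
    suspect_bio = (len(bio.split()) < 3 or
                   any(marker in bio.lower() for marker in SPAM_MARKERS))
    return few_followers and (not has_personal_photo or suspect_bio)

print(looks_like_fake_follower(2, False, "Click here for free followers"))   # True
print(looks_like_fake_follower(350, True, "Dad, runner, PR person in Leeds"))  # False
```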

Twitter periodically culls fake accounts from the platform. When that next happens it’s likely that the number of followers on the @RbotP account will plummet. But at $25 for 10,000 in 24 hours it could quickly be topped up.

An account with 10,000 followers will generate an average of 100 to 1,000 impressions per tweet. Between 1 and 10 people (approx. 0.1% to 1%) will retweet or favourite it, and even fewer (approx. 0.01% to 0.1%) will reply.

These are typical figures and will vary depending on the content being shared.
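To make the arithmetic explicit, here is a small sketch that turns those rough rates into per-tweet ranges for a given follower count. The rates, and the assumption that engagement scales off the upper impression estimate, are taken loosely from the figures quoted above rather than from any platform benchmark.

```python
def rough_engagement(followers: int) -> dict:
    """Estimate per-tweet activity bands from rough benchmark rates.

    Assumes impressions run at roughly 1-10% of followers, with retweets,
    favourites and replies expressed as fractions of the upper impression
    estimate (assumptions loosely based on the figures quoted in the post).
    """
    impressions = (followers * 0.01, followers * 0.10)
    retweets_or_favourites = (impressions[1] * 0.001, impressions[1] * 0.01)  # 0.1%-1%
    replies = (impressions[1] * 0.0001, impressions[1] * 0.001)               # 0.01%-0.1%
    return {"impressions": impressions,
            "retweets_or_favourites": retweets_or_favourites,
            "replies": replies}

print(rough_engagement(10_000))
# -> impressions 100-1,000, retweets/favourites 1-10, replies 0.1-1
```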

Social media manipulation

Tweets posted from the @RbotP account result in zero engagement. It’s plain to see that the account is fake.

There’s a bot for that too, though. The price of a cup of coffee ($4) buys several hundred retweets. I applied the service to this tweet.

If the account had a human portrait avatar and a legitimate name and biography, followed more people, and tweeted a few more times, it would look legitimate to the untrained eye. If it surfaced in your newsfeed I doubt you’d give it a second glance.

Here’s the issue. I could build a hundred of these accounts, each with a similar profile, then turn them loose on a discussion, issue or hashtag and manipulate discourse on Twitter. It’s the botnet model that the University College London researchers described.

A research project published in IEEE Transactions on Information Forensics and Security in August 2016 documented exactly this issue. It cited evidence of the manipulation of trending topics on Twitter.

The researchers suggest this is a significant issue because trending topics influence the media and the public agenda. Indeed, this type of manipulation has been cited as a potential influence on both the 2016 EU Referendum and the 2016 US Election.

Measure what matters

Social media manipulation makes a nonsense of much of the science of measurement. 

"Practitioners should measure the outcome of activity rather than counting followers, likes or shares," said Mandy Pearce, head of public and partner relations, Plymouth City Council.

Alastair McCapra, CEO of the CIPR, makes a similar point.

"It is precisely the things which are most fakeable that are most measurable. The cult of measurement is powering the tidal wave of fake," he said. 

Conversation on Twitter is increasingly polarised. Fake accounts are put to work to amplify and inflame conversation. You can observe it happening in news and politics day in, day out.

People will always find ways to manipulate new forms of technology. Communicators need to remain diligent and stay on the right side of ethical codes of practice.

We also need to continue to engage with academic colleagues. They are at the forefront of identifying abuse and manipulation.

Application in practice

Here are some recommendations for spotting social media manipulation and staying on the right side of ethical codes of practice.

  1. Review academic research – This is a tricky and under-reported issue; however, the research projects cited in this article explore how social media can be manipulated and, in particular, provide a window into bot activity on Twitter.
  2. Do the right thing – Social networks are built on human relationships. Use legitimate means to build a network and engage with people on social media. Manipulation and fake activity will fail and result in reputational damage.
  3. Work with the platforms – Each platform provides a variety of legitimate paid solutions and tools that will help accelerate your communication objectives. The critical issue is that this activity is transparent to users.
  4. Spot fake accounts – Use tools to check the legitimacy of an account. You can start with my @RbotP account. Botometer and Twitter Audit are useful for Twitter. Network mapping tools will spot bubbles and fake networks.
  5. Measure outcomes – Measure the outcome of social media activity rather than metrics that can be faked, such as followers, likes or shares. The AMEC Integrated Evaluation Framework is a good measurement model to follow.

This blog was updated on 5 January, 2018 with comments from Ben Lowndes, Alastair McCapra and Mandy Pearce.
