Accountability for bias in data and algorithms in public relations
Data has overhauled the potential of the public relations business in the last decade.
When I started my career in public relations, planning involved poorly assembled focus groups at best, and matching media titles to a topic or issue at worst.
We’ve come a long way.
Application of data in public relations
In the last week I’ve used data to estimate the size of a market and identify the best earned and paid media channels; characterised three news issues that motivate a public; and measured responses to a series of behavioural nudges on an internal social network.
There are countless examples of the application of data to daily public relations business.
Colleagues at Ketchum study the impact of language on public behaviour using sophisticated measurement and analysis.
If you give one of our digital strategists a postcode they’ll describe the income, ethnicity and political attitudes of its population. We can tell you the optimal media to reach an organisation’s publics and the cost.
Community managers know the precise moment to publish a message to a social network to generate the greatest response. Like most 16-year-olds, my daughter has a deep understanding of the behavioural characteristics of Instagram. We use technology to gain that insight.
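The posting-time insight above can be sketched in a few lines of code. This is an illustrative toy with invented engagement numbers, not any community-management tool's actual model: it simply averages historical engagement by hour and picks the best-performing hour.

```python
# Illustrative sketch (hypothetical data): estimate the best hour to post
# by averaging historical engagement per hour. Real tools use far richer
# models; this only shows the basic idea.
from collections import defaultdict

# (hour_posted, engagements) pairs from past posts -- invented numbers
history = [(9, 120), (9, 95), (13, 210), (13, 180), (17, 160), (21, 140)]

def best_posting_hour(posts):
    by_hour = defaultdict(list)
    for hour, engagement in posts:
        by_hour[hour].append(engagement)
    # Pick the hour with the highest mean engagement
    return max(by_hour, key=lambda h: sum(by_hour[h]) / len(by_hour[h]))

print(best_posting_hour(history))  # 13 -- highest average in this sample
```

In practice the same idea is extended with audience segments, day of week and content type, but the principle of learning from observed response data is identical.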
There is no issue relating to public attitudes and behaviours that we cannot explore given access to the appropriate dataset. Modern technology makes data capture, aggregation and processing relatively inexpensive.
There has never been a more exciting time to work in public relations, but the rise of algorithms and data in practice creates significant ethical considerations.
In a recent paper, Datafication: threat or opportunity for communication in the public sphere?, published in the Journal of Communication Management, Derina Holtzhausen said that public relations practitioners need to educate themselves on the realities of so-called big data.
Holtzhausen calls for public relations practitioners to get involved in decisions on how algorithms are formulated and targeted.
This starts with education. Practitioners need to work with colleagues in technology to educate themselves about the potential of algorithms and data. We need to have a voice in any discussion about their implementation.
“This will require [practitioners] to serve as activists to ensure social justice. They also need to contribute to making organisational information widely available,” she said.
Algorithms with attitude
Holtzhausen believes public relations needs to own the ethical consideration of algorithms online. It’s already an issue. There are countless examples.
Demand-based pricing is the norm among airline and train operators. If you’ve used a price comparison website you’ll know that data discrimination leads to differentiated pricing structures in the personal finance market.
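A minimal sketch shows how data discrimination creeps into pricing. The rules and numbers below are entirely invented, not any real insurer's model: a quote that varies by postcode looks neutral, but because postcode often proxies for income and ethnicity, the algorithm can silently reproduce bias hidden in its training data.

```python
# Hypothetical pricing sketch: a "neutral" postcode-based multiplier,
# supposedly learned from historical claims data, produces different
# prices for identical cover. Invented values for illustration only.
BASE_PREMIUM = 300.0

POSTCODE_MULTIPLIER = {"AB1": 1.0, "CD2": 1.4}

def quote(postcode):
    # Unknown postcodes fall back to a default loading
    return BASE_PREMIUM * POSTCODE_MULTIPLIER.get(postcode, 1.2)

print(quote("AB1"))  # 300.0
print(quote("CD2"))  # 420.0 -- same cover, higher price, driven by location
```

Nothing in the code mentions a protected characteristic, which is exactly why this kind of bias is hard to spot from the outside and why accountability matters.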
Predictive targeting based on search behaviours is now standard practice. It’s built into the business models of almost all online retail organisations from Amazon down.
We’ve all been followed around the web by an ad retargeting algorithm, and the impact of bias in newsfeeds is well documented. The implication of bias in algorithms on human behaviour applied at the scale of the internet is massive.
As we delegate responsibility for everyday tasks to computers this issue will become more acute. In the near future algorithms in driverless cars will make life and death decisions.
Software developers that write algorithms must be held to account on behalf of the public. Holtzhausen argues that the business of public relations needs to step up to the challenge.