From Bernays to TikTok: the rise of algorithmic propaganda that we didn’t see coming
Edward Bernays’ ideas of persuasion and propaganda are alive and well, operating invisibly through algorithms.
We didn’t cancel propaganda. We coded it.
Edward Bernays’ century-old theory of engineered consent has not disappeared. It has been digitised. Algorithms now apply his logic through platforms that shape what we see, feel and believe, invisibly and at scale.
We associate propaganda with black-and-white posters, authoritarian states and crude messaging. But as Charles J. Wolf argues in his sharp and timely paper From Propaganda to Platform, we’ve misunderstood how propaganda has evolved. It hasn’t been replaced. It has been automated.
What’s striking about revisiting Bernays in 2025 isn’t how outdated his thinking seems, but how relevant.
We often convince ourselves that we’re navigating something radically new, that the digital age demands a completely new playbook. But scratch the surface of today’s platform logic and you find the same old foundations: emotional appeal, engineered consent, and a deep understanding of human psychology.
There are no new ideas. Only new infrastructure.
Bernays by machine
Bernays believed that people act irrationally and respond to emotions more than to facts. He argued that democracy needed guidance from those who understood the principles of mass psychology. He applied emotional cues, symbolism and trusted voices to shape public opinion. Where people once delivered these techniques, machines now do.
It’s why Bernays remains controversial. His work for the tobacco industry, political campaigns and corporate America is often condemned as manipulative, unethical or corrosive to the public sphere.
But here’s the uncomfortable truth for public relations practitioners. Many of the tools he championed are still with us. They’ve been re-engineered into code.
Platforms do the persuading now
In the twentieth century, public relations teams secured endorsements and managed media to influence audiences. Now, that function is handled by YouTube’s recommendation engine, Facebook’s feed and TikTok’s For You page.
These systems don’t prioritise truth. They optimise for attention. Emotional content dominates. Outrage grabs attention. Nuance gets buried.
Algorithms measure what keeps us engaged. If a post sparks a reaction, especially anger or fear, the system serves more of the same. As Wolf points out, the logic is circular. The audience shapes the content it sees, and the content, in turn, reshapes the audience.
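The circular logic can be illustrated with a minimal sketch. This is not any real platform’s algorithm — the names (`Post`, `engagement_score`, `rank_feed`, `serve_and_react`) are invented for illustration — but it shows how a ranker that optimises purely for reactions amplifies whatever already provokes them:

```python
# Illustrative sketch only: a feed that ranks by reactions, where
# higher-ranked posts attract more new reactions, reinforcing their
# position. No real platform API is modelled here.

from dataclasses import dataclass

@dataclass
class Post:
    text: str
    reactions: int = 0  # likes, angry replies, shares -- all count the same

def engagement_score(post: Post) -> int:
    # The system measures reaction, not accuracy.
    return post.reactions

def rank_feed(posts: list[Post]) -> list[Post]:
    return sorted(posts, key=engagement_score, reverse=True)

def serve_and_react(posts: list[Post], rounds: int = 3) -> list[Post]:
    # Each round, higher-ranked posts get more visibility and therefore
    # more reactions: the feedback loop Wolf describes.
    for _ in range(rounds):
        feed = rank_feed(posts)
        for visibility, post in enumerate(feed):
            post.reactions += len(feed) - visibility
    return rank_feed(posts)

feed = serve_and_react([
    Post("nuanced analysis", reactions=1),
    Post("outrage bait", reactions=3),
    Post("holiday photo", reactions=2),
])
```

After a few rounds, the post that started with a small emotional head start dominates the feed, and the gap only widens — which is the point: nothing in the loop ever asks whether the content is true.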
We’re not being persuaded in the traditional sense. The invisible incentives of platform logic are conditioning us.
The Cambridge Analytica moment
The 2018 Cambridge Analytica scandal marked a turning point. The firm scraped personal data from Facebook, built psychographic profiles and delivered hyper-targeted political messages designed to manipulate personality traits. This wasn’t just segmentation. It was emotional micro-targeting on an industrial scale.
Wolf rightly identifies this as a leap. From broadcasting to personalisation. From persuasion to manipulation.
Propaganda becomes participatory
Here’s the kicker. Today’s propaganda doesn’t just flow top-down. It is co-created. Every like, swipe and share feeds the system. That’s what makes it reflexive: shaped by users and refined by machines.
Influencer culture illustrates the shift. Bernays used public figures to confer trust. Now, influence is built on relatability, not authority. Emotional authenticity, or the appearance of it, is more persuasive than institutional backing.
Engagement is not the same as accuracy
Platforms don’t reward accuracy. They reward engagement. That is the economic model.
Wolf makes a critical point. Some modern campaigns aren’t trying to persuade. They’re trying to exhaust. Flooding audiences with contradictory messages doesn’t build belief. It breaks resistance. Confusion becomes the goal. Paralysis becomes the outcome.
Propaganda doesn’t declare itself. It is ambient. Invisible. Built into the infrastructure of everyday digital life.
What must communicators do next?
Wolf’s argument is a challenge to communicators. If influence now lives in algorithms, and emotional manipulation drives engagement, what does that mean for those of us who shape narratives?
Bernays warned that careless propaganda could erode democracy. Today, platforms do the same invisibly and without accountability.
The question for communicators is whether we choose to lead the response or follow the platform's logic. We must take ownership. The responsibility belongs to all of us in public relations practice.
Reference
Wolf, C. J. (2024). From Propaganda to Platform: The Algorithmic Resurrection of Edward Bernays in the Age of Mass Customisation. Independent Researcher.