Facebook and the Global Battle to Sway Minds

Can we protect our societies against “psychographic advertising” – manipulative technologies that stifle democracy?

March 30, 2018

It has often been said that data is the new oil. This is true in politics too — mainly, but not exclusively, in democracies.

Recent events surrounding Facebook and Cambridge Analytica, as well as Russia’s interference campaigns across multiple countries, give new urgency to a pivotal question: How can we counter these technological capabilities, with their potential to stifle democracy? How can we keep them from subverting us by exploiting our openness, as George Soros and many others have warned?

What we are facing is manipulation as a means of interference. Almost anyone – private companies, governments, non-state organizations – with access to users’ data and a certain degree of technical sophistication can carry out such maneuvers.

McKenzie Funk, an Open Society Foundations collaborator and member of the Deca journalism cooperative, calls the underlying mechanism “psychographic advertising.” Anyone can do it, as long as they have enough know-how.

For all the attention now paid to Christopher Wylie and the machinations of Cambridge Analytica, we ought to remember that the first major politician to rely on these psychographic electoral techniques was none other than Barack Obama.

A new kind of disinformation campaign

To guard against the further rise of these kinds of disinformation campaigns – which go far beyond the concept of fake news – we need to do a better job of spreading truthful news worldwide.

A report drawn up by 39 experts for the European Commission calls for commitment to and support for quality journalism. The experts also advocate support for the emerging groups of volunteers that report such manipulative activities.

Crucially, the companies that own the networks must also take their share of the responsibility for the custodianship and use of the personal data they collect.

A code of good practice for such platforms is required. Making that a reality won’t be easy. After all, the business model of the big platforms concerned is that users hand over their data in exchange for free connectivity.

Facebook, for instance, derives income from advertising and from users’ profiles and attention, given that attention is another scarce and valuable resource.

Critics are also calling for more transparency in the functioning of the algorithms used by social media and other services. Rightfully so. A study by MIT researchers has found that fake news spreads farther, faster and more deeply than real news.

This is due not only to the role played by bots, or automated programs, but also to our own fondness for novelty. In other words, we are also partly responsible as consumers and users for what is happening.

Teaching people to protect themselves

Hence, there is an urgent need to teach people how to protect themselves from such manipulation. “Citizens have to be equipped with the tools needed to be able to discriminate between truth and falsehood,” in the words of José María Lassalle, the Spanish Secretary of State for the Information Society and the Digital Agenda.

The public can and should learn how to manage the information they receive and send. We need much more technological and media literacy, which is not just a matter of technical capability. It is something that needs to be taught to adults, as well as to young people in families and schools. It ought to be an integral part of civic education, or its equivalent, in every country.

The EU can also play its part with its protection measures. The new General Data Protection Regulation (GDPR) finally comes into force on May 25, 2018. It is a major, albeit insufficient, step. The platforms and digital services will end up having to apply it outside the EU too, since it is difficult to establish frontiers in the digital realm. That is part of the EU’s global normative power.

The positive role of social media

For all the very real perils that need to be addressed urgently, we also need to be mindful of the very positive role that social media can play in the defense of freedom, democracy and citizen participation.

The fight against disinformation is the excuse many dictators use to restrict communication and freedom of expression.

One need not go as far as China to confirm this. As Yarik Turianskyi of the South African Institute of International Affairs points out, at least 10 African countries – Burundi, Cameroon, Chad, the Democratic Republic of Congo, Ethiopia, Gabon, Gambia, Mali, Uganda and Zimbabwe – shut down social media websites and/or messaging applications in 2016, during or after elections or in the wake of protests.

Other countries had to backtrack. In Ghana, for example, the government felt obliged by popular pressure to re-establish these services, and the opposition ended up winning the election.

Clearly, all over the world, we need a stronger sense of protection against nefarious schemes to abuse social media channels for disinformation attacks and manipulation.

This is only the beginning. But little by little, societies and institutions are reacting. We are going to see changes and new regulations, even if technology often advances faster than the regulators themselves.

Editor’s Note: Adapted from Andres Ortega’s Global Spectator column, which he writes for the Elcano Royal Institute.

Takeaways

It has often been said that data is the new oil. This is true in politics too -- mainly, but not exclusively, in democracies.

Can we protect our societies against "psychographic advertising" – manipulative technologies that stifle democracy?

To guard against the further rise of disinformation campaigns, we need to do a better job of spreading truthful news worldwide.

For all the perils that need to be addressed urgently, we also need to be mindful of the positive role that social media can play in the defense of freedom, democracy and citizen participation.

We are going to see changes and new regulations, even if technology often advances faster than the regulators themselves.