By Charlie Martin
Much of our time at Digitalis is spent helping clients shape their own online narratives rather than having search engine algorithms do it for them. The recent Netflix docu-drama The Social Dilemma has done much to prompt debate on what ‘free’-to-use social media and search engines are doing to our concepts of truth and agency. Millions of us use these platforms daily, but we are generally unaware of the cost. This powerful polemic – authored and presented by Silicon Valley refugees from the big tech companies – is clear about how this new information ecosystem is undermining our personal freedoms and society. The focal speaker in the programme is Tristan Harris, a former Google employee and co-founder of the Center for Humane Technology, which is funded by a range of US philanthropic foundations.
It has struck a chord. Some national newspaper reviews have told readers to “Unplug and Run” and labelled the show “a wake up call for a world drunk on dopamine”. The Cambridge Analytica leak revealed the extent to which our data was being collected; this documentary helps us understand how that data is used on us. It is presented with admirable passion – perhaps even using some of the emotional, manipulative techniques its makers accuse the social media platforms of deploying. But it makes a compelling case.
The documentary argues that there is a price for the personalised experience these platforms provide. The way tech firms use our data has become intrusive and manipulative. We need to understand that what is presented to us online is not always what we are looking for, but what these platforms think we want. The internet has grown into a behemoth of information and content, so much so that intelligent algorithms are essential to efficient browsing. The fundamental idea of gathering information on your digital footprint is not innately wrong; it must simply be acknowledged and regulated.
The French cultural theorist Jean Baudrillard argued that modern media is concerned not just with relaying information, but with interpreting our most private selves for us. Online media guides and influences our opinions and beliefs. The danger lies not in tailored advertising or suggested friends, but in the ‘disinformation for profit’ business model adopted by social media. The programme argues that “algorithms are opinions embedded in code”. They are built, in part, to capture the user’s attention and keep us online for longer so that the platforms can sell more advertising. Because conspiracy theories, scandals and populist narratives can be more engaging than the truth, the algorithms are designed to promote content that captures our attention and time. And once we have seen it, we pass it on: fake news travels six times faster than true news.
Of course, the digital world and its understanding of its users provides luxuries and benefits. There are advantages to websites and programmes which recognise our digital tendencies and provide a more personalised experience – but for this programme’s makers, the line has been crossed. Truth is being manipulated for commercial gain.
Solutions are perhaps less clearly articulated. There is clearly a sense that technology has outstripped regulation, and that the law needs to catch up. The companies have taken on somewhat greater responsibility for managing their content, but they remain trapped in a business model that, by definition, shapes a reality for its users. Our responsibility as users is to be more knowing about the techniques they use, and less blindly trusting of what we see and read. Search engines are primarily marketing tools. Understanding that the online reality presented to us is curated and manipulated is key to preserving a perception of the real world that is not blurred by misinformation.