The Cambridge Analytica scandal has cast data mining, Facebook and democracy in a shocking new light, ultimately driving CA into insolvency. We take a look at how the scandal unfolded.

Words: Becky Kells. Images: Josh Krohn 

Most Facebook users have reached a screen which prompts the sharing of data with a third-party app. But until the Guardian’s Cambridge Analytica exposé went public last month, very few people had contemplated what this sharing of data truly meant.

Cambridge Analytica (CA) – a company which specialised in mining, brokering and analysing data, and then targeting groups based on that data using strategic communication – appeared to have played a significant role in the 2016 US election campaign. The exact scope of CA’s influence on political elections has yet to be fully determined, but the company faces investigations in both the US and the UK, and has now declared insolvency.

Facebook, too, is embroiled in the scandal: the data was pulled from a vast number of Facebook profiles by an app that claimed to be collecting users’ profile information for academic research. Although only 270,000 people gave the app permission to access their data, up to 87 million profiles were compromised through their links to the friend networks of those initial users. Many of the details which now unite CA and Facebook as suspicious parties came from the Guardian’s special report.

CA was playing with the psychology of an entire nation – Christopher Wylie

Christopher Wylie, a former employee of CA, has been the main whistle-blower, showing documents to the Guardian to provide a picture of CA’s – and Facebook’s – movements. Speaking of CA’s involvement in the 2016 election, Wylie described how the company was “playing with the psychology of an entire nation in the context of the democratic process”.

Facebook first denied the breach, but is now complying with investigations and making significant alterations to its privacy policy. CA denies Wylie’s claims, and is seeking legal action against the Guardian over its series on the company. In a statement on its website, CA said: “Chris Wylie was a part-time contractor who left Cambridge Analytica in July 2014 and has no direct knowledge of the company’s work or practices since that date.”

From the moment the Cambridge Analytica case became public knowledge, difficult questions have been raised about data mining, storage and use in the age of social media. It is not uncommon for social media companies to store the data of their users – and for the most part, it’s consensual. When users sign up to a website, they enter an agreement with that website in which they set their privacy settings and agree to what data can be stored. Some argue that this process is beneficial – it brings people adverts and information that they care about and are interested in. Others argue that it isn’t – it limits the internet experience. This is a particular issue in the build-up to election campaigns, when access to balanced news and a variety of perspectives is essential.

If you think it's OK to cheat during elections or referendums, then what's the point of a free society/democracy?

Before we even consider how the data is used, the fact that it is being stored at all is issue enough for many. The make-up of a social media profile, together with smartphone data and wider online presence, is enough to supply a deep impression of the individual behind the screen. The result is an accurate depiction of everything from what they are likely to buy to how they are likely to vote in an election – as well as what type of message they would find engaging. Many users have agreed to this, in some context – but they are likely unaware of the reach that their profile has, and even less aware that the process of collection and analysis is taking place at the hands of companies such as CA. The current focus on CA has raised an important question: can much more sinister activity be carried out, disguised by broad industry terms such as “branding” and “research”?

Cambridge Analytica’s role in the 2016 US election began during the presidential primaries and continued until Trump’s election. A number of activities during this period have sparked suspicion: CA chief executive Alexander Nix wanted to collaborate with the WikiLeaks founder, for example, to find Hillary Clinton’s now-infamous deleted emails. Yet CA now finds itself embroiled in a much more serious international US inquiry: one that looks into Russia’s alleged attempts to interfere with the US elections, and questions whether CA had a role in coordinating the spread of Russian propaganda at the time.


CA has also been called into question much closer to home. The information provided by Wylie to the Guardian indicates that a company linked to CA may have had a role in targeting voters in the EU referendum – specifically, it allegedly spurred “persuadable” voters into activity via social media campaigns.

The claims of Brexit involvement are bolstered by Shahmir Sanni, a digital strategist and volunteer who worked for a pro-Brexit campaign group called BeLeave. Back in 2016, Vote Leave had hit its campaign spending limits – a constraint in place to ensure that no campaign has a financial advantage. Just days before voting day, a £1 million donation was made to the campaign. Sanni was led to understand that BeLeave would be entitled to this money, but the £1 million never reached any actual Vote Leave subsidiaries or groups. According to Sanni, the money went straight to Aggregate IQ (AIQ). The Guardian’s exposé marked out AIQ as an organisation with direct links to CA; Wylie claims that he worked for AIQ at the same time as he worked for CA. It was alleged that the £1 million was used to tailor Brexit messages to people’s online profiles, in an attempt to influence their voting decisions.

In 2017, an investigation began into the Vote Leave campaign’s alleged breach of spending limits. Vote Leave maintained that it had acted within the law, stating that it did not have any involvement with other campaigns with similar ideologies – such as Sanni’s group, BeLeave. Sanni directly disputed this: he told the Guardian that there was a Google Sheet from which leading members of Vote Leave removed themselves in 2017, when the investigation into spending began. While investigators have now determined that a relationship between Leave.EU and CA did not “develop beyond initial scoping work”, Leave.EU has been fined £70,000 for cheating in the referendum.
As Arron Banks of Leave.EU scrambled to defend the cheating, Sanni responded on Twitter, stating: "Justifying cheating because it was by a 'small amount' or because it 'happened a long time ago' is undermining democracy. If you think it's OK to cheat during elections or referendums, then what's the point of a free society/democracy? No one is above the law."


There are commentators who have downplayed the efficacy of what CA does – including, bizarrely, CA itself, which sought to remind critics that elections are won or lost by candidates rather than by data science. Such a stance is backed up by some political scientists, who suggest that online data mining can produce evidence of changes and trends, but cannot shed any deep light on the psyches of users – much less predict how they may be compelled to vote.

The role of Facebook in all of this has not escaped scrutiny, either. While it is co-operating with investigations, and has announced the suspension of SCL Group (the parent company of CA) accounts which failed to delete improperly obtained data, Facebook has still been widely criticised in the press. It did not alert users to the data breach in 2015, despite knowing that it had happened. To explain this, Mark Zuckerberg appeared before Congress on 11 April for questioning. So far, he has declined to attend similar hearings proposed by the UK, instead sending subordinates across the pond.

Some changes to Facebook's policies on data sharing have already been announced: groups and events will see more stringent privacy features, and organisations will have to prove their benefit to a specific group before being added as potential influencers. Organisations will be banned from accessing private event pages and invite lists altogether. There will also be tighter regulations on check-ins, likes, photos and posts – religion, politics, relationship status, education and work, fitness and media consumption will all be inaccessible to apps. While this will no doubt come as a positive development for many users, it also serves to highlight just how many facets of Facebook profiles organisations such as CA had access to up until this breach went public. The Cambridge Analytica and Facebook scandal has highlighted a worrying blurred boundary between online worlds – previously seen as frivolous and convenience-based – and some of the most serious aspects of the democratic process. It remains to be seen exactly how CA and Facebook will share the blame – but with CA on the journey to insolvency, and many Facebook users across the world feeling compelled to delete their accounts, it could be that involvement in politics was one step too far.