In its report released on Monday, the House of Lords’ Democracy and Digital Technologies Select Committee produced a comprehensive overview of problems it considers to be of high importance to a functioning democracy.
Nevertheless, as with any document that attempts to be so comprehensive, it has strong and weak points. The evidence it commands is of very high quality. The recommendations for democratic reform are good, although some important points could be expanded. However, turning to the policy of the moment, Online Harms, the document places its faith in a state regulator with as yet undefined powers as the best way to deal with the limits of acceptable speech.
Let us start with the good parts:
We welcome many of the reforms in the report that relate to data-driven political campaigning. In particular, the recommendation to put the ICO’s Draft Framework Code of Practice for the use of Personal Data in Political Campaigning (“the guidance”) onto a statutory footing is very important.
Similarly, the recommendations around updating and harmonising electoral law, and increasing the powers of the Electoral Commission to investigate bad actors with the threat of meaningful fines, should be applauded. These are all common-sense suggestions that risk slipping down the policy agenda in the space between elections.
We also welcome the scrutiny placed on UK political parties’ use of personal data. Quoting the report, the Labour, Liberal Democrat and Conservative parties’ “understanding of the provisions of the DPA appear to differ widely and seldom conforms to the best practice suggested by the ICO”.
In addition, the report questioned the legality of parties’ processing of special category data, particularly data other than political opinions (which parties are entitled to process under certain conditions).
Most explosive, however, is the allegation that the Labour Party and Liberal Democrats’ statement that they do not purchase data “directly contradicts the evidence given to us by the Open Rights Group”. Several other contributors also gave evidence suggesting that this was unlikely to be the case. When asked for confirmation that they did not use “data brokers’ services” during the 2019 General Election, the Labour Party refused to reply.
Similarly, the Conservative Party did not state whether they mix commercially available data with other personal data to profile individuals. ORG’s recent report provides strong evidence that both parties do. The only adequate excuse would be a misunderstanding of what a ‘data broker’ is, what services they provide, and the extent to which those services involve personal data.
This report could say more about the use of data in politics. Here are three questions that still remain:
- The report recommends that the ICO’s draft guidance is put on a statutory footing. What does this mean?
This could mean a range of things. The guidance should provide greater legal clarity than the explanatory notes to the DPA, which list an expansive range of activities as “democratic engagement”. Legislators must be careful to ensure the guidance is given significant legal weight and has real teeth.
- Where are the audits carried out by the ICO on UK political parties?
Elizabeth Denham said in evidence that “We are really fortunate in that we can audit not just in the regulated period of party activities but 365 days a year and get to the bottom of the data aspects of political campaigning.” We agree, and would like to see such auditing become standard, legally required practice for the ICO. The ICO did in fact carry out such an audit of UK political parties back in 2018, yet two years later there is still no sign of it being published. Can the ICO commit to a date for publication?
- Shouldn’t a valuation of “free” datasets be included in the Electoral Commission’s expanded remit?
Elizabeth Denham said in evidence that “It should not be the party with the biggest budget which can reach the Electorate.” However, much of the data used by political parties is obtained both legally and freely. Nevertheless, it can confer a great electoral advantage, particularly to large incumbent parties who have built up datasets over the course of successive campaigns.
There needs to be a recognition that data, just as much as money, is the currency of modern campaigning. Indeed, the use of data obtained for free or at very low cost (such as email addresses) can itself serve as a cost-cutting measure. There should be a statutory duty on the Electoral Commission to accurately value these datasets and incorporate them into a cost-based regulatory framework.
Open Rights Group made this point to the Select Committee. Louise Edwards of the Electoral Commission told the Committee that the value of data is “difficult to calculate”. This difficulty is not a sufficient reason to avoid exploring policy solutions to this pressing issue. There are options, including tracking the value at which similar data is supplied commercially.
At the core of these issues, as noted by the Select Committee, is trust. The trust of the electorate is needed for democracies to function, and it is put at risk by these practices. The Select Committee report has made good progress here, but Parliament must go deeper and further if it is to restore the UK electorate’s trust in our democracy.
The Select Committee identified the Online Harms agenda as a fundamental step to counter digital threats, encouraging the Government to introduce legislation within a year. This includes the “duty of care”, whose definition is likely to be broad and, in the view of the Lords, should also encompass misinformation, “democratic harms” and radicalising content. On the whole, we have reason to believe this may be a hasty and ultimately contradictory approach.
First of all, placing misinformation in the hands of a state regulator is perhaps the clearest demonstration of why the current Online Harms model could cause extremely difficult problems for free expression. Asking companies to remove legal political speech on behalf of Ofcom should in itself sound problematic to the custodians of our constitution in the House of Lords.
On top of that, the “duty of care” would make online companies responsible for our safety by subordinating our freedom of speech to a balancing exercise against an ever-extending range of conflicting interests, harms and activities: a “health and safety” preventative approach that would inevitably sacrifice legitimate forms of expression in the name of their potential for harm or controversy. Even more worryingly, decisions on what constitutes acceptable speech, the thresholds for preventative action, and the nature of evidence needed to demonstrate real and substantial risks would be taken in the first instance by social media companies, which are naturally inclined to play it safe rather than to uphold our freedom of speech.
Finally, it is worth noting that the Online Harms regulation was promoted under the premise that social media platforms were neutral actors, willing to “acknowledge their responsibility to be guided by norms and rules developed by democratic societies”. However, the evidence emerging from the report tells a rather different story.
The Lords extensively analysed the role that online platforms play in shaping the online environments we live in, for instance by designing and training algorithms that promote polarising views and clickbait. This comes in conjunction with news exposing how Facebook and the U.S. Government agreed on a content moderation policy suited to their own interests, and how far-right outlets were deliberately allowed to break the rules and weaponise Facebook groups in order to spread misinformation and inflate their reach. Given these premises, we are of the view that the Lords should have asked the Government to reconsider its approach to Online Harms: entrusting social media with the duty of policing free speech seems rather inappropriate when online platforms have been actively siding with the bad actors they are supposed to protect us from, whether for their own motives or as the result of political pressure.
Instead, a radical shift of approach is needed: the focus should be placed on making online platforms more transparent, accountable and open to external scrutiny. Accountability can be created through the courts and independent non-state regulation. It should not be subject to potentially politically guided regulatory action, nor should state regulation of online speech be set up as the UK’s good example to Russia, Hungary, Turkey and China.
The Select Committee lacked some incisiveness: for instance, the report does little to spur regulators to take decisive action against political parties, and does not take full stock of the groundbreaking implications of its findings for the Online Harms agenda.
On the other hand, the Select Committee acknowledged the importance of data-driven electoral practices for the functioning of our democracy, and the need to set higher legal standards in order to hold political parties to account. The report also brings an important perspective on the role of social media giants in fostering the threats we face on the Internet.
A shared point of concern for both ORG’s Data and Democracy and Online Harms work is the proposed extension of the “duty of care…to preventing generic harm to our democracy”. It is unclear what exactly this means; indeed, no examples of generic harms to democracy were explicitly outlined in the report. The invocation of democracy, however, extends the remit of the proposed Online Harms legislation to cover the democratic character of our society. The potential for both abuse and mundane bungling of what this means is, by our account, more likely to be a hindrance than a help to UK democracy. The Government should not point Ofcom towards regulating online democratic debate.
It now falls to the Government and the UK Parliament to translate this work into actionable and effective steps to restore trust in the digital age. In doing so, they won’t be left alone: ORG will keep fighting against censorship. We will press for political parties to be properly regulated in their use of personal data and technology, just like the rest of us, so that they are prevented from undermining the democracy that sustains us all. In the meantime, you can follow our updates, check what political parties think you are, or join us and help us shape the future of our digital lives.