Blog


January 31, 2020 | Pascal Crowe

APPG on Electoral Campaigning Transparency adopt ORG reforms to electoral landscape

Last summer, Open Rights Group gave oral evidence to the All Party Parliamentary Group (APPG) on Electoral Campaigning Transparency. It was convened by Fair Vote, the Electoral Reform Society and Stephen Kinnock MP. This APPG has been leading the charge to make our electoral laws fit for the digital challenges of the 21st Century. Its final report, ‘Defending Democracy in the Digital Age’, was published earlier this January.

Many APPGs are mere vanity projects for ambitious MPs that can lose momentum and fail to deliver their promised outputs. This is not one of those. The Secretariat and members of the APPG should be congratulated for such a speedy delivery. This speaks to the urgency of the issues at hand.

The laws that regulate how much an election campaign can spend, and the ways that data are used in elections, are increasingly intertwined. Not least, this is because the value of datasets used by a campaign is rarely captured by regulators, although this is nominally already required by regulation. This is partially because all spending reporting happens after the fact, so there is no way to track spending in real time, or assess if the campaigns themselves are being truthful about the size of their campaign assets.

Open Rights Group’s (ORG) Data and Democracy Project made several key recommendations to address this issue, which were adopted in the final report. Here is a selection:

  • Data sets should be assigned a market-based monetary value, which can then be included in spending regulation and sums. Although this is nominally required by existing regulation, a new calculus needs to be applied to work out (and perhaps limit) their changing financial value in specific campaigning contexts.

  • To facilitate this, the Electoral Commission and the ICO should form a joint task force to conduct ‘data audits’ of a campaign’s data assets, such as data sets, algorithms, and social networks. These audits should also screen for illegal and unethical behaviour.

  • The Electoral Commission and the ICO should reserve the right to carry out ‘drug tests’ during elections, to ensure that campaigns are complying with electoral and data protection law.

Although changes to the laws that regulate this activity have never been more needed, frankly that change has never seemed further away. The replacement of the DCMS Select Committee chair, Damian Collins, has come as a shock to many. Collins’ tenure was defined by its fierce scrutiny of the digital campaigning landscape and the actors within it. Many eyes are watching his successor, Julian Knight, who ultimately gets to decide the committee’s direction of travel. There is concern that digital campaigning will now play second fiddle to scrutiny of the BBC.

ORG hopes that the new chair continues his predecessor’s tenacity with regard to electoral reform. In particular, he should start by safeguarding the work of the sub-committee on misinformation. We look forward to working with him on these vital issues.

[Read more]


January 29, 2020 | Javier Ruiz

Is ethical Ad-Tech possible?

Open Rights Group organised a panel discussion on whether Ethical Ad-Tech is possible at the CPDP 2020 conference in Brussels.

Last week ORG was in Brussels at the main annual privacy conference in Europe, CPDP, which stands for Computers, Privacy and Data Protection. For the 2020 edition the official theme was Artificial Intelligence, with many sessions dedicated to facial recognition. This focus on AI is not surprising given that the European Union has made the regulation of these technologies one of its main priorities this year.

CPDP2020 Ad-tech panel

Another digital rights issue covered in the event was online advertising. Open Rights Group organised a discussion panel to tackle the question of whether “ethical ad-tech” is possible. Our chair was Ravi Naik, the human rights lawyer leading some of the most interesting privacy-related legal challenges in the UK, including several involving ORG.

Karolina Iwanska, a Mozilla Fellow with Poland-based Panoptykon Foundation, explained how most online advertising works through real time auctions where ads are tailored to the visitors of webpages and users of apps. This is achieved through extensive tracking of our behaviour, with many organisations recording all our activities on the web and categorising us through labels. Our profiles are broadcast to thousands of ad brokers while a webpage is loading — that’s the main reason they are so slow nowadays — and the highest bidder will show you their ad.
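The auction flow described here can be sketched in code. This is a deliberately simplified, hypothetical model: real RTB systems run on the OpenRTB protocol through ad exchanges, demand-side and supply-side platforms, and the function and field names below are invented for illustration.

```python
# Hypothetical, highly simplified sketch of a real-time bidding (RTB) auction.
# The privacy problem described in the text is visible in the structure itself:
# EVERY bidder receives the visitor's full profile, not just the winner.

def build_bid_request(user_profile, page_url):
    """The publisher's ad slot is offered along with the visitor's profile."""
    return {"page": page_url, "profile": user_profile}

def run_auction(bid_request, bidders):
    """Broadcast the request to every bidder; the highest bid wins the slot."""
    bids = []
    for bidder in bidders:
        price = bidder(bid_request)  # each broker sees the profile, win or lose
        if price > 0:
            bids.append((price, bidder.__name__))
    if not bids:
        return None
    return max(bids)  # highest price wins the impression

# Example bidders: each one values the visitor differently based on the profile.
def car_advertiser(req):
    return 2.50 if "cars" in req["profile"]["interests"] else 0.0

def travel_advertiser(req):
    return 1.20 if "travel" in req["profile"]["interests"] else 0.10

profile = {"interests": ["cars", "news"], "age_bracket": "35-44"}
request = build_bid_request(profile, "https://example-news.site/article")
winner = run_auction(request, [car_advertiser, travel_advertiser])
print(winner)  # (2.5, 'car_advertiser')
```

The point the sketch makes is that the broadcast step, not the winning bid, is where the data leaks: losing bidders still received and can retain the profile.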

Karolina Iwanska: ad-tech is broken

The whole process is a massive breach of the privacy of millions of internet users: profiling with dubious consent and then sending the information, including sensitive data, to an unknown number of entities without any control or security.

Mobile apps do not fare much better. Gro Mette Moen works at the Norwegian Consumer Council, who are very active on digital privacy. She presented their latest report — Out of Control — which looks at app advertising and user tracking. She painted a stark picture in which many apps broadcast all forms of personal data to advertisers and third parties without users realising. The dating app Grindr, which describes itself as “the world’s largest social networking app for gay, bi, trans, and queer people,” was singled out as a particularly bad example.

Norwegian Consumer Council Grindr data

ORG has been doing a lot of work in this area. We have led complaints to the ICO that have triggered a full investigation into the sector, and we co-ordinate a Europe-wide effort to bring similar complaints in other countries, sixteen so far. The UK regulator is currently dragging its feet on actual enforcement despite acknowledging the unlawfulness of these practices.

One of the main concerns we hear about forcing online advertisers to comply with the law is that it would grind the internet to a halt. Thousands of online publications would be forced to close down, leading to job losses and a poorer informational environment and public sphere. In short, ad-tech is too big to fail.

The answer to this situation should be the development of alternative business models that respect human rights and are sustainable. Luckily, our panel showed that these alternatives are already a reality.

Ster.nl is the commercial agent for the Dutch public broadcaster, tasked with generating a substantial part of their revenue through advertising. Tom van Bentheim and Linda Worp explained how the organisation moved from what they call “consent based advertising,” i.e. profiling and real time bidding, to a “contextual advertising” model. But why would they do this? The organisation was compelled through regulatory pressure as a public body to obtain proper opt-in consent to personalised ads. This saw their consent rate drop from 100% to 10%, showing how broken the system is and forcing them to look for alternatives to maintain their revenues.

Their contextual advertising model works through sophisticated analysis of the content they provide, including the text subtitles of multimedia pieces. Content is labelled and the most relevant adverts are placed on each piece. The logic behind this approach is that the right people will click through in sufficient numbers, even if the ad is seen by others who don’t fit their idea of the typical viewer. This is going back to how advertising worked before profiling.
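A minimal sketch of this contextual model might look like the following. The topic labels, keyword sets and matching logic are all hypothetical: Ster's actual pipeline analyses content (including subtitles) with far more sophisticated methods, but the essential property is the same — no user data enters the process at any point.

```python
# Hypothetical sketch of contextual ad matching: content is labelled from its
# own text, and the ad whose target topics best overlap those labels is placed.
# Note that nothing about the *viewer* is used anywhere in this pipeline.

def label_content(text, topic_keywords):
    """Assign topic labels based on keywords found in the content itself."""
    words = set(text.lower().split())
    return {topic for topic, kws in topic_keywords.items() if words & kws}

def best_ad(content_labels, ads):
    """Pick the ad whose target topics overlap most with the content labels."""
    return max(ads, key=lambda ad: len(content_labels & ad["topics"]))

TOPICS = {
    "motoring": {"car", "engine", "driving"},
    "travel": {"holiday", "flight", "beach"},
}

article = "New electric car models make city driving cheaper"
labels = label_content(article, TOPICS)  # {'motoring'}

ads = [
    {"name": "ev_leasing", "topics": {"motoring"}},
    {"name": "package_holidays", "topics": {"travel"}},
]
print(best_ad(labels, ads)["name"])  # ev_leasing
```

The design trade-off is exactly as described in the text: everyone reading the motoring article sees the motoring ad, whether or not they fit the advertiser's idea of a typical car buyer, and the bet is that enough of the right people click through anyway.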

The big question is how this approach compares to privacy intrusive ad-tech. The results are really encouraging. In a controlled split test experiment Ster.nl saw the same conversion rates for car adverts with or without privacy intrusion. In the last year their overall conversion rate has increased by 25% after dropping “consent” and profiling, with some pieces seeing up to 70% increases.

Ster.nl contextual advertising

Critics of contextual advertising point out that it may not work for smaller publishers and that it is too early to tell. Our panel also included the perspective from the incumbent industry, provided by Nicola Cain, one of the leading privacy law practitioners in the UK who works for a variety of companies in the sector.

Nicola acknowledged that the complexity of the ad-tech ecosystem makes it impossible to give users realistic information on how their data is used and by whom. She nevertheless argued that ad-tech is overall a force for good by providing revenue for publications. Nicola explained in detail the legalities of how ad-tech works and hinted that industry is preparing for major changes to comply with privacy laws. Unfortunately, it is unclear what these may be beyond the minimal proposals we have already seen from IAB and Google, and whether there is any sense of urgency.

The panel received very positive feedback and we are looking forward to seeing more successful examples of alternatives to the real-time-bidding privacy invasive model that currently prevails in the ad-tech sector.

You can find the presenter slides below. We will link to the video of the panel as soon as the conference organisers make it available online.

Norwegian Consumer Council

Online Advertising - presentation slides

Ster.nl

Karolina Iwánska

Is Ethical Advertising Possible? - presentation slides

Out of Control - presentation slides


[Read more]


December 11, 2019 | Javier Ruiz

Leaked UK US trade talks risk future flow of data with the EU

We come to the end of the 2019 election campaign, which has seen some huge controversies. Jeremy Corbyn hit the headlines when he claimed that the Conservative government had agreed to “sell-off the NHS” in a post-Brexit free trade agreement (FTA) with the US. The source of this explosive claim was a trove of leaked documents that apparently had lain quiet for over a month in an online Reddit forum without garnering press attention. There are plausible accusations that Russian trolls may be behind the leak, but this should not deflect from the substance of the documents.

The leaked documents include the minutes of extensive discussions on Digital Trade, including discussions to create an unrestricted “free flow of data” between the two countries, which would put the UK on a collision course with the EU. We already knew that this was one of the negotiating objectives of the US side, but we now have confirmation that the UK is willing to go along with this agenda.

These minutes are part of a series of bilateral discussions involving hundreds of civil servants from both countries that have taken place over the past two years, and provide some really important insights into the logic and priorities of trade negotiators. The scope is very broad, covering the full range of issues that could be included in the UK-US flagship FTA the government hopes to fast-track after Brexit. Headlines have been generated on many topics, including drug pricing, agriculture and product standards, and as trade and policy experts pore over the contents we can expect more.

The documents confirm that the core issues discussed under digital trade correspond more or less to the US public negotiating objectives that were released in February 2019: cloud services and the interconnectedness of businesses, cross-border data flows, provisions for preventing computer facility localisation requirements, and stronger protection for algorithms. These provisions are found in various free trade agreements, including the recently signed US-Mexico-Canada Agreement (USMCA), thoroughly discussed by UK and US officials as the state of the art in digital trade.

US digital trade policy aims to protect Silicon Valley

The minutes make clear that the US digital trade agenda is not just about allowing SMEs in poor countries to sell their wares to a global market, but primarily an active policy to protect large internet companies from regulation.

US trade officials explain in the reports that various digital trade provisions on data flows are designed to solve problems for cloud services. The US claims to have received complaints from cloud companies that “their decisions were made based on borders instead of market or efficiency and the US addressed the concerns in the digital trade chapter.” The US’s stated position is caveated with the aim of enabling national regulators to do their job, but the language of the exceptions in the actual texts of the FTAs - e.g. Art 19.11 USMCA - makes it very hard to implement these in practice.

The notes discuss specific provisions for “interactive computer services”, commonly known as platforms. These measures include policy issues that go to the core of internet regulation, including the sometimes difficult balancing of the human right to freedom of expression and the protection of vulnerable internet users. For example, intermediary liability for platforms is currently being discussed extensively in the UK “online harms” white paper and in the coming review of the E-Commerce Directive in the EU. These are important issues that go beyond trade, and policy makers should not be constrained by an FTA when making their assessment.

Free flow of data with the US risks compatibility with the EU

The US and the UK aim to have a completely free flow of data between the two countries, and believe that this should not create problems for the UK to simultaneously keep similar arrangements with the EU in continuity of the GDPR regime. We are not convinced that this is possible, and renowned privacy scholars such as Graham Greenleaf and Kristina Irion have raised concerns about the compatibility of free trade agreements on data and GDPR.

The leaked notes restate the official position that the UK is looking at a bespoke deal with the EU based on adequacy. However, the detailed notes show that divergence from GDPR is on the table. DCMS officials are quoted as saying that “There are countries that have adequacy decisions from the EU that have signed up to free flow provisions.” This is the first time that the UK has explicitly set this position out on paper, albeit internally to their US counterparts and not to its own citizens. US officials are reported as saying that “conclusions on discussions with EU are that there is no legal reason why you can’t commit to free flow and have adequate data protection – such as through GDPR.” Some countries such as Japan have an EU adequacy decision and are signatories to the free flow of data provisions in CPTPP, but it remains to be seen whether this situation creates any conflicts in the future.

Despite the emollient tone about the UK maintaining GDPR-based compatibility with EU norms, US officials are simultaneously undermining the EU model, saying that “the US will want to engage with the UK on the best approach around its future international transfers model... and are proponents of APEC-CBPR model which is based around individual companies rather than whole legal systems. Adequacy is a flawed system that cannot become a global standard and is very difficult for developing countries in particular to adopt.” US officials go on to explain that “the US has been working with Japan, who are seeking adequacy and operate the APEC-CBPR system. A mapping exercise took place mapping CBPR against the EU corporate rules system, and it was discovered that while there were differences, they were not as extensive as one would presume.”

The US nudging the UK away from GDPR and EU rules is problematic for the future relationship with Europe.

The EU Adequacy decision on Japan explicitly mentions the APEC-CBPR as an example of rules that fail to “create a binding relationship between the Japanese data exporter and the third country's data importer of the data and that do not guarantee the required level of protection”.

Both the UK government and the European Commission urgently need to explain whether these two approaches are compatible, and what impact the US FTA as discussed would have on future data flows between the UK and the EU.

Localisation of data and computer facilities

The latest digital trade measures in FTAs include a ban on the forced localisation of computer facilities. This is a complex issue. On the one hand some technology companies complain that having to build a data centre in every country where they operate is not cost effective or even energy efficient, and there is an element of truth to this. Besides, forced data localisation has traditionally been associated with authoritarian regimes trying to control the communications of their own citizens. But this has changed in recent years. Since the Snowden revelations there has been a breakdown of trust in US-based services, with various countries around the world, including Brazil, openly developing policies for localised infrastructure and even alternative basic internet technologies. This last point is also addressed in some trade policies, which ban governments from forcing specific technologies or standards.

More recently, a new crop of Chinese digital giants - Huawei, Baidu, etc. - has started to challenge US dominance of cyberspace, leading to a new global race over technologies such as machine learning and 5G. There is a renewed interest in national innovation strategies and the use of data for national economic growth, with a prime example being India.

Using a different approach to data localisation, the UK also treats personal data as a national economic asset. The care.data debacle was an attempt to use the health data of NHS users to attract investment and generate economic growth, and although that programme was stopped the underlying strategy has not been abandoned.

Some civil society organisations have embraced this framework of data nationalism and localisation as a path towards global economic justice and potentially a more equitable and decentralised form of internet governance. This new “national liberation” of data is presented as a reaction to a new wave of colonialism, this time by digital behemoths. Rights groups instead tend to view with suspicion claims of national interests over personal data, and see data privacy as a fundamental right of individuals that needs to be protected against the intrusive practices of both companies and governments.

The leaked documents show that regulations may require data localisation in a variety of contexts. US officials stated in meetings with their UK counterparts that “decisions about localisations of computer facilities should be driven by costs and climate requirements involved, not by regulations. The scale of interconnection means that costs are driven down dramatically”. However, they also showed that the US itself is bound by regulation and struggles with these issues, saying that “the most problematic area within the data localization issue is health information and the HIPAA (Health Insurance Portability and Accountability Act, 1996), which dictates that cross-border data flows are allowed as long as certain standards are met.”

A recent example of localisation is a proposed Spanish law that will force all public sector computing facilities to be located within the EU. This comes as a response to the use of online services by the organisers of the independence referendum in Catalonia.

A ban on the disclosure of source code and algorithms will hamper accountability and development

The documents confirm that the US wants to create new protections for algorithms, which generally fall outside of the current framework of intellectual property. Some free trade agreements already contain restrictions on the ability of countries to demand disclosure of software “source code”. The argument is that some countries will use forced transparency requirements for unfair competition, i.e. stealing other people’s work. The source code of software is the human-readable text that engineers write to create computer programmes. Source code is covered by copyright and there is a well-developed global system of protections. On occasion there is a need to force the disclosure of the source code of software for security or other accountability reasons. A ban on source code disclosure is problematic, even if accompanied by an exception for regulators and courts, as it sets a higher burden of proof and opens the way for litigation. More broadly, technology transfer has been a common vehicle to advance economic development in poorer countries, and a ban on software disclosure could put a stop to that for the digital age, locking poorer countries into a subservient position.

The current discussions between the UK and the US want to go even further, “from purely proprietary source code issues to the proprietary algorithms that support software.” This is a very worrying development because algorithms as such are not always covered by intellectual property protection in Europe. Algorithms are sets of rules to be applied to data, and despite their growing complexity are generally considered as ideas, and intellectual property does not protect ideas, which collectively belong to humanity at large, only specific developments of those ideas and for a certain period of time.

In the field of patents, algorithms are considered mathematical methods and can only be patented as part of a wider technical solution, but once patented the algorithm is disclosed to the public. That is the essence of patents: disclosure in exchange for a temporary monopoly that supposedly benefits society by extending the state of the art in a particular technological field. Algorithms are not covered by copyright in the same way as source code. A specific embodiment of an algorithm into source code could be copyrighted, but not the basic set of rules by themselves. There are also certain protections for algorithms as trade secrets, and trade agreements also seek to strengthen these. Tech companies see their complex algorithms as something that increasingly deserves protection in its own right, and the US is attempting to use free trade agreements to provide for this.

Why is the UK doing this?

There are many other issues in the leaked minutes, some that could be relevant to a genuine e-commerce discussion, such as e-signatures and authentication. Other issues covered are positive in principle but the actual content of the discussions is too shallow and could be seen as a step back. This is the case, for example, with consumer protection for online activities, which is stronger within the EU framework than in free trade agreements.

Other issues are altogether so complex and subject to such huge legislative efforts in Brussels that it would seem foolish to try to lock the UK into any positions in a deal with the US. Telecoms and e-privacy regulation (spam, cookies, etc.) are prime examples of these cases. The US is specifically asking for “a lighter touch approach” in the regulation of “value added” online communications services like Skype, “and in particular not to treat them like public Telecoms providers, e.g. with requirements to make their services generally available”. This is one of the hot topics currently being debated at the EU level.

Generally, the documents show that the US has been driving the agenda, with the notes from the initial meetings in 2017 being mainly reports on the US position, with UK civil servants having little to add. However, the UK is not a passive actor. Officials are aggressively pursuing a strong trade liberalisation agenda, and throughout the reports we can see the evolution of their thinking from generic questions such as “could there be something more ambitious than TPP in the future?” towards more concrete positions. Interestingly, it appears that the UK had been at the forefront of the plurilateral digital trade efforts at the WTO earlier in 2017, before the US got on board, with Liam Fox lobbying the Director General of the organisation.

The UK has developed a framework for its digital trade priorities around various themes: supporting jobs and protecting citizens, etc. This is the first time that ORG has heard of this framework, despite having had engagement meetings with officials from DCMS and the Department for International Trade on many occasions. UK officials are quoted in the leaked minutes as elated at “the first chance for the UK to outline and discuss our digital trade priorities in any international forum. We were able to emphasise to US counterparts that we had chosen to share them with the US first.” A lack of domestic transparency in trade discussions is typically defended with the argument that you don’t want to give away your objectives to your negotiating counterparts. It is unclear why the UK government feels safer discussing their trade priorities with the US than with its own citizens.

Despite the advances in the trade policies manifested in the leaked documents, what the UK position lacks is a clear statement of the specific interests defended in the pursuit of these measures. The US is very upfront on the need to protect US internet platforms from regulations restricting personal data flows or forcing them to disclose their algorithms. The UK is defending the service sector in generic terms, and foreign direct investment in digital, but the government has not explained how British businesses will benefit from the introduction of specific measures in a deal with the US. At face value it would appear that this part of the agreement would disproportionately benefit the US.

The documents also show that trade officials have a very clear understanding of the far-reaching implications of these measures, stating that “many companies across numerous sectors, including agriculture, retail and financial services, are using equipment increasingly reliant on cloud services and Artificial Intelligence.” The Digital Trade agenda will affect the whole of the economy.

We hope the new government - whether led by Labour or the Conservatives - will improve their transparency and engagement, with full disclosure and discussion of the trade priorities with civil society. We also expect the government to explain how the free flow of data negotiated with the US will not harm the existing flow of data with the EU.

[Read more]


December 09, 2019 | Amy Shepherd

Profiling, Political opinions, and Data Protection - The Legal Background

We’re campaigning to stop political parties abusing personal data in their rush to try and win elections. We’re worried that these practices are violating everyone’s right to privacy and eroding the trust and integrity in our democratic system through the increased opportunity for micro-targeted social media ads.

In recent days, we’ve written about what types of personal data parties are processing - they’re scraping electoral registers and other public databases and even buying up commercial data-sets to find out your address, age, marital and employment status, spending habits and social media activity - and possibly other things as well.

We’ve talked about how parties are using this data collection to profile and manipulate you - they’re scoring your likelihood of voting for particular parties or positions, ranking you in order of interest to them and categorising you according to demographic signifiers. All of this is used to decide whether to send you leaflets or online adverts, or even whether to simply sideline you in election messaging altogether.

To find out your own ‘political data profile’, use our online tool.

We’re still trying to find out more about what’s going on - you can be part of our nationwide research. But from what we’ve already found out, we want to show you why we think that what the parties are up to is not only wrong, it’s possibly illegal too.

THE LAW

Political parties can only lawfully process your personal data if their actions fit within one of the grounds set out in Article 6 of the General Data Protection Regulation (GDPR).

Article 6 says that for parties to process personal data for political purposes they either have to have secured your permission, or be able to show that what they’re doing is necessary and either there is a public interest or they have a legitimate interest.

If the data being processed is “special category data” - this is data which reveals your political opinions - then the parties must have your explicit consent or show a “substantial public interest”.

So far, so straightforward. Here’s where it starts to get complicated.

GDPR applies in the UK because of the Data Protection Act 2018 (the DPA), and this law gives political parties two potential loopholes.

First, the DPA says that data processing can be in the public interest if it “supports or promotes democratic engagement”. This means that political parties could try to claim that their invasive scrutiny of you is lawful purely because they are trying to get you to vote.

Second, the DPA says parties can process data revealing your political opinions (this includes, for example, their scoring of the likelihood you support Brexit) without your consent if they need to do that in order to campaign. This gives parties a potentially very wide discretion to simply assert that they cannot campaign effectively without doing data profiling - and leave it there.

But that’s not the end of the story.

Guidance to the DPA states that for data processing to be necessary, it has to be more than “just useful or standard practice.” It must be a “targeted and proportionate way of achieving [a] specific purpose.”

The Information Commissioner’s Office has explained this further by telling parties: “You do not have a lawful basis for processing if there is another reasonable and less intrusive way to achieve the same result.”

This puts a requirement on parties to assess their practices, to weigh up the intrusiveness of their data processing and justify their actions against the purpose of winning a seat, referendum or election.

OUR ANALYSIS

So where does this leave us?

We say that to process your political opinions (this might include trying to guess your connection to other parties), parties need to have obtained your consent. However, no UK political party has ever asked your permission for their data profiling activities.

Parties are relying instead on the other possible legal bases: public interest, substantial public interest and legitimate interest.

Problematically, the Conservative, Labour and Liberal Democrat parties simply assert that these legal bases apply. None has explained why this is correct. We say that assertion is not enough to be compliant with the law.

In our view, the sheer scope of personal data being processed, including through the use of profiling, rules out any claim that parties are serving a public or legitimate interest.

To rely on legitimate interest, parties also need to show that they have assessed whether their interest is outweighed by your interests and rights to data protection. There’s no evidence any of the three main parties have done this.

Most of the parties’ data profiling activities also appear to involve data that reveals political opinions. Even basic information such as your age can be used to tailor political messaging, or combined with other data to deduce your political views or affiliations.

This makes most data being processed “special category data” - which means that parties need to pass the higher threshold of “substantial” public interest. Parties again have given no detailed reasoning to show why we should accept that that threshold has been met. We strongly question whether there is any, let alone substantial, public interest in processing personal data for political campaigning.

Then there’s the issue of necessity. Based on the information parties have given us so far, there is no evidence they’ve considered whether what they’re doing is necessary or proportionate.

Parties might say that using personal data to tailor messaging is proportionate when weighed against the specific purpose of electoral success in a ward or borough. We argue that it is plainly not necessary to process extensive amounts of personal data just to decide whether to send people political messages, or not. Less invasive and onerous methods could surely be deployed.

NEXT STEPS

We’ve written to Labour, the Conservatives and the Liberal Democrats to set out our position and ask for their response. Depending on what they say we may have to take further legal action.

In the meantime, we’re still gathering information on parties’ data practices - trying to redress the balance between how much they know about us and how little we know about them.

We’re asking people all over the UK to send data access requests to political parties using our easy automated online tool. We want to map out what’s happening, and find out if there are differences of approach, for example, between safe and marginal seats, or between regions or urban/rural divides.

Send your subject access request here.

[Read more]


December 03, 2019 | Matthew Rice

What we've learned from asking political parties: Who do you think we are?

Over 2019, Open Rights Group (ORG) have been exercising our rights under the General Data Protection Regulation (GDPR) to find out what UK political parties are up to with our personal data.

Staff and supporters wrote to parties across Great Britain to ask them what personal data they were holding. This gave us a sketch of how data is being used to profile, target and shape voters’ intentions.

We’re now asking hundreds of people across the UK to send similar data requests, so that we can draw a more detailed portrait.

We want to know what exactly is going on with personal data in politics, and at what scale, and use this knowledge to stop shady data practices that break trust and the law, polarise society and damage democracy.

We’ve created an automated online tool that allows you to easily ask all active UK political parties what data they’re holding on you. With a few simple clicks you can discover what parties think about you and who they’ve decided you are.

Our data requests uncovered some strange and troubling practices. To help you see what your “political data self” might look like, we wanted to share what we’ve learned so far from the three major parties: Conservatives, Labour and Liberal Democrats.

Access the tool here.

All the main parties are dependent on profiling to decide who should get their message

All three major parties are collecting personal data and doing some kind of internal scoring to target and/or screen out people.

The Lib Dems are scoring you on things like how likely you are to vote for Brexit, how much of a “pragmatic liberal” you are, what connection you have to other parties, and whether that means you are likely to swing to the Lib Dems.

The Conservatives are giving you a “priority” rating which will determine whether to try to encourage you to vote.

Labour are giving you rankings in your local area, essentially placing you in massive local league tables based on where they think you stand on issues like housing, taxation, health, austerity, and Brexit.

For instance, our Scotland Director was ranked 12,966 out of a possible 65,801 in his whole constituency on the issue of tax. The effect of this is perhaps more relevant on the doorstep than online, but it shows the level of granularity Labour are trying to get to with their profiling.

They are also calculating your connection to other parties and likelihood to swing to Labour.

Labour were at one point also inferring ethnicity; it is unclear whether they are still doing this. This is something we hope to confirm or eliminate through further research.

This “trading and grading” of data is deeply troubling. We think it is going to vary based on where you live (for instance, in a marginal constituency) and who you are (for instance, whether you belong to a particular community). This would make sense, as some kinds of people and some places are of particular importance to the different parties, and with increased importance will come an increased focus on profiling and scoring.

We will be able to explore this theory with more data from a more diverse range of people. This is one of the reasons we are so keen for lots and lots of people all over the UK to ask parties what personal data they currently hold.

Parties are profiling us based on where we live and who we live near - and that is not an exact science

The Labour party profiled our Scotland Director, Matthew Rice, as likely to be retired, over-65, childless and owning the flat he was registered to vote from.

None of this was correct.

We don’t fully know why Matthew was profiled in this way, but we can make some sensible deductions. A key one is that since address is central to every dataset, Matthew’s profile is tied to that of his neighbours and the prevailing demographic in his area.

Parties are still trying to target every voter, not with narrowly tailored data but with broad, area-level data that they then tie to everyone in a given area. This isn’t accurate, but it is invasive. Because it is also wrong, it creates further issues down the line in political campaigns for Matthew and others who live in the same area but may have different views, values or lifestyles.
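
To illustrate how this kind of area-level inference goes wrong, here is a minimal sketch. The lookup table, field names and values are all invented for illustration; this is not any party's actual system, just the general mechanism of assigning an area's most common attributes to every individual in it:

```python
# Hypothetical sketch of geo-demographic profiling: an individual's
# profile is inferred from the modal attributes of their postcode area.
# All data and field names here are invented for illustration.

AREA_PROFILES = {
    # postcode area -> most common attributes among registered residents
    "EH3": {"age_band": "65+", "tenure": "owner-occupier", "children": False},
}

def infer_profile(postcode_area: str) -> dict:
    """Assign every resident of an area the area's modal profile."""
    return dict(AREA_PROFILES.get(postcode_area, {}))

# A (fictional) thirty-something renter living in EH3 is profiled as a
# retired owner-occupier, because the model only knows the area, not them.
actual = {"age_band": "25-34", "tenure": "renter", "children": False}
inferred = infer_profile("EH3")

mismatches = {k for k in actual if actual[k] != inferred.get(k)}
print(mismatches)  # wrong on age band and tenure
```

The error is structural, not accidental: anyone who differs from their area's modal resident gets the same wrong profile.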

What Labour's deductions ultimately mean is that they are relying on concocted fictions to make decisions about what messaging to send Matthew, or even whether to include him in a campaign. They’re not only wasting both their time and his, but limiting his opportunity to genuinely engage with what their party stands for and how they compare to other political voices.

Parties are deeply reliant on commercial datasets

Labour and the Conservatives use Mosaic codes in their voter profiles. Mosaic is an incredibly detailed system of household and individual classification owned by the corporate data broker Experian. It is based on geo-demographic data covering 49 million UK adults.

According to Experian’s marketing materials, Mosaic codes contain over 500 variables and segmentations. These are based on offline and online data including email addresses and digital and social media searches.

Mosaic codes are most frequently used for targeted commercial advertising.

Did you know that Experian was selling this information to parties?

The Liberal Democrats also use commercial datasets to get their scores. We don’t yet know what data sources they rely on and are hoping to find this out through further research.

The parties believe that using personal data like this is necessary for modern politics

All three parties are publicly relying on the same legal basis for their individual voter profiling: “substantial public interest”. They say that their processing of personal data is in the public interest and “necessary for democratic engagement”.

Political opinions are known in data protection law as “special category data”. This means that they require a higher level of protection than other more general categories of personal data. In our view, “substantial public interest” should not be relied on to profile political opinions.

We say that to process your political opinions (this might include trying to guess your connection to other parties), parties need to have obtained your consent.

The problem? No party has ever asked any voter for permission before profiling them.

We also aren’t so sure that using dodgy consumer data to segment the UK population into more and more narrow and questionable categories to then target us with polarising social media ads is necessary or even particularly healthy for democracy.

What do you think?

If you’d like to know who political parties think you are you can ask them yourself using our new online subject access request tool.

You can also use your result to be part of our UK-wide research. We’re undertaking the biggest mapping of use of personal data by political parties ever, and we need you and lots and lots of others to contribute your results to make the research achieve real impact.

Access our online tool here.

[Read more]


November 20, 2019 | Amy Shepherd & Mike Morel

The AdTech showdown is coming but will the ICO bite?

The General Data Protection Regulation (GDPR), in force since May 2018, was meant to be a Good Thing - a strong law that would make businesses act responsibly and give ordinary people control over our personal data. But it's been around for more than a year now and we're still being stalked online by creepy ads that seem to follow us around the web and know exactly what we're thinking and doing. So how, exactly, has anything been made better?

Looking at the scale at which online companies grab and hoard our private information, it might be thought that GDPR just gives corporate data thieves a thin veneer of legitimacy. How else to explain the advertising technology (AdTech) industry’s continued use of the same user profiling methods employed by the likes of Cambridge Analytica?

Is GDPR a joke, or can it break AdTech’s vice grip on our personal data? The jury is out, but a verdict is coming.

In September last year, Open Rights Group’s Executive Director, Jim Killock, complained to the UK's Information Commissioner's Office (ICO) about the systemic unlawfulness of AdTech’s ubiquitous real-time bidding (RTB) systems. This complaint was made together with Dr. Johnny Ryan of the privacy-focused web browser Brave and Dr. Michael Veale, Lecturer at University College London. Since then, we've worked with a network of privacy activists to spread the complaint to data protection authorities across the EU.

This is the biggest GDPR complaint so far, and might be the biggest that ever happens.

RTB systems are used virtually everywhere on the Internet to show personalised ads. They broadcast intimate personal data - including everything we've been looking at or searching for online, our exact GPS location coordinates and indicators about our religion, sexuality and ethnicity - billions (yes, billions) of times every day to legions of data companies that keep and use this data whether or not they bid to serve an ad.
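
For a concrete sense of what a single bid request can carry, here is a hedged sketch modelled loosely on the industry's public OpenRTB format. The field names follow that specification; every value is invented for illustration:

```python
# Illustrative sketch of the kind of data an RTB bid request broadcasts,
# loosely modelled on the public OpenRTB format. All values are invented.

bid_request = {
    "id": "example-auction-1",
    "site": {
        # the page the user is reading, sent to every bidder
        "page": "https://example.com/health/depression-support",
    },
    "device": {
        # GPS-level location ("type": 1 means GPS/location services in OpenRTB)
        "geo": {"lat": 51.5072, "lon": -0.1276, "type": 1},
        "ua": "Mozilla/5.0 (example user agent)",  # fingerprinting material
    },
    "user": {
        "id": "a1b2c3d4",  # persistent identifier tying requests together
        "data": [
            # audience segments attached by data companies
            {"segment": [{"id": "interest:mental-health"}]}
        ],
    },
}

# Every company receiving this request can keep and combine these fields,
# whether or not it bids to serve the ad.
print(sorted(bid_request.keys()))  # → ['device', 'id', 'site', 'user']
```

Even this stripped-down sketch ties a persistent user ID to precise location, browsing content and inferred interests in one message, sent to hundreds of recipients per page load.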

GDPR gives us the right to demand that companies tell us everything they know about us and delete our data on request. How is that even remotely possible when endless RTB bid requests routinely blast our data out to hundreds of faceless entities? RTB makes GDPR look like a sham.

The ICO offered a glimmer of hope in June this year when it agreed that RTB as currently configured is indeed unlawful under GDPR. Disappointingly, however, despite having the power to issue staggering fines for GDPR violations, the ICO instead gave the industry time to clean up its act while it continues to investigate further.

So what has the industry done with the time? Almost precisely nothing. Google this week announced that it will take a tiny amount of content data out of its bid requests. It heralded this as a big step forward in protecting privacy, but in fact the change does nothing extra to protect individuals, since the vast quantities of other information broadcast can still identify, profile and target people with stunning invasiveness.

Tokenistic though it is, perhaps this change could still be a sign that RTB players are starting to recognise that the status quo is no longer sustainable.

The ICO said in June that they would review RTB in six months' time. That deadline is coming up fast and the world is watching. It’s the moment of truth not just for GDPR but for the ICO as well. While the Irish Data Protection Commission remains underfunded, the ICO has been adding hundreds of staff, signalling an intent to get serious about enforcement.

What will the ICO do? We don't know. But the AdTech showdown will tell us whether their bark is worse than their bite.

[Read more]


October 17, 2019 | Jim Killock

Age Verification is dead – for now

Compulsory Age Verification for adult content was cancelled yesterday by Nicky Morgan, the Culture Secretary. However, the plans may well come back.

Porn viewing histories for 20m people: what could possibly go wrong?

Open Rights Group welcomes this change for two reasons. Firstly, the privacy protections within the scheme were merely optional for AV providers, leaving users at risk of having their porn habits profiled, recorded and leaked. Secondly, the plans included the prospect of widespread Internet blocking of non-compliant sites, raising the prospect of the UK becoming the most prolific Internet censor in the democratic world.

Attempting to regulate all Internet content to ensure it is safe for children is, unfortunately, not an achievable aim. Any steps taken will in truth be partial and come at a cost. It is very unclear that Age Verification, especially when combined with Internet censorship of legal content, would strike a reasonable balance.

A more reasonable approach to child protection would firstly help parents to use content controls, such as filters, on devices, which can be tailored to suit the child’s development; and secondly it would ensure that children are empowered through education, so that they know how to manage their own risks and make sensible choices.

Whether we like it or not, the simple existence of easily distributed pornographic content means that it will always be within easy reach of under-18s, whatever controls are put in place. Much like drugs, smoking or drinking alcohol, teenagers will ignore the rules if they wish to. Files are extremely easily swapped and shared. While this does not mean we should abandon attempting controls, it does mean we should assume they are not a replacement for open discussion with under-18s.

Government policy, we are told, will now focus on a forthcoming Online Harms Bill. This may well bring age verification back into the policy mix, now or at a later date. The proposed approach of a “duty of care” to users is vague, and we believe it is likely to lead to risk-averse platforms over-censoring material, for instance by machine identification, merely to reduce risk. Age verification could easily become a general requirement for platforms, to manage risk more precisely. However, at this stage we are merely speculating about future policy.

For now we hope that Parliamentarians look back at the last two Digital Economy Acts, of 2010 and 2016. Both proposed widespread web censorship: of copyright-infringing sites in 2010, and of adult content in 2016. Both measures crashed and burned: the 2010 provisions for disconnecting households accused of file sharing died without trace, and the 2016 Act’s complex measures to police some, but not all, adult content now look to be gone.

What both acts had in common was that the favoured solutions of particular groups, the copyright holders in 2010, and some child protection and anti-porn campaigners in 2016, were taken by government as the best way to regulate. There was insufficient attention paid to people who pointed out the obvious flaws.

There is an easy way to avoid these kinds of policy failures: discuss the means of reaching an objective fully with all parties, including industry and the whole of civil society, before choosing a policy to pursue.

[Read more]


October 07, 2019 | Amy Shepherd

Be the future - 5 simple ways political parties can protect digital rights

It’s October; temperatures are falling, nights are lengthening, and once again we are four weeks away from a potential no-deal Brexit.

No-one fully knows what a post-Brexit Britain will look like. At ORG, we’ve thought about how UK government surveillance operations might have to change. We’ve published briefings anticipating the impact on citizens’ online privacy and free speech. We’ve written to the PM to press for better no-deal preparation, ensuring the continued flow of personal data between the UK and the EU – essential for businesses, academic institutions and public services.

Digital often isn’t top of the agenda when it comes to UK political commitments. But given the immense power that technology and its deployment by both governments and private companies can have to control and subjugate ordinary citizens’ lives, this urgently needs to change.

In view of the ongoing political uncertainty, it’s increasingly likely that we’re heading to another general election – and soon. We at ORG want to make sure that post-Brexit our elected leaders build a future of digital rights for all. That’s why we’ve put together five core commitments that parties and/or candidates can make – to promise that if elected they will:

  1. Commit to maintaining high data protection and fundamental rights standards, including protecting net neutrality.

  2. Make the Information Commissioner’s Office (ICO) a true watchdog with teeth by increasing resourcing and empowering it to take enforcement action against non-compliant companies and organisations.

  3. Protect consumer rights and vulnerable groups by legislating to grant organisations representative power to defend their fundamental rights.

  4. Be transparent when negotiating international digital trade agreements and not commit to or sign any agreement that may undermine fundamental rights.

  5. Work with individuals and groups from across society to develop digital policy, modelling a new, inclusive and forward-looking way of doing open and collaborative government.

At ORG, we want Britain to capitalise on the potential of digital technology, while becoming fairer, more open and more inclusive. We hope that parties and candidates will take up these calls.

[Read more]