EU: The circling of legislative wagons to better protect consumers
Published in May 2019 by Data Guidance (with a 30-day exclusivity period)
Although major legislative frameworks such as the General Data Protection Regulation (Regulation (EU) 2016/679) (‘GDPR’) and Directive 2002/58/EC on Privacy and Electronic Communications (‘ePrivacy Directive’) may represent a general approach by the European Commission (‘the Commission’) in its regulation of personal data, the Commission has recognised the multi-faceted and complex reach of data use, particularly with regard to business practice. This is reflected in the Commission’s publication of various proposals for legislative reform, reports and discussions on uses of personal data in areas such as consumer protection, competition, ethics and artificial intelligence (‘AI’). Geraldine Proust, Director for Legal Affairs at the Federation of European Direct and Interactive Marketing (‘FEDMA’), discusses the intricacies of this multi-pronged approach from the Commission.
Have you noticed how data protection is more than the GDPR and ePrivacy? There now exists a deeper understanding of data protection law and how it interacts with ethics. Indeed, European legislators, in their efforts to protect consumers, have developed increasingly intricate laws in this area. Acknowledging this is essential if the industry is to play a key role in increasing consumer trust and keeping the EU economy competitive. Other legal frameworks, such as consumer and competition law, are evolving towards upholding data protection and privacy. Major complaints and political circumstances also drive debate around profiling, ethics and the ‘manipulation model.’ We believe that going back to the basics of the GDPR and multi-stakeholder dialogue is the way forward.
The cross-sector evolution of European laws
Legal intricacies of the GDPR and consumer law
The Council of the European Union adopted the Directive on Certain Aspects Concerning Contracts for the Supply of Digital Content and Digital Services (Directive (EU) 2019/770) (‘the Digital Content Directive’) on 15 April 2019. The Digital Content Directive aims to provide remedies to consumers where digital content or a digital service (e.g. a social media account) does not comply with the relevant contract. Interestingly, these contractual rights will apply equally to consumers who provide personal data for such content or services and to ‘paying’ consumers. Personal data is protected by a fundamental right; it cannot be used as counter-performance or traded against digital content or services1. Yet the provision of personal data triggers the benefit of contractual rights for consumers to have remedies in the case of faulty content or a faulty service. This is tricky, as there is now tension between the law and the data-driven economy, and between consumer protection and the contractual freedom of enterprises. The European Data Protection Board’s draft Guidelines 2/2019 on the Processing of Personal Data Under Article 6(1)(b) [of the] GDPR in the Context of the Provision of Online Services to Data Subjects currently reinforce this delicate situation2. Indeed, personal data necessary for the performance of a contract or service is narrowly interpreted, and any data not necessary for the performance of a contract should rely on another legal basis, such as legitimate interest or consent. However, despite the risk-based approach of the GDPR, reflected in Article 7(4), the extent of ‘free’ consent is still debated, notably in the context of the proposal for an ePrivacy Regulation (the so-called ‘cookie-wall’ discussion). An equilibrium needs to be found, and referrals to the European courts are foreseeable.
The Digital Content Directive does not apply where:
- the personal data is necessary for the performance of the contract or is a legal requirement, and is not processed further;
- the trader only collects metadata such as information concerning the consumer’s device or browsing history, except if provided otherwise at national level; and
- the consumer is exposed to advertisements exclusively in order to gain access to the digital content or service.
However, contract law remains the competence of Member States. The Digital Content Directive is without prejudice to the GDPR and, in the case of a conflict, the GDPR takes precedence (e.g. processing of data in the case of termination of the contract, and portability)3. At the same time, the Digital Content Directive takes a broader view, including when the trader must refrain from using consumer-generated digital content (see Recital 65 of the Digital Content Directive) and what digital content consumers have a right to retrieve (in addition to personal data).
Even though the GDPR takes precedence, delicate situations are likely to arise. Indeed, a failure to comply with requirements provided for under the GDPR, including core principles such as data minimisation and data protection by design and by default, may, depending on the circumstances of the case, be considered to constitute a lack of conformity of the digital content or digital service with the subjective or objective requirements for conformity provided for in the Digital Content Directive (see Recital 48). Monitoring the implementation of the Digital Content Directive over the next two years will be important, both because of its links with the GDPR and because the notion of a contract relies on different national definitions across Member States.
The Commission’s New Deal for Consumers4 is composed of two proposals: one on the better enforcement and modernisation of EU consumer protection rules, and the other on representative actions for the protection of the collective interests of consumers5. On the first proposal, the European Parliament and the Council reached a provisional agreement on 21 March 2019. The European Parliament adopted this provisional agreement in plenary on 17 April 2019, but at the time of publication, approval from the EU Council of Ministers is still required.
The first proposal provides for the extension of the Directive on consumer rights, amending Council Directive 93/13/EEC and Directive 1999/44/EC of the European Parliament and of the Council and repealing Council Directive 85/577/EEC and Directive 97/7/EC of the European Parliament and of the Council (Directive (EU) 2011/83)6 (‘the Consumer Rights Directive’) to contracts for digital services under which the consumer provides personal data to the trader without paying a price7. Under the Consumer Rights Directive, consumers have a right to withdraw from a contract for digital content or services within 14 days. This applies consistently with the Digital Content Directive, although Recital 35 of the Digital Content Directive provides that Member States are free to extend the application of the rules of the Consumer Rights Directive to situations excluded from the scope of the Digital Content Directive. Hence the need to monitor the implementation of the Digital Content Directive. The first proposal also adds a new information requirement to the Consumer Rights Directive, obliging the trader to inform the consumer when the price is personalised on the basis of automated decision-making8, stating that ‘[t]his information requirement should not apply to techniques such as “dynamic” or “real-time” pricing that involves changing the price in a highly flexible and quick manner in response to market demands when it does not involve personalisation based on automated decision making. This information requirement is without prejudice to [the GDPR], which provides, inter alia, for the right of the individual not to be subjected to automated individual decision-making, including profiling.’
Regarding representative actions, the second proposal aims to establish a minimum level of harmonisation for such actions (collective actions, whether for redress9 or injunction), including under the GDPR, thereby limiting the choice which the GDPR leaves to Member States10. At the time of publication, the second proposal is still in the decision-making process.
Examples of further intricacies
There is a concern that data is being used by some companies to exploit or exclude competitors from the opportunities of AI and other data-based innovation. The report, Competition Law and Data11, jointly produced by the German Federal Cartel Office (‘Bundeskartellamt’) and the French Competition Authority (‘Autorité de la concurrence’), states: ‘Two aspects, of particular relevance when looking at data’s contribution to market power, can be identified: the scarcity of data or ease of replicability, on the one hand; whether the scale or scope of data collection matters, on the other.’ According to the European Commissioner for Competition, Margrethe Vestager, the key point is that the Commission is already looking very closely at whether companies are using their control of data to harm competition12. Due to shared concerns around the implications for competition when a handful of companies control data that others need to compete in the market, special advisors to the Commission put forward ideas in a report on what digitisation means for consumers, and how competition law and enforcers should respond to the challenges of the new digital era13. In particular, the report, Competition Policy for the Digital Era14, proposes:
- tougher rules for dominant online platforms which give preferential treatment to their own products and services over those of other vendors on their platforms;
- data sharing or interoperability remedies for dominant digital businesses, so as to ensure effective competition by disrupting network effects and data-related entry barriers; and
- a reversal of the burden of proof, whereby it would be up to the companies involved to prove that a deal does not constitute the acquisition of an emerging digital player with a nascent and innovative project which might have become a competitive threat.
A key development in this context is a decision taken by the Bundeskartellamt15 against Facebook. The Bundeskartellamt considered that Facebook had abused its dominant position in the German market for social networks, because “[a]ll data collected on the Facebook website, by Facebook-owned services such as e.g. WhatsApp and Instagram and on third party websites can be combined and assigned to the Facebook user account.” Although there is no financial harm to consumers, they are considered to suffer a loss of control over their personal data and over how that data is shared.
The European Data Protection Supervisor (‘EDPS’) has raised concerns that individuals are at risk of being lost and defined only by data and algorithms. Consequently, the EDPS proposed the establishment of a Digital Clearinghouse to bring together competition, consumer and data protection agencies to discuss how best to enforce current legislation. All regulators in the digital space, based in the EU or around the world, are invited to take part in the discussions. In December 2018, authorities debated the ‘deceptive framing of a free offer as unfair practice, the opportunity to adopt structural remedies able to provoke a change in the business models, asymmetric regulation of access to data and its impact on competitive dynamics, essential facility theory applied to the specificities of data resources and misuse of the data protection framework to obstacle investigations by national authorities including competition agencies16.’ The authorities agreed ‘to continue discussions on developing a methodology to assess the real costs of where the monetary cost of services is zero or below marginal cost17.’
Staying GDPR focused
‘Surveillance or manipulation model,’ profiling, and ethics
The EDPS published an Opinion on online manipulation and personal data18 (‘the Opinion’), which voices concern “with the way personal information is used in order to micro-target individuals and groups with specific content, the fundamental rights and values at stake, and relevant laws for mitigating the threats.” Some authorities, such as the UK Information Commissioner’s Office (‘ICO’), are particularly active in this area too19. In view of the approaching European elections, the Commission published an electoral package in September 2018, including a Code of Practice on Disinformation20. Moreover, the Commission has a High-Level Expert Group on AI, which in April 2019 published its Ethics Guidelines for Trustworthy AI21.
At the CPDP 2019 conference22, academics discussed the right to fair or reasonable data inference in the context of profiling. Activists called on consumers to use their data subject access rights to request access to inferred data. In some cases, data subject access rights are leading to major complaints, such as those made by None Of Your Business (‘NOYB’). Most of these cases challenge the legal basis (whether consent or legitimate interest) for processing data for profiling purposes. The recent decision by the French data protection authority (‘CNIL’) regarding Google23, in response to one of the NOYB complaints, focused on the principles of minimisation and transparency under the GDPR, and on its risk-based approach. One could remain optimistic, as a better perception by the consumer of the value exchange will reinforce the protection of individuals and consumer choice24. Across all countries, control, trust and transparency form the foundation for a healthy data economy. FEDMA states that 88% of UK consumers cite transparency as the key to trusting organisations25; improving transparency and control for individuals will therefore put companies in a much stronger position to engage them in the data economy. This goes to the heart of one of the core principles of the GDPR: ‘accountability.’
Back to the basics – the GDPR’s risk-based approach and accountability
The GDPR provides that data processing must be fair and that controllers are accountable. A risk-based approach allows for flexibility and for market diversity among both paid and free services. Processing is fair when the controller asks itself: “If this were my data, would I consider the processing fair? Would I reasonably expect this processing to take place?” Additionally, effective transparency drives consumer trust. A risk-based approach would acknowledge consent as one legal basis, whilst legitimate interest combined with effective transparency can also benefit the consumer26. The possibility for a controller to process personal data for its legitimate interest must be the result of a legitimate interest assessment. This assessment must go through three key stages:
- identification of the legitimate interest;
- carrying out a necessity test; and
- carrying out a balancing test.
The balancing test must always be conducted fairly, taking into account the nature of the interests (e.g. the reasonable expectations of the individual), the impact of the processing (e.g. the positive and negative impacts on the individual) and any safeguards which are or could be put in place (e.g. data minimisation, de-identification, technical and organisational measures)27. When carrying out such assessments, it should be borne in mind that profiling for advertising purposes, unlike profiling for political purposes, does not have a legal effect on an individual28. It is important to remember that “nothing is wrong with data and digital29.” Our industry aims at a constructive dialogue with the European institutions and authorities. FEDMA, at the time of publication the only European association with a Code of Conduct approved by data protection authorities, is updating its Code of Conduct on the processing of personal data for direct marketing purposes.
To conclude, I encourage you to follow the new European Parliament and the Commission, and their upcoming programmes to legislate on the digital economy. The Directive on Certain Legal Aspects of Information Society Services, in Particular Electronic Commerce, in the Internal Market (Directive 2000/31/EC) (‘the E-commerce Directive’) is very likely to be reviewed.
Geraldine Proust Director for Legal Affairs
gproust@fedma.org
FEDMA, Brussels
Notes:
1. “Fundamental rights such as the right to the protection of personal data cannot be reduced to simple consumer interests, and personal data cannot be considered as a mere commodity,” Opinion 04/2017 on the draft Digital Content Directive, the European Data Protection Supervisor, 14 March 2017, and Recital 24 of the Digital Content Directive.
2. ‘the concept of what is necessary for the performance of a contract is not simply an assessment of what is permitted by or written into the terms of a contract. The concept of necessity has an independent meaning in the EU law, which must reflect the objective of data protection law.’
3. Recital 37 and 38 of the Digital Content Directive.
4. Commission press release, available at: http://europa.eu/rapid/press-release_MEMO-18-2821_en.htm
5. At the time of writing, the Proposals are being examined by the Council. The modernisation of consumer law is considered a priority, but the proposal on representative actions is progressing slowly and only a progress report is expected by the end of the current Presidency.
6. Available at: https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=celex%3A32011L0083
7. The Consumer Rights Directive already applies to contracts for the supply of digital content which is not supplied on a tangible medium (i.e. supply of online digital content) regardless of whether the consumer pays a price in money or provides personal data.
8. See Recital 45 and Article 2(4) of the proposal on the better enforcement and modernisation of EU consumer protection rules.
9. For example, the Belgian Consumer Association has brought a representative action calling for redress (e.g. €200 per Facebook user in Belgium) from Facebook for sharing data with apps without individuals’ consent.
10. Article 80 of the GDPR.
11. Available at: http://www.autoritedelaconcurrence.fr/doc/reportcompetitionlawanddatafinal.pdf
12. Vestager’s speech of 4 February 2019, available at: https://ec.europa.eu/commission/commissioners/2014-2019/vestager/announcements/making-data-revolution-work-us_en
13. Vestager’s speech of 8 February 2019, available at: https://ec.europa.eu/commission/commissioners/2014-2019/vestager/announcements/innovative-digital-future_en
14. Available at: http://ec.europa.eu/competition/publications/reports/kd0419345enn.pdf
15. Decision of 7 February 2019, available at: https://www.bundeskartellamt.de/SharedDocs/Meldung/EN/Pressemitteilungen/2019/07_02_2019_Facebook.html;jsessionid=C68DCDF0DFFD5353FE497FB1AAC43195.1_cid387?nn=3591568
16. Digital Clearinghouse statement, available at: https://edps.europa.eu/sites/edp/files/publication/18-12-10_4th_dch_statement_en.pdf
17. Ibid.
18. EDPS Opinion 3/2018, available at: https://edps.europa.eu/sites/edp/files/publication/18-03-19_online_manipulation_en.pdf
19. Democracy Disrupted? Personal information and political influence, published by the ICO on 11 July 2018, available at: https://ico.org.uk/media/2259369/democracy-disrupted-110718.pdf
20. Available at: https://ec.europa.eu/digital-single-market/en/news/code-practice-disinformation
21. Available for download at: https://ec.europa.eu/newsroom/dae/document.cfm?doc_id=58477
22. See https://www.cpdpconferences.org/
23. Available at: https://www.cnil.fr/en/cnils-restricted-committee-imposes-financial-penalty-50-million-euros-against-google-llc
24. See https://www.fedma.org/wp-content/uploads/2018/05/Global-data-privacy-report-FINAL.pdf
25. See https://www.fedma.org/2018/02/88-uk-consumers-see-transparency-key-increase-trust-sharing-data/
26. A FEDMA paper on legitimate interest is available at: https://www.fedma.org/wp-content/uploads/2018/08/20180814-FEDMA-The-need-for-legitimate-interest-FINAL.pdf
27. Data Protection Network guidance available at: https://www.dpnetwork.org.uk/dpn-legitimate-interests-guidance/
28. Article 29 Working Party, Guidelines 251/Rev1 on Automated individual decision-making and Profiling for the purposes of the GDPR, available at: http://ec.europa.eu/newsroom/article29/item-detail.cfm?item_id=612053
29. Quote from the ICO at an EDPS workshop on how to unmask and fight online manipulation, available at: https://www.youtube.com/watch?v=E6l1g3WoMRw