The E-Commerce Directive imposed very few obligations on intermediary service providers. Under its liability rules, it essentially only specified what caching and hosting providers had to do once they became aware of infringing information12. The DSA changes these liability rules only minimally (see below) and maintains, as a main rule, the absence of a general monitoring obligation.
The two simple obligations of the E-Commerce Directive (the general obligation to provide information and the specific rules on commercial communications) remain unchanged, although the Directive imposes them not just on intermediary service providers but on all providers of information society services.
However, the conditions under which online service providers must deal with unlawful content have been significantly extended. All intermediary service providers are now clearly expected to act promptly on orders to block content or to provide information received from courts or authorities of other Member States, transmitted through national coordinators.
This in itself will tie up considerable resources for service providers – including if they wish to verify whether the requesting authority is actually entitled to access the requested data or to order blocking.
The general data provision obligations of the E-Commerce Directive will be extended by requiring intermediary service providers to maintain and publish one contact point for public authorities and another for users. Moreover, for the latter it is not sufficient to designate a contact point that is merely an automated device, such as a chatbot13.
The fact that the DSA also sets its own requirements for the content of terms and conditions in order to protect users is expected to significantly disrupt the operations of most intermediary service providers. The new requirements concern, in general, the disclosure of information on restrictions on use (including possible moderation, automated decision-making and complaint handling), but they also include an obligation to notify users of unilateral changes. We will see in the next section that even more detailed obligations apply to business platforms.
Even the smallest hosting providers will be subject to a number of new obligations.
The first set of obligations details the procedure for dealing with content suspected of being infringing: how it can be reported and how the investigation process is to be made more transparent. As the obligation to receive notifications will apply to a large number of service providers, and as the Regulation sets out detailed technical conditions, a standardised notification interface and service is likely to emerge in practice, much like the cookie consent mechanisms.
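The substance of such a notice is prescribed by Article 16(2) of the DSA (a substantiated explanation, the exact electronic location of the content, the notifier's name and e-mail address except in the case of certain offences, and a statement of good faith). As a rough illustration only – the field names below are our own assumptions, not terms of the Regulation – such a notice could be modelled like this:

```python
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class IllegalContentNotice:
    """Illustrative model of a notice under DSA Article 16(2).

    The DSA prescribes the substance of the notice, not a technical
    format; the field names here are purely our own assumptions.
    """
    explanation: str                 # substantiated reasons why the content is illegal
    urls: List[str]                  # exact electronic location(s), e.g. exact URLs
    statement_of_good_faith: bool    # notifier confirms accuracy and completeness
    notifier_name: Optional[str] = None    # may be omitted for certain offences
    notifier_email: Optional[str] = None

    def is_complete(self) -> bool:
        # A hosting provider might run such a check before feeding the
        # notice into its moderation workflow.
        return bool(self.explanation and self.urls and self.statement_of_good_faith)
```

The optional name and e-mail fields reflect that the Regulation allows anonymous notices for certain categories of offences; everything else is mandatory for the notice to be "sufficiently precise and adequately substantiated".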
As part of ensuring transparency, hosting providers will have to indicate explicitly if the processing of notifications is automated in any way – that is, the obligation covers not only automated decision-making but information on any form of automated processing. The Regulation does not, however, set a specific deadline for processing notifications.
Also to protect users and make service providers’ decisions transparent, the Regulation requires a reasoned notice (statement of reasons) to be given to users when certain uses are restricted.17 The Regulation describes the requirements for this justification in relative detail, in the hope of curbing arbitrary decisions by service providers – decisions that can be a sensitive matter for users.
Finally, the Regulation imposes a reporting obligation on hosting providers where they have information giving rise to a suspicion that a “criminal offence involving a threat to the life or safety of a person” has taken place or is likely to take place.14
It can be seen that an intermediary service provider wishing to comply with all of these obligations will incur non-negligible costs – even though the EU impact assessment found these costs to be negligible.15
For small businesses, the range of obligations specifically applicable to online platforms will be minimal, as the DSA exempts them from most of these provisions.16
Thus, for online platforms, the bulk of the obligations are the same as those applicable to hosting providers, with two additions.
One addition concerns online marketplaces within the category of online platform providers: a provider cannot rely on the intermediary liability exemption if an average consumer would believe that the order in question was fulfilled by the marketplace itself or by an operator under its control.17 Online platform providers are generally interested in reshaping the balance of power between their different categories of users to their own advantage. Suppose, for example, that a customer in a local service market is satisfied with a provider found through a platform (say, a competent electrician). The platform that dominates access to that local market may not want the customer to call that provider directly next time, if only because it would lose its commission, and may therefore design its service to minimise the relevance and identifiability of the actual provider to customers. It is this ‘disruptive’ tendency that the rule is intended to curb (albeit for products rather than the services in this example).
The other addition is a data provision obligation: online platforms will have to provide their average number of users over the preceding six months, calculated according to a methodology to be defined in future legal acts, at the request of the Member State coordinator.
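The calculation methodology is still to be laid down in future legal acts; until then, a naive reading of "average number of users over the past six months" might look like the following sketch (the function and its input format are our own assumptions, not anything prescribed by the DSA):

```python
def average_monthly_active_users(monthly_counts):
    """Naive six-month average of monthly active user figures.

    `monthly_counts` is assumed to hold one active-user figure per
    calendar month, oldest first; the official calculation methodology
    is yet to be defined in EU legal acts, so this is illustrative only.
    """
    last_six = monthly_counts[-6:]
    if len(last_six) < 6:
        raise ValueError("six full months of data are required")
    return sum(last_six) / 6
```

For example, with monthly figures of 40, 42, 44, 46, 48 and 50 thousand users, the sketch reports an average of 45 thousand.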
It is also important for our purposes to understand the little-known concept of the “online intermediation service”.18 In practice, an online intermediation service will necessarily also be an intermediary service (and an online platform) under the DSA, but its definition does not use the term intermediary. The reason is that this concept predates the Digital Services Act. Strangely, the Commission has not sought to align the terminology of the two regulations, merely referring in the draft DSA to the P2B Regulation as a “lex specialis”.
The concept of the online intermediation service and the P2B Regulation itself were created to allay fears about platforms built for commercial “re-use”. Thus, only services aimed at business users are covered, where the “platform service” itself is intended to facilitate transactions between business users and consumers.19
Thus, the P2B Regulation does not cover platforms of a consumer nature (typically social networks, TikTok, etc.), whereas the DSA’s online platform term covers these consumer-to-consumer platforms as well.
The P2B Regulation, although narrower in scope than the Digital Services Act, contains more detailed and prescriptive rules. It is difficult to assess how the Regulation is applied in practice, even though it has been applicable since July 2020. In Italy, the competent authority has already published a guidance document following a public consultation,20 but in other countries (such as Hungary) it is not even known which authority would act in related matters. Only three of the 27 Member States have so far taken the trouble to designate an authority or an organisation representing the interests of business users under the P2B Regulation. The Commission should also have sent Parliament its report on the experience of applying the P2B Regulation more than a year ago – according to public procurement records the draft report already exists, but it has not yet been made public.
Let us briefly review how the obligations of the P2B Regulation differ from those of the DSA.
Compared to the DSA’s general obligations on contract terms, the P2B Regulation imposes more detailed ones. In particular, it is not sufficient to inform users of (unilateral) changes to the terms and conditions; a minimum notice period must also be respected. Apart from narrow exceptions, the notice period must allow business users to adapt to the notified change and may not be shorter than 15 days. The terms and conditions must also explain how use of the platform affects the business user’s intellectual property rights, and describe any additional channels through which the platform provider may itself market the products offered by its users.
The conditions relating to restriction of the service are also much stricter. While under the DSA only the general restriction conditions applicable to hosting providers apply to small business services (see point 5), and small businesses are exempt from the more detailed rules on restricting abusive uses21, under the P2B Regulation small businesses enjoy no such relief. They must set out in their contractual terms and conditions how they deal with manifestly infringing content, notify their users of a restriction on a durable medium at the latest when the restriction takes effect, and allow their users to seek clarification through a complaint.22 Independently of any suspension, the notice period for terminating the provision of the service may not be less than 30 days.
Under the P2B Regulation, both online search engine providers and online platform providers are obliged to publish an explanation of the ranking they apply,23 and small businesses are not exempt from this obligation (whereas they are under the DSA).24
Finally, it is only under the P2B Regulation that small businesses must inform their users, in their contractual terms and conditions, of how users can access the data they entered or generated in the course of using the service.25
Given that the P2B Regulation and the DSA will affect a significant number of service providers even at national level, it would be appropriate to intensify the dissemination of information on these two EU legal acts. The actual application of the P2B Regulation is still pending in many countries, including Hungary, even though its application deadline passed two and a half years ago. The business users affected are probably not even aware that their rights in this area are not being respected and that the required information is not being published. Member States and their authorities, with few exceptions, are not helping the situation.
As 17 February 2024 approaches, the impact of these two regulations is likely to become more widely known among users and service providers alike. However, the later small businesses start to prepare for these far-from-cheap changes, the higher the cost of compliance will be, and the less opportunity there will be to implement coordinated, cost-effective IT and legal solutions.
(Last updated 28 January 2023)
Regulation (EU) 2019/1150 of the European Parliament and of the Council of 20 June 2019 on promoting fairness and transparency for business users of online intermediation services (P2B Regulation), https://eur-lex.europa.eu/eli/reg/2019/1150/oj. ↩
Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market for Digital Services and amending Directive 2000/31/EC (Digital Services Act), https://eur-lex.europa.eu/eli/reg/2022/2065. ↩
Commission Recommendation of 6 May 2003 concerning the definition of micro, small and medium-sized enterprises, http://data.europa.eu/eli/reco/2003/361/oj, for Hungary, see the definition as part of the Act XXXIV of 2004 on small and medium-sized enterprises, on the support for their development. ↩
These exemptions under the DSA do not apply to medium-sized enterprises. ↩
Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market. ↩
See Annexes V to VI of Directive 98/34/EC of the European Parliament and of the Council of 22 June 1998 laying down a procedure for the provision of information in the field of technical standards and regulations (as amended by Directive 98/48/EC). ↩
DSA, recitals (28)-(29). ↩
see SWD(2020) 348 final Commission Staff Working Document: Impact Assessment Report Accompanying the document PROPOSAL FOR A REGULATION OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC, 15 December 2020, https://eur-lex.europa.eu/resource.html?uri=cellar:5ebd61c9-3f82-11eb-b27b-01aa75ed71a1.0001.02/DOC_2&format=PDF and Sebastian Felix Schwemer, Tobias Mahler and Håkon Styri, ‘Legal Analysis of the Intermediary Service Providers of Non-Hosting Nature’ (1 July 2020) https://papers.ssrn.com/abstract=3798494, accessed 21 January 2023. ↩
DSA, recital (29). ↩
Act CVIII of 2001, § 2(l). ↩
Judgment of the Court of Justice of the European Union of 23 March 2010 in Joined Cases C-236/08 to C-238/08 Google France and Google v Louis Vuitton, ECLI:EU:C:2010:159. ↩
Zsolt Ződi, ‘Characteristics of the European Platform Regulation: Platform Law and User Protection’ (2022) 7 Public Governance, Administration and Finances Law Review, p. 91 https://folyoirat.ludovika.hu/index.php/pgaf/article/view/6319, accessed 17 January 2023. ↩
Articles 13(1)(e), 14(1) and (2), and 15. ↩
Recital (43) and Article 12(1). ↩
Article 18. ↩
SWD(2018) 138 final COMMISSION STAFF WORKING DOCUMENT IMPACT ASSESSMENT Accompanying the document Proposal for a Regulation of the European Parliament and of the Council on promoting fairness and transparency for business users of online intermediation services, point 6.2, https://eur-lex.europa.eu/legal-content/EN/TXT/HTML/?uri=CELEX:52018SC0138. ↩
Articles 19 and 29. ↩
Article 6(3). ↩
The term used in the P2B Regulation is "online intermediation service", whereas the term used in the E-Commerce Directive and the DSA is "intermediary service". The first denotes an online intermediation service provider, the second an intermediary service provider – although all intermediary service providers are necessarily online under the E-Commerce Directive and the DSA, and the P2B Regulation bases its definition of the online intermediation service on the ITOS, not on the term "intermediary service" of the E-Commerce Directive. ↩
ibid., Article 2, point 2. ↩
E.g. for Airbnb, Booking.com and Shopify we found such provisions following from the P2B Regulation, but not for e.g. eBay or the Amazon online store. ↩
DSA, Article 23. ↩
P2B Regulation, Article 4. ↩
See the Commission Communication: Guidelines on ranking transparency pursuant to Regulation (EU) 2019/1150 of the European Parliament and of the Council, https://eur-lex.europa.eu/legal-content/HU/TXT/HTML/?uri=CELEX:52020XC1208(01). ↩
P2B Regulation, Article 5 and DSA, Article 27. ↩
P2B Regulation, Article 9. ↩
1. Introduction
On 7 October 2022, the President of the United States, Joe Biden, issued a new executive order: Enhancing Safeguards for United States Signals Intelligence Activities (EO 14086). The new executive order is intended to replace the Privacy Shield Program (EU-US Privacy Shield Framework), which was invalidated by the Schrems II decision (C-311/18). It is the US-side implementation of the new “EU-U.S. Data Privacy Framework”; however, for it to take effect, the European Commission must issue a new adequacy decision, similar to the invalidated Commission Implementing Decision (EU) 2016/1250.
Many European data controllers (and their lawyers) are looking forward to such a new adequacy decision, because the Schrems II decision and Recommendations 01/2020 of the European Data Protection Board (EDPB) made it very difficult for companies subject to the GDPR to comply with the strict requirements of Articles 46-47 when transferring any personal data from the EU to the US.
Although standard contractual clauses and binding corporate rules were in theory not affected by the Schrems II decision, the EDPB’s recommendations made even such transfers considerably riskier for data controllers in compliance terms. Data exporters relying on the European Commission’s standard contractual clauses or on previously approved binding corporate rules were suddenly expected to analyse entire legal systems (e.g. the USA, China, Russia) as operated in practice, and to implement supplementary technical and contractual measures to bring national laws “up to the level required by EU law”.
First, it takes the European Commission an average of 28 months to carry out such an analysis, yet suddenly every SME in the EU was expected to do the same before allowing any such transfer to the US – few of them having the resources to obtain such a bold legal opinion on, say, the data protection law of 50 different states.
Secondly, if we look at the reasons underlying the invalidation of the Privacy Shield Program (see below), we can see that this is easier said than done, and it may be theoretically impossible for data controllers to fully patch these holes.
Recent decisions of national data protection authorities have further exacerbated this problem, as they have started to prohibit transfers to the US by popular services such as Google Analytics.
A new adequacy decision will remove all this uncertainty, because data protection authorities will no longer be able simply to prohibit data transfers based on their own suspicions – not until someone once again convinces the Court of Justice of the EU of the illegality of the Commission’s future adequacy decision.
Below, we first recall why the EU Court of Justice found the Privacy Shield adequacy decision invalid in Schrems II, and then summarise why this Privacy Framework is (hopefully) different.
2. Schrems I on Safe Harbour and Schrems II on Privacy Shield
Following the Snowden revelations, the adequacy decision on US data transfers under the safe harbour principles (2000/520/EC) was questioned even by the Commission itself. The Commission, however, opted to try to strengthen the principles rather than revoke the decision (COM/2013/846), but these strengthening measures remained mostly recommendatory in nature (e.g. aiming for new umbrella agreements for law enforcement purposes, expecting better supervision by US authorities of self-certified companies, limiting the national security exceptions to what is proportionate, extending the safeguards enjoyed by US residents to all EU citizens, etc.; see also COM/2013/847).
However, in the same month as the revelations, a private individual submitted a complaint to the Irish data protection authority (the Commissioner) regarding the transfer of his data to the USA by a social media company. After the complaint was rejected, the complainant brought an action before the High Court, which referred the case to the Court of Justice of the EU. The direct question was whether data protection authorities were bound by the findings of the European Commission (2000/520/EC), or whether they could examine in detail whether the safe harbour decision ensured adequate protection in line with Articles 7, 8 and 47 of the Charter of Fundamental Rights of the European Union.
On the direct question, the Court of Justice of the EU answered in case C-362/14 (Schrems I) that data protection authorities may examine such claims of non-compliance, but cannot adopt measures contrary to that decision of the Commission. The Court went further, however, and in order to give the referring court a full answer (C-362/14, para 67), found it necessary to examine whether decision 2000/520/EC itself complied with the requirements of the data protection directive then in force (95/46/EC). In that part, the Court highlighted that (a) the safe harbour principles were intended for self-certified organisations, and (b) national security and similar requirements have primacy over the safe harbour principles, with no limitation on such interference and without any safeguards for (non-US resident) EU citizens against it.
Under the case law of the Court of Justice, interference with the fundamental rights under Articles 7 and 8 requires legislation, first, to lay down clear and precise rules governing the scope and application of a measure and imposing minimum safeguards and, secondly, to be limited to what is strictly necessary. Furthermore, Article 47 requires that individuals have the right to an effective remedy before a tribunal, and be able to pursue legal remedies in order to access personal data relating to them, or to obtain the rectification or erasure of such data.
After the invalidation, the High Court of Ireland referred the matter back to the Irish data protection Commissioner, whose investigation found that Facebook was by then using the standard contractual clauses (“SCCs”) published by the European Commission for its data transfers. For that reason, the Commissioner also asked the complainant to reformulate his complaint.
After the invalidation of 6 October 2015, the European Commission adopted a new adequacy decision within a short time (Commission Implementing Decision (EU) 2016/1250), which explicitly contained provisions on the use of personal data by US public authorities as well, including limitations on use for national security purposes (such as Presidential Policy Directive 28) and references to individual redress (cf. recitals 111-124 of CID (EU) 2016/1250). The US government committed to creating a new oversight mechanism, independent from the US “Intelligence Community”, called the Privacy Shield Ombudsperson.
The reformulated complaint of December 2015 now also covered the SCC mechanism for data transfers, claiming that even these measures were not in line with Articles 7, 8 and 47 of the Charter. This time the Commissioner was of the same opinion: personal data of EU citizens would be processed by the US authorities in a manner incompatible with Articles 7 and 8 of the Charter, and US law still did not provide citizens with legal remedies compatible with Article 47 of the Charter. The Commissioner found that the SCCs are not capable of remedying this defect, since they confer only contractual rights on data subjects against the data exporter and importer, without binding the United States authorities.
At the request of the Commissioner, the High Court of Ireland again referred the question to the Court of Justice, highlighting its finding that non-US persons are still not granted effective (personal) remedies (neither under FISA Section 702, that is, 50 U.S. Code § 1881a, nor under the presidential policy directive mentioned above, PPD-28). Such persons do not have access to litigation based on the Fourth Amendment to the US Constitution, and in relation to surveillance measures it is almost impossible for claimants to prove a sufficient interest (locus standi) for the court to decide the merits of the dispute. Furthermore, the High Court was also of the opinion that the newly introduced Privacy Shield Ombudsperson could not be considered a tribunal under Article 47 of the Charter.
In this next decision, C-311/18 (Schrems II), the Court of Justice also examined the validity of the Privacy Shield decision, again “in order to give the referring court a full answer” (para 161), and found it invalid.
The main reasons for the invalidity were that the decision complied neither with the proportionality requirement for limitations under Article 52(1) of the Charter, nor with the requirement of Article 47 of the Charter for a public hearing by an independent and impartial tribunal.
The first condition was not met because surveillance programmes under Section 702 of the FISA or E.O. 12333 (even with PPD-28) contain no limitations on their implementation with regard to non-US persons, and the US itself admitted that even PPD-28 does not grant data subjects any actionable rights before US courts.
The second condition was not met because submitting a claim to the Privacy Shield Ombudsperson was not found to be equivalent to bringing legal action before an independent and impartial court (in order to gain access to one’s personal data, or to obtain its rectification or erasure). The US government made a political commitment that every element of the intelligence services would be required to correct violations detected by the Ombudsperson, but the Court of Justice found that no legal safeguards had been implemented to fulfil this promise. The Ombudsperson has no power to adopt decisions binding on the intelligence services and is not independent from the Secretary of State.
3. Is it really different this time?
Let us now take a look at the new Executive Order. In section 2, it lays down wide-ranging principles for signals intelligence activities, concerning their legal basis, and stresses that they are always subject to appropriate safeguards, proportionality and so on. It also gives an explicit list of legitimate objectives (which may of course be updated later) and a list of prohibited objectives, including “suppressing or restricting a right to legal counsel” or the collection of trade secrets to give US companies a competitive advantage. As for privacy safeguards, somewhat generic guarantees are clearly present in section 2(c), requiring a prior determination for each specific signals intelligence collection activity and that the collection is necessary to advance an intelligence priority, together with elements of proportionality (“shall consider the availability, feasibility, and appropriateness of other less intrusive sources and methods for collecting the information necessary”; “feasible steps taken to limit the scope of the collection to the authorized purpose”).
With regard to bulk collection, more detailed requirements are included (section 2(c)(ii)), including a finding that the same purposes cannot be accomplished through targeted collection, and that “reasonable methods and technical measures” should be applied “in order to limit the data collected to only what is necessary”. For bulk surveillance, the list of objectives is shorter, but it still includes such widely applicable objectives as “protecting against cybersecurity threats created or exploited by … a foreign government, foreign organization, or foreign person”.
Besides these safeguards, a definite redress mechanism is also included, covering all signals intelligence activities (section 3). The redress mechanism has two levels. The first level starts with the receipt of “qualifying complaints” sent by public authorities in a qualifying state. This does not mean that only public authorities may submit such complaints; rather it means (under the definitions) that the public authorities of qualifying states (including those of the EU) will first have to verify the identity of the complainant and that the complaint satisfies the requirements of the EO.
The complaint is investigated by the Civil Liberties Protection Officer (CLPO) of the Office of the Director of National Intelligence, who is entitled to access all the necessary information from the Intelligence Community, and is expected to document the whole review, including factual findings, and to create classified reports. The EO now contains explicit provisions making the CLPO’s decisions binding on the Intelligence Community and its agencies. The independence of the CLPO from the Director of National Intelligence is also clearly stated in the EO, including a prohibition on removing the CLPO except in cases of misconduct and the like.
It is interesting to highlight that the complainant will receive no information about the intelligence activities they were subject to – not even whether they were subject to such activities at all; the reply will only state whether the review identified any violations and whether remediation was ordered.
Either the complainant or the Intelligence Community element concerned may apply for a second level of review by the newly established “Data Protection Review Court”. This court will be composed of judges who were not government employees at the time of their selection and who, once appointed, may not be removed, in accordance with general rules of judicial conduct. The court operates in three-judge panels and will mostly work on the basis of the complaint and the documentation prepared by the CLPO. Like the CLPO, the court is also limited in what it may disclose to the complainant.
In summary, in contrast with the Privacy Shield and its annexes of letters, the new EO has genuinely clarified the legal basis of and safeguards for such intelligence activities. Yet, despite all the best intentions and all the efforts the European Commission will put into a new adequacy decision, we may still end up in a situation where the Court of Justice finds these safeguards insufficient. It is far from obvious how the redress mechanism of the new EO can ensure data subjects’ access to their data as required by Article 8(2) of the Charter, or whether such terse answers from the CLPO/DPRC will amount to a right to an effective remedy and a fair trial.
But we also have to keep in mind some other aspects:
a) this kind of access to classified data is not unconditionally ensured with regard to intelligence activities under national law in the EU either;
b) compared to the lack of a clear definition of national security in the EU Member States (see the CCBE Recommendations on the protection of fundamental rights in the context of ‘National Security’, pp. 12-16) and the lack of transparency of such activities in the EU, the US EO’s attempt to list the objectives of signals intelligence activities and to provide safeguards is clearly a step in the right direction.
Even though the EU is not a federal state like the US, from the viewpoint of the fundamental rights protected by the Charter, at least a modest degree of harmonisation and transparency should be ensured with regard to limitations based on national security as well.