Authors: Marco Almada, Juliano Maranhão and Giovanni Sartor
1. Taking into account the state of the art, the cost of implementation and the nature, scope, context and purposes of processing as well as the risks of varying likelihood and severity for rights and freedoms of natural persons posed by the processing, the controller shall, both at the time of the determination of the means for processing and at the time of the processing itself, implement appropriate technical and organisational measures, such as pseudonymisation, which are designed to implement data-protection principles, such as data minimisation, in an effective manner and to integrate the necessary safeguards into the processing in order to meet the requirements of this Regulation and protect the rights of data subjects.
2. The controller shall implement appropriate technical and organisational measures for ensuring that, by default, only personal data which are necessary for each specific purpose of the processing are processed. That obligation applies to the amount of personal data collected, the extent of their processing, the period of their storage and their accessibility. In particular, such measures shall ensure that by default personal data are not made accessible without the individual’s intervention to an indefinite number of natural persons.
3. An approved certification mechanism pursuant to Article 42 may be used as an element to demonstrate compliance with the requirements set out in paragraphs 1 and 2 of this Article.
I. General overview
Art. 25 introduces two general data protection requirements: data protection by design (Art. 25 para. 1) and data protection by default (Art. 25 para. 2). Data protection by design requires that data controllers adopt appropriate technical and organisational measures and the necessary safeguards to implement data protection principles, protect the rights of data subjects, and meet the requirements imposed by the GDPR. Data protection by default requires that data controllers adopt measures ensuring that each processing operation is limited, under normal circumstances, to what is necessary for the purposes of the processing, unless a justified specific initiative to the contrary is adopted.
The two principles are closely connected: data protection by default has indeed been viewed as a specific aspect of a proactive, risk-prevention approach to data protection, often identified under the term “privacy by design.”[1] Such principles are based on the idea that data protection should be built into the very structure of information systems, the latter being understood as sociotechnical systems in which machines and humans are integrated through organisational arrangements. This explains why the measures at stake may be technical, such as pseudonymisation or anonymisation, as well as organisational, such as specific training for the personnel involved in processing operations. Both principles rest on the assumption that the functioning of an information system – and, in particular, the way in which it affects data subjects – primarily depends on its architecture: effective protection can only be guaranteed if risk-prevention measures are adopted during design and deployment.
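To make the technical side of such measures more concrete, the sketch below (in Python, with hypothetical field names, key handling and data) shows one way pseudonymisation can be built into a system’s architecture: a direct identifier is replaced by a keyed-hash pseudonym, while the key – the “additional information” in the sense of Art. 4 para. 5 – is kept separately under organisational safeguards. It is an illustration only, not a statement of what Art. 25 requires in any particular case.

```python
import hmac
import hashlib

# Hypothetical secret key: under Art. 4 para. 5 GDPR, this "additional
# information" would have to be stored separately from the pseudonymised
# data and protected by technical and organisational measures.
PSEUDONYMISATION_KEY = b"key-held-under-separate-organisational-controls"

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a keyed-hash pseudonym."""
    return hmac.new(PSEUDONYMISATION_KEY, identifier.encode("utf-8"),
                    hashlib.sha256).hexdigest()

# Illustrative record: the direct identifier is replaced before further processing.
record = {"customer_email": "maria.rossi@example.com", "order_total": 42.50}
pseudonymised_record = {
    "customer_pseudonym": pseudonymise(record["customer_email"]),
    "order_total": record["order_total"],
}
print(pseudonymised_record)
```

Whether a measure of this kind is “appropriate” in a given case still depends on the factors listed in Art. 25 para. 1, discussed below.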
Even though the same fundamental idea underlies data protection by design and by default, a difference emerges from the text of Art. 25. Art. 25 para. 1 explicitly requires controllers to consider the risks that processing may pose to data protection principles and to adopt appropriate measures to respond effectively to those risks. Appropriateness depends on the likelihood and severity of a risk, but also on the state of the art, the cost of implementing a candidate measure and the circumstances of the processing. Therefore, a technical or organisational measure that would further reduce risks may not be required under Art. 25 para. 1 if its costs are disproportionate relative to the risks it could prevent and the system already provides an acceptable level of risk prevention without it.
Unlike Art. 25 para. 1, Art. 25 para. 2 makes no reference to an evaluation of data-minimising measures in light of the risks stemming from the processing and the costs of measures for countering such risks. This is because Art. 25 para. 2 assumes that an arrangement exists which adequately achieves the purposes of the processing while minimising the use of personal data, i.e., an arrangement under which personal data are processed only to the extent necessary for the satisfactory achievement of such purposes under normal circumstances. Thus, it unconditionally requires that the system be preset by default according to this data-minimising arrangement, though the default may be overridden depending on users’ preferences. For instance, when personal data are processed in the context of e-commerce for the purpose of implementing contracts between suppliers and customers, the system may legitimately be prearranged in such a way that it processes customers’ addresses as needed to deliver purchased goods. On the other hand, it should not by default process customers’ data for sending them targeted recommendations. Such further processing would only be legitimate after customers freely consent to it, overriding the default restriction of the processing to the data necessary for implementing the contract.
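Purely by way of illustration, the following Python sketch (with hypothetical class and field names) shows what such a data-minimising default could look like at the level of system configuration: processing needed to perform the contract is enabled, while processing for targeted recommendations starts switched off and is activated only by the customer’s own intervention.

```python
from dataclasses import dataclass

@dataclass
class ProcessingSettings:
    """Hypothetical per-customer processing settings in an e-commerce system."""
    # Processing necessary to perform the contract, e.g. using the customer's
    # address to deliver purchased goods.
    contract_fulfilment: bool = True
    # Further processing for targeted recommendations: switched off by default
    # and activated only through the data subject's freely given consent.
    targeted_recommendations: bool = False

    def opt_in_to_recommendations(self) -> None:
        """Record the customer's explicit choice to override the default."""
        self.targeted_recommendations = True

settings = ProcessingSettings()
assert settings.targeted_recommendations is False  # data-minimising default
settings.opt_in_to_recommendations()               # the customer's own intervention
```

The point of the sketch is only that the privacy-protective option is the starting state, and any extension of the processing requires a positive act of the data subject rather than of the controller.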
Data protection by design and by default requires that controllers adopt measures and safeguards that implement data protection principles and protect data subject rights. Active behaviour is thus required of controllers: they must ensure that the design of the processing and its default settings do not create avoidable risks, and they have the positive obligation to prevent avoidable risks by adopting state-of-the-art technical and organisational measures. This obligation does not give data subjects a claim that specific technical and organisational measures be adopted by controllers, but failure to comply with Art. 25 may lead to a violation of data protection rights or to harm to the data subject. Noncompliance with Art. 25 can be detected in various ways: through the exercise of data subjects’ rights – e.g., the right to access their data or to contest an automated decision (Art. 22 para. 3) –, through stand-alone evaluation procedures, such as data protection impact assessments (DPIAs, Art. 35), or through the exercise of the investigative powers of DPAs (Art. 58).
[…]
[1] Cavoukian, ‘Privacy by Design: The Definitive Workshop’, Identity in the Information Society 3, no. 2 (2010), 250.