Introduction

I. History and Motives of Data Protection Legislation


Authors: Spiros Simitis († 2023), Gerrit Hornung and Indra Spiecker gen. Döhmann

1. Beginnings

a) Legal origins and terminology

The history of data protection legislation begins on 30 September 1970, the day the Data Protection Act was passed in the federal state of Hesse in Germany.[1] The Hessian Data Protection Act was the first data protection law in the world. Even before it was passed, there were demands for legal regulation, especially in the USA, likewise in the light of the automated processing of personal data. Unlike in Hesse, however, the expectations were directed at specific fields of data processing and in particular at the credit sector. It was therefore not surprising that the Fair Credit Reporting Act was passed only a few days after the Hessian Data Protection Act, on 26 October 1970.

The first step in the history of data protection legislation was thus taken.[2] Various legislators in the EU Member States subsequently summarised their ideas on regulating the handling of personal data and automated processing, mostly using the expression “data protection”, i.e. the term that had first appeared in the preparatory work for the 1st Hessian Data Protection Act[3] and that has since become established far beyond the Federal Republic. Admittedly, the choice of words is not exactly fortunate. It suggests that the legislator’s intention in intervening is merely to take the necessary precautions to protect the data. However, this is not the case. What the law seeks to do is to ensure protection against the consequences that the automated processing of personal data may have for the data subjects, though by no means only for them.

The willingness to use a term[4] that is obviously misleading and has rightly been criticised time and again but is now used practically worldwide (even if not with a uniform interpretation), is not surprising. In any case, it is understandable as soon as it is realised that the demand for binding, legally defined processing conditions arose under the influence of automation and was therefore oriented towards the special features of automated processing (→ mn. 6 et seq.). From this point of view, it was obvious to concentrate on the technical development and to look for technical and organisational answers for every question arising around it, in the sense of “data security”. The realisation that technical-organisational measures were not sufficient to guarantee the protection of those affected was only slowly gaining ground.[5] Instead, it was necessary to focus on legislative requirements. Only they, and not technical precautions, could determine whether and which data may be processed, how and under what conditions. However, because legal requirements were perceived, at least initially, merely as an additional aspect of a “data security” generally aimed at warding off external influences on processing, it was considered more or less self-evident to call it “data protection”, to use a term that was already familiar from the discussions on data security and therefore clearly indicated the connection to the latter. Even today, IT security is closely linked to data protection, albeit under different auspices (see above all → Art. 32).

“Data protection” rapidly developed into a separate field of regulation, and despite the doubts that regularly flared up, it remained “data protection”. While the term is in line with the linguistic style of an early phase of the discussion about the necessity of legal regulations for the processing of personal data, it also and especially expresses the desire to overcome the decidedly technical regulatory approach typical of this phase. Using “privacy” as the protected good of data protection and equating the two is not a fortunate choice.[6] It fails to recognise the general concerns of data protection, reduces data protection to a sub-area of the use of personal data, namely data from the “private” sphere, and ultimately obscures the completely different origin and character of the European concept from that of the US.

b) Reasons for and motives of the early legislation

Four aspects determined legislative intervention in the early days of data protection, without regard to the sometimes considerable differences between the individual laws. In principle, these still motivate data protection law today, both at national and at European level:

aa) Automated data processing

Firstly, with its decision to establish binding rules for the handling of personal data, the legislator reacted consistently to the radical change in information technology, or more precisely, to the automation of data processing. Since the mid-1960s at the latest, it had been clear to both public administration and private companies that their ever-increasing, ever-more complicated, and not least personal-information-related expectations could no longer be met with the help of conventional processing methods.[7]

A public administration that is increasingly developing into a benefits administration is more and more dependent on information about the persons receiving benefits in order to remain functional. Whether sickness or education allowances, child or housing benefits, old-age or accident pensions, the dialectic of benefit and the need for information manifests itself everywhere. Added to this is the medium- or long-term advance planning of government functions, which has long been a matter of course in many areas of public administration. The consequence is once again a considerable increase in requests for information addressed to the citizens. Conversely, however, citizens also increasingly expect a citizen-oriented, easily accessible and transparent administration in the sense of e-government. The development of an increasingly independent European administration has intensified and complicated this need for information, because it brings together different levels and different expectations: the monitoring of implementation by the Member States can only succeed if there is knowledge about it. If citizens and companies in Europe are to identify with the European idea, they must be aware of and have access to European programmes, services and commitments.[8] Altogether, the dilemma is clear: the structural change of public administration in the national as well as in the European context requires new forms of information processing. Hence, as long as the limitations of conventional processing methods cannot be overcome, the desired change threatens to fail simply because of an ever-worsening information crisis. The Covid-19 pandemic has impressively underlined this: a purely national response is hardly possible, and the exchange of information on many levels is unavoidable.

The question of different and better processing techniques is similarly urgent in the non-public sector. For example, whoever not only wants to meet the increasing information requirements of human resource management, but also wants to take the step towards a resource planning that is consistently adapted to their own production processes, whoever wants to communicate new forms of credit and payment as widely as possible and at the same time protect themselves against the associated risks, whoever includes ever more refined marketing strategies among the basic requirements of entrepreneurial policy, whoever wants to offer personalised, user-friendly services, or wants to develop new data-based business models in dealing with customer, employee or machine data, will inevitably abandon the previous forms of processing and link their further entrepreneurial activity with the demand for an information technology that will enable them to access the data required for the changed goals at any time and use it for the specific purpose.

From this point of view, the automation of data processing is not a product of chance, but rather the technologically compelling answer to the changes in information expectations directly associated with the changed structure of public administration tasks and the change in corporate focus. Consequently, neither the public administration nor private companies have hesitated to introduce and continuously expand automation. The advantage was all too obvious. For the first time, the opportunity to store any amount of data desired, to retrieve it effortlessly, to combine it at will and to use it for the most diverse purposes took tangible shape. The quantitative limits to the collection and storage of data, which had been considered insurmountable just a short time ago, appear just as obsolete as any doubts about a qualitative change in processing, illustrated not least by the speed and reliability of access and even more so by the multifunctional use of data that is possible at any time. With the development of ubiquitous computing, big data and artificial intelligence as well as the enormous spread of the corresponding hardware and software and the digitised programmes and services of all kinds based on them and for all areas of life, automated data processing has once again created further possibilities for the use and linking of data and raised automated data processing to a new level.

However, the other side of the coin cannot be overlooked. The very advantages that speak for automation increase the vulnerability of the individual, and indeed of society in general, many times over – if only because, thanks to the intensity of processing, erroneous information takes on a completely different significance. Automation accelerates the proliferation of misrepresentation and thus exposes those affected to the danger of economic, social or political discrimination on a scale that was previously difficult, if not impossible. At the same time, the possibility of external influence on decisions is increasing due to more and more knowledge about the decision-makers and their motives, values, preferences and resources. Because of the combination of information and information technology, these possibilities are increasingly in the hands of a few and thus pose anew the question of power and in consequence how to limit this power.

Every automation also at least tends to result in a loss of context – not only because the automation process, as experience shows, is usually associated with a reduction of the original information simply because of the transfer to the binary structures of digitisation, but above all in view of the fact that the multifunctional use of the data in question that is the aim of automation inevitably makes the individual details independent of the initial processing context. To put it differently and more concretely: the question of why someone was ill, why they did not show up at their place of work at certain times, why they refused to continue paying the instalments for the television set they bought, or what exactly the reasons were for the notes in the police files referring to them, fades more and more into the background. All that remains is the reference to the illness, the absences, the non-fulfilment of contractual obligations and the police record. Almost any of these details, however, can very quickly lead to further conclusions being drawn that could make it considerably more difficult for the data subject to, for example, advance professionally or even get the job they want. Especially in connection with comparable, likewise de-contextualised data of others, and particularly in the age of Big Data, an image of the data subjects can quickly emerge that both forces them into templates and at the same time misrepresents them.

Finally, as an instrument of rationalisation and a planning vehicle, automation not only favours monopolisation tendencies in the distribution of information and thus potentially a distortion of decision-making processes. Public administration or corporate databases by no means only lead to more efficient information, they also establish an information advantage that can easily turn into dominance. Automation also creates the conditions for the intensive monitoring of the individual, which is often only a preliminary stage in the attempt to influence their ideas and decisions in the long term.

As in the private sector, the comprehensive merging of data is associated with the hope of linking scattered and barely manageable data collections in order to guarantee the efficiency and economy of public administration or business activities. However, those who, for reasons that seem entirely plausible from organisational and rationalisation points of view, seek to define the “ideal” insured person, customer, patient, employee, recipient of social assistance or suspect, consciously or subconsciously build a bridge to the permanent monitoring of those concerned, and ultimately also to the monitoring of their behaviour and to discrimination. If the specific organisational and rationalisation goal is to be more than non-binding self-reflection, then indeed, at first glance, everything speaks in favour of closely monitoring whether the reactions and activities of the individual patients, employees, insured persons and suspects, taking these as our examples, are in harmony with the concrete wording of the “optimisation” specifications, and how “deviating” behaviour that endangers the achievement of the specific goals in question can be prevented and even predicted in the future.

It was the combination of the tendency to monopolise information and the latent effects on the scope for decision-making and action of the persons affected by the processing of their data that initially prompted the Hessian and ultimately also the European legislator not to question the necessity of automation, but to make every further step towards automated processing dependent on compliance with certain legally defined conditions. In the Explanatory Memorandum to the 1st Hessian Data Protection Act,[9] it was therefore expressly stated: “In view of the new possibilities of collecting, storing, processing and communicating information, and the faster access to stored information, it seems advisable to supplement the existing regulations on secrecy and the distribution and monitoring of public powers in order to avoid an encroachment on the citizens’ private sphere and a shift of weight between the legislature and the executive, as well as between state administration and self-administration”. In this respect, the legislative measure was by no means an expression of undefined “fears” or even an irrational hostility to technology. From the very beginning, the debate addressed the social and political consequences of automated processing. The aim was and is to face up to them, and so they were at the heart of all debates on the necessity and scope of legal regulation. The automation of processing is not a problem to be considered and dealt with exclusively from technical and organisational points of view, but is an eminently constitutional and political question.

bb) Necessary juridification of data protection

Secondly, the decision of the legislators to lay down the processing requirements was based on the conviction that there was no alternative to legislation. In view of the knowledge about the possible uses, the early national legislators as well as the European legislators refrained from focussing primarily on the content of the individual data. The starting and focal point for the legislative intervention was and is the processing itself. In order to counteract its possible consequences, it was from the outset to be directed along binding pre-determined paths. The legislators therefore attached a clearly preventive function to their decision. From the outset, the purpose of the processing regulation was therefore not to supplement and expand existing legal provisions, such as the provisions on the protection of personality, the aim of which was to compensate for damage that had already occurred. Their primary task was rather to prevent processing-related risks with the help of mandatory standards of conduct. This is necessary in view of the special characteristics of information and the resulting special conditions of restitution and compensation for damage. At the same time, however, it also poses the fundamental problem of data protection, namely that of warning against risks and dangers that cannot yet be identified, and for whose sake gains in efficiency and convenience resulting from the use of digital technologies would have to be foregone.

cc) Cross-sectoral nature of data protection law

Thirdly, in deciding to regulate processing requirements, legislators at both national and European level started from the premise that they would have to deal with problems that cut across the individual processing contexts and for which, in principle, it would easily be possible to develop a uniform solution system to be included in a single law. In other words, regulations were adopted that deliberately refrained from focussing on a specific processing context and instead formulated a global regulatory objective. Since then, data protection law has been a cross-sectoral issue. This also leads to the regulatory technique of general clauses, which were already to be found in the early laws and are now contained in the GDPR (→ mn. 36; 157 et seq.).

dd) Data protection and the power of knowledge

Finally, the legislature’s activity was also based on a desire to counteract an excessively privatistic understanding of data protection. In the literature, the processing regulations in Germany were often classified as a further example of the already rather long series of applications of the general right to the protection of personality, for reasons of greater legitimacy, but also with a view to making the legal classification as uncomplicated as possible.[10] After all, the issue concerns the conditions under which information may be used that relates to very specific persons, whose interests are thus directly affected. In this respect, it is indeed only too understandable if recourse is made to precisely that legal concept which is intended to protect the individual from violations of their personality.

However, the increasingly obvious threats to democracy and the rule of law posed by the use of personal data and the communication of the knowledge based on it to uninvolved third parties are a further legitimation for data protection going beyond the individual’s personality rights. To the extent that the processing of personal data is declared to be a subcategory of the general right to the protection of personality, the regulatory perspective also narrows. Dealing with processing and its consequences, with the resulting imbalances of power and restrictions of freedom, becomes a purely individual problem, for whose solution, apparently, only those principles can be considered that are otherwise also to be observed in the legal assessment of encroachments on individual legal positions. A typical example is the persistent attempt, at least in the past, to assign processing to a specific “sphere of personality”.[11] At least equally characteristic are constructions that seek to establish an individual, property-like defence right with the help of a “right to one’s own data”, or that even derive the processing requirements from the “data ownership” or the “data sovereignty” of the respective data subjects.[12] The consequences of the processing process for the communication structure and functioning of a democratic, constitutional and liberal society thus recede into the background. The field is then dominated by considerations about how a dogmatically convincing balance can be achieved between the claims to power of the individual “data owners”.[13] Hence, considerations that data are fundamentally available to the general public, that access and use should be made as broad as possible, are understandable under an individualistic approach to data protection, but fail to recognise its significance for democracy, freedom and the rule of law.

2. The Population Census decision of the Federal Constitutional Court

An important milestone in the development of not only German data protection law was the decision of the German Federal Constitutional Court (BVerfG) of 15 December 1983 on the 1983 Census Act.[14] Even though it was a national constitutional court decision, its impact has been felt far beyond Germany’s borders and the decision is often equated with the justification of fundamental data protection.[15]

Resistance to the census reflected the deep mistrust of information technology, the consequences of which were almost impossible to determine and which was becoming more and more ominous simply because of the rapid changes it was undergoing, and which, from the perspective of those affected, could all too easily lead to interventions aimed at controlling their behaviour, thus depriving them of the opportunity to shape their lives as they saw fit.[16]

It is precisely this background that determined the reasoning of the Constitutional Court. Specifically, the Court deliberately refrained from taking the most obvious route, namely that of deciding exclusively on the actual census law. The reaction of the population, their “fear of an uncontrolled collection of personal data” was, in the eyes of the Court, reason enough to first deal in detail with the requirements that the German constitution placed on the processing of personal data.[17] This is precisely why the significance of the decision extends far beyond the specific dispute. In its dicta, which were indeed deliberately programmatic in this respect, the Court – partly based on preliminary scholarly considerations[18] – set out the binding framework for all future considerations on the handling of personal data in Germany – at least until increasing Europeanisation – while also decisively indicating the path to be taken by the EU. Eight guidelines can in particular be derived from the decision.

First, whether processing is lawful is above all a question of constitutional law. Anyone who does not know whether and by whom information about them is being compiled is deprived of the opportunity to reliably assess the consequences of their behaviour as well as the reactions of their communication partners, and with increasing uncertainty forgoes the exercise of their fundamental rights. Processing not only threatens the “individual’s chances of development”, it also jeopardises the functioning of “a free democratic community based on the ability of its citizens to act and participate”.[19] The exercise of other fundamental rights depends on the existence of effective data protection. Strictly speaking, with these findings the court returned to the starting point of the discussion on the necessity of regulating the processing of personal data. It held that data protection, as the protection of fundamental rights, was the backbone of a liberal democracy.

Second, because the possibilities of the individual to develop as they see fit also and especially depend on the handling of data concerning them, the decision as to whether and to what extent the data in question is allowed to be processed must first and foremost be left to the individual. In a society in which, thanks to automated data processing, personal data can be stored indefinitely, can be “retrieved in seconds at any time without regard to distance” and finally “can be combined with other data collections to form a partial or largely complete picture of the personality without the data subject being able to adequately monitor its accuracy and use”,[20] “informational self-determination”[21] is therefore one of the basic prerequisites for the free development of the personality. The Federal Constitutional Court uses this concept of the “right to informational self-determination” as being synonymous with a right to data protection.

Third, the link between informational self-determination and the constitution forces possible restrictions to be regulated by law in a way that is both identifiable and clear to those affected,[22] a goal that can ultimately only be achieved with the help of sector-specific regulations, but one which requires a general framework.

Fourth, no matter how clearly the primacy of informational self-determination may be laid down, it is still not guaranteed without restrictions. Personal data by no means only provide information about the data subject, they are at the same time a “reflection of social reality”[23] and thus always tend to be the starting point for social, economic, and governmental activity, which is not least dependent on knowledge of the individual situation. However, the priority of informational self-determination means that the data subject may only be ignored in exceptional cases, or more precisely, only in those cases in which an “overriding general interest” argues in favour of processing.[24] In other words, the data subject’s decision is not a barrier that can be overcome at will. Moreover, the mere reference to an “overriding general interest”, however specified, is not enough. Rather, the data remain inaccessible as long as the specific information expectations in question have not been sanctioned by the legislator, namely with regulations that also and above all make it possible for the data subjects to understand at any time the conditions under which processing is possible.[25]

Fifth, the possible consequences of processing can only be assessed when there is certainty about the purpose for which the individual data are to be used. Because data can be put to open-ended uses, any assessment must be tied to the concrete use. Consequently, there is no such thing as “insignificant” data.[26] A processing regulation must therefore be based on the use, not the content, of the data.

Sixth, both the data subjects and the legislator can only form a reliable picture of the potential consequences of processing if they are able to have an overview of the processing process that is as precise as possible. The more consistent the use of the probably most striking advantage of automated processing, namely the multifunctional use of the stored data in recombination with other data, the more opaque the processing process becomes. Under these circumstances, the only remedy is a strict purpose limitation,[27] which enables constant feedback between use and purpose.

Seventh, the obligation to gear every processing operation to a clearly identifiable purpose laid down from the outset restricts not only transfers between private bodies but also and in particular the forwarding of personal data within the public administration. The transfer must not jeopardise the commitment to the original purpose of the collection. Therefore, a transfer can only be considered if the data in question is needed by the agencies involved to fulfil this purpose. The public administration – just like interconnected companies in the private sector – is not an information unit within which personal data can be freely exchanged. This corresponds to the protection mandate in the private sector against ubiquitous and purpose-unrelated data transfers.

Eighth, the effective protection of informational self-determination requires more than just the data subject’s prerogative to decide and targeted intervention by the legislator wherever this prerogative is to be restricted. Both certainly contribute to making the processing process more transparent and to disclosing its consequences in good time. The chances of data subjects to really understand the processing and administrative procedures are, of course, already limited with regard to the complexity of the processing procedure. Even more so, they are unlikely to be able to react to the constant changes in information and communication technology with proposals that could make it possible to anticipate the effects on the processing of personal data. Ultimately the same applies to legal regulations. They are inevitably limited to abstract specifications, i.e. they say nothing about the way in which the specific data is actually handled. Moreover, the legal regulations only ever reflect a certain state of technology. If, therefore, not only a maximum of transparency is to be ensured, but also the preventive function of the legal provisions guaranteed, the processing process must be subject to monitoring by a specially established, independent body, the supervisory authority, in each of its phases.[28]

With its Census decision, the Federal Constitutional Court adopted an approach to the definition of the conditions for processing personal data that was new and also paved the way for developments in Europe. The findings made in that decision can be transferred almost directly to the European constitutional concept of data protection. The direct link between informational self-determination and the Basic Law (the German constitution) excludes any doubt about the significance of data protection laws, the unambiguous demand for a clear purpose limitation permanently restricts the scope of processing from the outset, the explicit affirmation of the necessity of an independent supervisory authority finally strengthens the position of the supervisory authorities and – long before the primary legal anchoring in Art. 8 (3) CFR – rejects all efforts to undermine their rights.

The right to informational self-determination has subsequently experienced an impressive history of diffusion both in the judicial practice of the Constitutional Court and other courts as well as in the literature.[29] With shifts in normative content (caused by other judicial practice traditions and other textual bases), this also applies to the judicial practice of the CJEU, the ECtHR and courts of other Member States and in third countries.

In the aftermath of the decision, various adjustments were made by the legislature in German law – despite all resistance to the decision itself – in order to take into account the requirements of the Federal Constitutional Court. In addition to general regulations, the focus was on sector-specific regulations. However, the extent to which a separate regulation is necessary or whether certain expectations that are equally valid for a number of other processing contexts and therefore formulated in more general terms are sufficient, can only be answered when there is clarity about the parties interested in processing, their motivations and goals, the intended processing modalities as well as the desired data, and thus also and especially about the potential effects. Moreover, a sector-specific response does not necessarily require a specific legal regime, conceived as a counterpart to data protection laws, exclusively limited to a precisely defined processing context. On the contrary, the purpose and scope of the processing provisions in question can only be properly assessed if they are conceived from the outset as part of an overall regulation which primarily serves to define the structure, function and competences of the specific body responsible. Finally, the scope of regulation is once again determined by the requirements arising from the specific processing context.

II. International Development


1. National laws

The international development of data protection can be divided into four stages. In the first phase, from 1970 onwards, regulatory models were developed in various countries, which in part differed greatly. In the second, these were consolidated by the successful preparation and adoption of Convention 108 of the Council of Europe in 1981, which subsequently served as a model for further development. The third phase, which followed this adoption, was marked by considerable national legislative activity characterised by mutual learning processes. The fourth stage began with legislative activities within the EU and led in two steps (DPD and GDPR) to the increasing supra-nationalisation of (European) data protection law.

a) Statutory origins and competing national regulatory models

The first stage[30] began with the 1970 Hessian Data Protection Act and continued with the Swedish Data Protection Act of 1973, the US Privacy Act of 1974, the Privacy Committee Act of the Australian province of New South Wales of 1975 and the Canadian Human Rights Act of 1977, as well as the German Federal Data Protection Act of 1977. This was followed in 1978 by the French Law on Electronic Data Processing, Files and Liberties, the Danish Private Registers Act, supplemented in 1987 by a Public Authorities Registers Act, the Norwegian Personal Data Registers Act and the Austrian Data Protection Act. Finally, a year later, the Luxembourg Law on the Use of Personal Data in Data Processing was adopted, the last legislative regulation before the 1981 Council of Europe Convention on Data Protection (→ mn. 41 et seq., 54 et seq.), which also marked the end of this first period.

Unlike in most cases of legislative intervention, the legislators were nowhere able to fall back on extensive case material collected over many years, possibly also developed with the help of a large number of judicial decisions. Rather, the problems associated with automation were only beginning to emerge, but could not continue to be neglected if the legislature were to preserve its chance to absorb the political and social consequences of the radically changed processing conditions. The legislative task thus suffered from an unusually high degree of uncertainty. From this point of view, the legislator had no choice: it had to give preference to forms of regulation that used a maximum of flexibility to compensate for the disadvantage of an only vaguely describable subject matter. However, the reaction was by no means uniform. Strictly speaking, three regulatory models can be distinguished:

The German and Austrian Data Protection Acts[31] were particularly typical examples of an all-encompassing regulation and were accordingly very general. In other words, the legislator’s lack of information was hidden behind a large number of general clauses. The legislator saw their elasticity as the most suitable means of achieving a regulation that was both as comprehensive and as adaptable as possible.

The Swedish legislator took a different approach.[32] The information deficit was not even disputed or veiled. The legislator therefore openly refrained from linking the processing procedure to conditions formulated in detail, but instead limited itself to establishing a supervisory authority and at the same time making every automated processing of personal data dependent on an authorisation from this authority. The focus was thus on a procedure that forced the disclosure of the processing purposes and modalities. The authorisation requirement gave the supervisory authority the opportunity to react specifically, to formulate its expectations on the basis of the situation at hand, and thus to maximise the protection of data subjects by imposing precise conditions.[33] For these very reasons, the Norwegian legislator even went a step further. Unlike in Sweden, the authorisation requirement was not limited to automated processing, but also covered personal data registers containing sensitive data, such as information on the health of the data subjects or their criminal record.

The USA in particular pursued a third, different approach.[34] Although the legislator was prepared to include certain ideas on the course of the processing procedure in the legal provisions, it deliberately refrained from a general regulation in favour of a clearly sector-specific approach. Typical of this was the Fair Credit Reporting Act of 1970. The granting of credit was one of the cases that sparked the debate on the dangers of a “dossier society”, especially in the USA.[35] It was therefore not surprising that the federal legislature, especially against the background of increasing consumer protection, focused its first reaction to the processing of personal data entirely on credit institutions.[36] The reason for the legislative intervention, however, also determined its limits: it was by no means intended to deal with data protection issues in general, but rather only to address the problems that arose in the field of consumer credit.

The same applies to the legal regulations passed later, for example within the framework of the Public Health Services Act, the Drug Abuse Office and Treatment Act or the Family Educational Rights and Privacy Act.[37] There, too, the legislator remained with a decidedly selective reaction. It was only logical that the Commission provided for in the Privacy Act to assess its impact expressly refused to recommend a general data protection regulation and instead only called for further sector-specific regulations, for example for the processing of employee data, the use of personal information in the insurance sector or the better protection of social data.[38]

Not every law corresponded to one of the three regulatory models. Some legislators tried to combine the different approaches. For example, the French Data Protection Act[39] was initially reminiscent of the Swedish regulation. As in Sweden, there was an obligation (Art. 15) to process personal data only after authorisation by the Commission Nationale de l’Informatique et des Libertés (CNIL). However, unlike in Sweden, the law did not see authorisation as the actual and decisive means of regulation, but only as an additional protective measure, in principle limited to the public sector. For non-public bodies, on the other hand, only a notification was provided for (Art. 16). Consequently, the focus shifted to a multitude of material statements on processing (Art. 25 et seq.), which were completely in line with the corresponding provisions in Germany or in Austria.

b) The influence of Convention 108 of the Council of Europe

The internationalisation of processing principles was instrumental in overcoming the passive, if not hostile, attitude of some legislators. At the latest since the Council of Europe’s Convention 108 for the Protection of Individuals with regard to Automatic Processing of Personal Data (→ mn. 54 et seq.),[40] it was impossible to ignore the danger of not being able to participate in international data exchange at all in the future due to the lack of a national data protection regulation, thereby both losing access to a large amount of information that is particularly important for individual business sectors and having to forego processing orders that are no less economically interesting. The history of British data protection legislation is probably the best example of this. It was only under the impact of the Convention and its possible consequences that the British government was prepared to introduce a draft that was entirely in line with the Convention requirements.[41]

c) Consolidation and mutual learning processes

After 1980, a revision of the original regulatory ideas began. In the meantime, it had become all too clear that none of the regulatory models could really do justice to the task of linking the course of processing to effective conditions and at the same time ensuring essential preventive control.

The licensing model, for example, was hardly manageable simply because of the excessive bureaucratisation inevitably associated with its implementation.[42] The more self-evident it became to process personal data automatically, the harder it was to keep up with the growing number of licence applications. The expectation of being able to impose on controllers precise requirements adapted to the concrete processing situation thus became more and more of a fiction. It is no coincidence that the focus in France increasingly shifted to the simplified notification provided for in Art. 17 of the Data Protection Act.[43] Although superficially helpful, this reaction abandoned the aim of integrating processing from the outset into a regulatory scheme that was as concrete as possible, an experience that was repeated in the “national models”[44] developed by the CNIL.

In the same way, however – and this is also of relevance to the GDPR (→ mn. 157 et seq.) – the regulatory model that thought it could manage with few provisions that were as general as possible was not entirely convincing. The general clauses did not prove to be ideal instruments for a continuous adaptation to the manifold effects of a constantly changing processing technology. The deliberately vague wording of the legal provisions practically provoked very different, even contradictory interpretations and thus strengthened the position of the controllers. In this respect, it was left to them, at least for the time being, to set the accents of interpretation in favour of their interests.

Finally, the consistently sector-specific regulatory model did not help to address the processing problems efficiently and convincingly either. If the legislator is deliberately content with selective interventions and is not guided by a regulatory concept that clearly defines the objectives of intervention as well as a set of processing principles, then, as the example of the US Computer Matching Act of 1988[45] clearly shows, the result must inevitably be a scattered regulation.[46] Under these circumstances, the controllers see themselves more than ever in a position to push through their expectations, especially when there is not even a supervisory body that could, with the help of a systematic review of the various processing areas, ensure that at least the most glaring contradictions between the various regulations are mitigated.[47]

In short, none of the first regulatory models developed between 1970 and 1980 can claim to have shown the right way to deal convincingly with the processing of personal data. Rather, each of them emerged under specific historical conditions and therefore only provided an understandable and legitimate basis for reflection on a processing regulation as long as these conditions remained unchanged. Only data protection legislation that is understood as a permanent learning process, not least characterised by changes in processing technology, can fulfil its task.

Understandably, the attempts to change the law in the individual countries[48] did not always proceed in the same way. Nevertheless, they showed a number of common tendencies. The more the emphasis shifted from general considerations on data protection to the question of the effectiveness of normative requirements, the more differentiated the regulatory process became. The legislator did hold on to the goal of formulating generally applicable requirements. Almost always, the issue was still the hard core of processing principles that can be found in the German Constitutional Court’s Population Census decision (→ mn. 19 et seq.) as well as in the Council of Europe’s Data Protection Convention (Art. 5; → mn. 54 et seq.). The actual regulation of data processing, on the other hand, increasingly shifted towards provisions that were directly tailored to individual processing situations.

The legislators also insisted more firmly than ever on comprehensive monitoring. The supervisory authorities were therefore increasingly granted rights that allowed them both preventive and corrective interventions, up to and including a ban on processing.[49] Of course, these powers of intervention came at a price. The supervisory authority that was given more powers had to be subjected to (judicial) control in turn.

Finally, the scope of application of the processing requirements determined by the law expanded considerably. The majority of data protection laws were only interested in whether personal data were to be processed, i.e. they did not first ask about the form of processing. Consequently, the tedious disputes about when there was automation receded into the background. Instead, what dominated was the attempt to limit the scope of application functionally, i.e. to exclude processing carried out for purely “personal reasons” or merely for “normal private purposes”.

d) The supranationalisation of data protection law

The fourth stage of development is marked – specifically in Europe – by a noticeable decline in the regulatory competence of national legislators. Since 1995, the EU Member States’ scope for action and decision-making has basically been governed by the provisions of the Data Protection Directive (DPD) and all other legal acts of the Union relating to the processing of personal data (→ mn. 94 et seq.). Unlike the Council of Europe’s Data Protection Convention (→ mn. 54 et seq.), the Member States had no choice. It is therefore hardly surprising that Member States such as Greece and Italy, which still had no regulations despite long discussions, some of which went back to the 1980s, were the first to pass data protection laws.[50] It is equally unsurprising, however, that the implementation of the DPD proved to be particularly difficult in countries where data protection could boast a long tradition, such as in France and Germany, and even required legal action by the Commission in these cases. The efforts on the part of each country to push through as many of its own ideas as possible, which had already become apparent during the deliberations on the Directive in the Council,[51] were repeated in the implementation.

In fact, the DPD’s sphere of influence extended beyond what was then the EC. The obligation to transfer personal data only to third countries that guaranteed an “adequate level of data protection” (Art. 25(1) DPD, now Art. 45(1)) favoured and promoted a progressive harmonisation of data protection legislation. Art. 25 DPD did indicate that the routes to a “level of protection” corresponding to the Directive could be quite different.[52] However, the common goal did not remain without consequences for the regulatory content. The increasingly intensive, at times global, exchange of personal data strengthened the interest in processing rules that were as uniform as possible. Rules that are largely coordinated with each other are one of the most important prerequisites for fast and smooth access to the desired data. In the same way as the Council of Europe’s Data Protection Convention (→ mn. 54 et seq.), the DPD has therefore developed into a regulatory model used internationally. The new versions of the Icelandic and Norwegian Data Protection Acts adopted in 2000, which were adapted to the DPD, are typical examples. The Council of Europe has also increasingly sought to take the DPD into account in its deliberations on new recommendations. Finally, where “adequacy” was examined and determined by the Commission, the influence of the Directive also increased.[53]

The GDPR – together with the planned ePrivacy Regulation – is the provisional conclusion of this supranationalisation of (European) data protection law, even if considerable effort is currently being made to subject other sub-areas of digitisation to regulation (for example through the Digital Services Act, the Digital Markets Act and the Artificial Intelligence Act). With the switch to the instrument of a Regulation and the considerable expansion of the regulatory scope, including the establishment of the EDPB as the final decision-making authority in the consistency mechanism, the Member States are losing a considerable amount of legislative and administrative decision-making power to Union level. This applies despite all the opening clauses of the GDPR. Due to the regulatory content of the GDPR and the prohibition on introducing national rules that merely reproduce provisions of Union law,[54] all that remain are rump regulations. These may still reflect specific national features in some areas, but can no longer claim to be a coherent regulatory approach of their own. This also applies to the public sector. Here, there is still considerable national leeway compared to private data processing, but the legislator is losing the “basic regulation” provided by a generally and subsidiarily applicable data protection law. It remains to be seen whether the lack of a general national data protection law will lead to further fragmentation and inconsistencies in sector-specific data protection law.

In addition, the GDPR significantly advances the adaptation of the laws of non-EU countries through the marketplace principle: not only does adequacy remain the determining criterion for the permissibility of transfers to third countries, but the GDPR now also actively requires compliance when data is processed in third countries if the processing is connected with the offering of goods or services to data subjects in the Union or with monitoring their behaviour in the Union (Art. 3(2)). The GDPR thus has de facto even more far-reaching effects on the global design of data protection law. It remains to be seen what impetus this will generate in the long term. However, it is already apparent that important trading partners such as Japan and Brazil are shaping their data protection laws accordingly, and the GDPR has also prompted reform efforts in the USA (e.g. the CCPA).

2. International regulations

a) Council of Europe

aa) History and structure of Convention 108

No other international organisation has had anywhere near as lasting an influence on the development of data protection – at least until the GDPR – as the Council of Europe. As early as 1968, the Consultative Assembly had asked the Committee of Ministers to examine whether the 1950 European Convention on Human Rights (ECHR) and the national legal systems were sufficient to protect the individual against the consequences of technological change in the use of personal data. The answer, however, could only be negative, also and especially as far as the ECHR was concerned. The individual’s right to “respect for his private and family life”, guaranteed in Art. 8, had not been further developed into a fundamental right to data protection (on later developments → mn. 126 et seq.).[55] Five years later, the Committee of Ministers adopted a first resolution on the processing of personal data in the private sector.[56] In 1974, the second resolution followed, this time dedicated to the public sector.[57] In 1976, the Committee of Ministers mandated a commission of experts to prepare a convention that would also take into account the cross-border exchange of data.[58] A draft was presented by the commission in May 1979. A year later, the Committee of Ministers adopted the text, which was revised by a second group of experts. Nevertheless, it was not until 28 January 1981 that the “Convention for the Protection of Individuals with Regard to Automatic Processing of Personal Data”[59] was finally presented for signature. After ratification by France, Norway, Sweden, Spain and Germany, it entered into force on 1 October 1985, initially between these five states (Art. 22(2)).

From the very beginning, the Council of Europe was keen to ensure that the Convention had a broad impact. For this reason, it not only sought intensive cooperation with the OECD,[60] but also advocated an open convention, i.e. not limited to its Contracting States (Art. 23). The Convention contains a number of suggestions which on the one hand provide starting points for national regulations, but on the other hand are also intended to guarantee a minimum degree of conformity between these regulations. Art. 4(1) calls on the Parties to take the necessary measures to give effect to the basic principles for data protection set out in the Convention. Accordingly, the Convention is not even partially directly applicable.[61] On the contrary, the Convention is, according to its entire history and intention, a “non-self-executing treaty” from which no rights can be derived directly.[62]

bb) Essential normative content

The scope of application of the Convention covers the automatic processing of personal data (Art. 1) in both the public and private sectors (Art. 3(1)). Dealing with the consequences of automation seemed to be the most urgent task for the Council of Europe.[63] However, the Parties can explicitly decide whether they will also apply the Convention to manual processing (Art. 3(2)(c)). Conversely, according to Art. 3(2)(a), it is permissible, after notification, not to apply the Convention to certain types of automated files or data collections.[64] Art. 3(1) in conjunction with Art. 2(a) limits the scope of application to the protection of natural persons against the consequences of automation. At the time of drafting, this was not as self-evident as it is today. In fact, when the first Data Protection Acts were drafted, a number of countries had argued against the restriction to natural persons, albeit for quite different reasons.[65] The Council of Europe therefore opted for a compromise. It is for the national legislator to decide whether the Convention is applicable to groups of persons, associations, foundations, companies, corporations or other bodies with or without legal personality (Art. 3(2)(b)).

The Convention lays down five processing principles (Art. 5) which, together with the special provisions on the processing of sensitive data (Art. 6) and the rights of data subjects (Art. 8), form the “hard” core of data protection, to be covered by specific domestic sanctions (Art. 10):[66] personal data must, firstly, be collected and processed lawfully and fairly; secondly, be used only for specified legitimate purposes; thirdly, be relevant to the purpose of the processing and adequate in scope; fourthly, be accurate and kept up to date; and fifthly, be preserved in a form that permits identification of the data subjects for no longer than is necessary for the purpose of the processing.

Special rules apply to sensitive data (Art. 6).[67] This approach has now been established in Art. 9 GDPR via Art. 8 DPD, but has still not been implemented satisfactorily (→ Art. 9 mn. 12 et seq.).

It is hardly surprising that despite all attempts[68] it has not been possible to develop standards that allow sensitivity to be determined beyond doubt. Equally unsurprising are the ever-lengthening lists of cases in which the initially very restrictively worded conditions are relaxed, with the result that processing ultimately becomes the norm. The decisive factor is not the data, but rather the specific processing context.[69] In the same way, it cannot be denied that the degree of sensitivity of the processing of health data varies greatly depending on whether it is to be processed by a hospital or a credit agency.

Art. 6 lists the sensitive data. They include information on racial origin, political opinions, religious beliefs and criminal record, as well as data on health and sexual life. However, the list is not exhaustive, and national legislators are free to add to it.[70] Art. 6 does not prohibit the automated processing of these data, but suspends it until “adequate” safeguards are introduced under the domestic law in question. The seemingly categorical statement in favour of stricter processing conditions thus loses much of its meaning.

Art. 8 lays down the rights of the data subjects, namely the rights to information, correction and deletion. The data subjects are not only to be informed about the specific data used and the purpose of the processing, but also about the habitual residence or the registered office of the controller (Art. 8(a)). Art. 8 also requires regular information in a form that is intelligible to the data subjects (Art. 8(b)). The national law must also provide for legal remedies for enforcement (Art. 8(d)).

cc) Organisational requirements

Due to differences of opinion among the Contracting States, the Convention itself does not contain any provisions on independent supervisory bodies. It was only under the influence of further national data protection laws and Art. 28 DPD that this was made good in an Additional Protocol to the Convention.[71] Just like the DPD, the Additional Protocol refrained from fixing on one of the existing national supervisory models.

The supervisory authority shall exercise its duties in “complete independence” (Art. 1(3) of the Additional Protocol). The Council of Europe thus reiterated and confirmed the basic principle of credible, effective processing monitoring enshrined in the national laws and Art. 28(1) second sentence DPD. The Additional Protocol also provides that the supervisory authority must have powers of investigation and intervention as well as the right to bring an action before the competent court in the event of a breach of the processing standards to be implemented by the national legislators (Art. 1(2)(a)). It further emphasises the right of every individual to appeal to the supervisory authority (Art. 1(2)(b)), advocates the possibility of judicial review of the decisions of the supervisory authority (Art. 2(4)), and treats the cooperation of the supervisory authorities as part of the cooperation between the Contracting Parties at which the Convention (Arts. 13 et seq.) aims in any event.

dd) International data transfers

The Council of Europe adopted rules on transborder data flows in Art. 12. According to paragraph 1, these apply not only to automatically processed data, but also to data that has been manually collected for such processing. This means that automated transfer is not a requirement; the means of transport is irrelevant.[72] Paragraph 2 stipulates that the mere reference to the protection of the privacy of the data subject does not legitimise preventing the transfer of personal data to the territory of other Parties or making it subject to special authorisation. In the conflict between “free exchange of information” and “data protection”, the Convention, despite all claims to the contrary, by no means opts for a basically unhindered international circulation of personal data. On the contrary, only the existence of a minimum regulation accepted by all contracting states paves the way for the transborder transfer of data. This is also expressed in Art. 2 of the Additional Protocol, which in principle binds data transfers to recipient countries that are not parties to the Convention to an adequate level of protection.

Restrictions below the level of prohibitive interference, such as a reporting obligation, remain permissible. The national provisions also form the yardstick for the permissibility of the transfer.[73]

The Convention creates an exception to the rule in Art. 12(2) in two cases. Firstly, if national law contains specific regulations for certain categories of personal data or automated files/collections of files because of the nature of these data, higher requirements may be imposed if there are no “equivalent” safeguards in the recipient state (para. 3(a)). The condition is similar to Art. 6, but is not congruent with it, since para. 3(a) does not list any categories.[74]

The second exception is intended to prevent attempts to use Contracting States as intermediaries in order to process personal data in a country that is not bound by the provisions of the Convention (Art. 12(3)(b)). Where the transfer is merely intended to pave the way to such a “data protection-free” third country, each member state is entitled to prohibit the transfer. However, the application of Art. 12(3)(b) is complicated, as the intention to transfer will be difficult to prove and it is unclear whether the rule also applies if the data are not transferred on immediately but are first processed in the intermediate country.[75]

The Convention itself does not say anything about the transfer of personal data to states that have not ratified the Convention.[76] Only the Additional Protocol (→ mn. 63) brought the necessary clarity here as well. Like Art. 25(1) DPD, Art. 2(1) of the Additional Protocol requires that transfers be limited to third countries that have an adequate level of data protection. Unlike in the DPD, however, there is no mechanism to determine this level. The exceptions are similar to Art. 26 DPD, but much more general: specific interests of the data subjects and legitimate interests, especially public interests (Art. 2(2)(a)), as well as guarantees that can arise from contractual clauses, if these are deemed sufficient by the relevant supervisory authority (Art. 2(2)(b)).

ee) Mutual assistance

In the interest of both a consistent and uniform application of the Convention, it obliges the Parties to assist each other (Art. 13(1)) and at the same time prescribes a procedure to ensure, on the one hand, a smooth exchange of information on the legal as well as the factual situation of data protection in the individual Contracting State (Art. 13(3)) and, on the other hand, to give data subjects the opportunity to assert their rights as laid down in Art. 8 of the Convention even if they are abroad. Both tasks are to be performed by bodies that must be specifically designated by the individual Party (Art. 13(2)).

The text of the Convention does not contain any indication as to which authority the Council of Europe believes is most likely and best able to take on these tasks. The Explanatory Memorandum indicates a clear preference for the bodies entrusted with the supervision of data protection.

Although the procedures provided for in Arts. 13 et seq. contribute to a better application of the Convention, they are not suitable for ensuring maximum uniformity in its application, nor for correcting or even further developing the Convention. A Consultative Committee was established for this purpose (Art. 18). Its function is to make proposals for facilitating or improving the application of the Convention and for necessary amendments, but also to comment on individual questions raised by the Parties (Art. 19).[77] Whether these goals can be achieved depends to a large extent on the composition of the Committee. Until the Convention entered into force, most Contracting States had appointed independent experts, whereas now they tend to appoint purely government representatives, a decision which deprives the Committee of much of the chance of an open critique, also and especially with a view to updating the Convention.

ff) Modernisation of the Convention

Even though the Council of Europe’s Data Protection Convention, like any data protection regulation, reacts to information technology and must therefore be subject to the proviso of a continuous, technology-oriented review, the Council has refrained from a periodic revision. Reflections on “modernisation” only began in 2010, not least with reference to the increasing importance of cloud computing and social networks.[78] In 2012, the Consultative Committee presented detailed proposals for a revision of the Convention,[79] which were consolidated by an ad hoc committee on data protection (CAHDATA) in 2016.[80] The main proposed innovations include:

a) a clear specification of the data protection principles (Art. 5);

b) an extension of the catalogue of sensitive data (Art. 6);

c) specifications on data security and an obligation to notify (only) the supervisory authority in the event of serious security breaches (Art. 7);

d) significantly increased transparency of processing for data subjects (Art. 8);

e) an extension of data subjects’ rights (Art. 9);

f) obligations regarding technical and organisational measures, including requirements for the form of processing operations (Art. 10);

g) precise information on the requirements for adequate data protection in the case of transborder transfers of personal data (Art. 14(3)).

The proposal for the revision of the Data Protection Convention was adopted by the Committee of Ministers of the Council of Europe on 18 May 2018. It will enter into force in accordance with Art. 37 of the Protocol as soon as all Parties to the Convention have ratified it, or alternatively on 11 October 2023 (or thereafter) if 38 ratifications have been received.[81]

However, even before this revision activity, the Council of Europe was very soon forced to realise that general processing principles as contained in the Convention were not sufficient. It therefore argued very early on in favour of supplementing the Convention with a series of recommendations. The Committee of Ministers adopted the first of these recommendations (on automated medical databases) five days before the Convention was opened for signature.[82] Recommendations followed e.g. on the use of personal data in the areas of scientific research and statistics, direct marketing, social security, police activities, labour relations, payments, information provided to third parties by public authorities, telecommunications services, the handling of medical data, the internet, insurances, profiling, social networks, search engines, internet intermediaries, net neutrality, the rights of children, and algorithmic systems.[83]

As different as the starting points were, one conviction clearly runs through all the recommendations: the effectiveness of data protection depends crucially on the ability to react specifically to concrete processing situations that are particularly important from the perspective of the data subjects.[84] In the course of this area-specific concretisation, some of the positions taken by the Convention were also called into question, for example the limitation to automated processing (apart from the exception in Art. 12(1)).[85]

Although recommendations offer the opportunity to react more flexibly and innovatively, they have the disadvantage of being completely non-binding. Nevertheless, their impact should not be underestimated. Not only are they linked to a convention whose international appeal is undisputed, but they also consistently deal with issues that play a central role in each of the Member States. The recommendations document the consensus on the need to define the processing conditions in the specific areas addressed and clearly indicate what the benchmarks of such a regulation would have to be. It is therefore hardly surprising that individual Parties have repeatedly felt compelled to make use of the possibility to make reservations.[86] However, each reservation also confirms that the Council of Europe’s recommendations are certainly capable of influencing national law and further developing the regulation of the processing of personal data.

gg) Relationship with the Union and ongoing relevance of the Council of Europe’s regulations

The Convention, the Additional Protocol and the Recommendations illustrate the intensity with which the Council of Europe has dealt with data protection issues for almost four decades, thus also creating the conditions for a broad international recognition of data protection. However, since the adoption of the DPD and even more so of the GDPR, its influence has continued to decline, especially since the Commission is more concerned than ever with ensuring that the EU states in the Council of Europe adopt a uniform position as far as possible. The participation of Commission representatives in the Council of Europe’s committees and especially in the “Consultative Committee” is just as indicative of this as the immediate reactions of the Commission to individual regulatory proposals.

With the DPD and also under the application of the GDPR, the Convention has now acquired special significance for the members of the Council of Europe that are not also members of the Union or the EEA,[87] as well as for states that accede to the Convention at the invitation of the Committee of Ministers in accordance with Art. 23.[88] Following Brexit,[89] the Convention could experience a significant renaissance in this respect, unless the United Kingdom denounces the Convention as well. For the Member States of the Union, on the other hand, the Convention has only an auxiliary function. In this respect, it is hardly surprising that the Council of Europe is also adapting its provisions to Union law, as can be seen from the Additional Protocol and the current reform debate.

b) OECD

The Organisation for Economic Co-operation and Development (OECD), like the Council of Europe, came out in favour of regulating the processing of personal data at a very early stage.[90] Unlike the Council of Europe, however, its focus was not on the consistent extension of a fundamental rights document (the ECHR), but on the fear that national data protection regulations could at the same time become the harbingers of a new protectionism.[91] From the OECD’s point of view, an international regulation on transborder data exchange therefore had the primary task of preventing potential trade barriers and protecting the free flow of information.[92] The preparatory work finally led to the “Guidelines governing the Protection of Privacy and Transborder Flows of Personal Data”,[93] adopted by the OECD Council on 23 September 1980. In contrast to the Council of Europe Convention, this is not a binding instrument under international law, but merely a set of proposals.[94] In view of the development of information technology and the globalisation of data processing, the 1980 Guidelines were replaced by a revised version in 2013.[95] In addition to various streamlining measures, the revised Guidelines contain a number of new provisions.

The Guidelines begin with a general section and cover public and non-public bodies alike (No. 2). The Guidelines proper are introduced with two explicit caveats. First, they must not be interpreted in a way that would exclude differentiated protective measures adapted to the nature of the data and the processing context, and second, they must not unduly restrict freedom of expression (No. 3). No. 4 requires that exceptions to the Guidelines, including those relating to national security and public policy, be kept as few as possible and be made known to the public. Finally, No. 6 explicitly defines the Guidelines as minimum standards that may be supplemented and expanded, even if this were to make international data exchange more difficult.

The Guidelines lay down eight processing principles (Nos. 7-14), which are in line with the principles of the Council of Europe Convention (→ mn. 54 et seq.). They provide for the obligation to collect personal data lawfully and, where appropriate, with the knowledge or consent of the data subjects (No. 7), the need for clear purpose limitation (Nos. 8-10), the obligation to take security measures (No. 11) and to ensure transparent processing (No. 12). Data subjects have a right of access (No. 13), and the controller is responsible for compliance with the principles (No. 14). In 2013, the revised Guidelines added a new No. 15 containing the obligation to operate a “privacy management programme” and to demonstrate its compliance to the supervisory authorities, which must also be notified of significant security incidents.

The transborder exchange of data, which is of particular importance to the OECD, is addressed in Nos. 16-18. Controllers remain accountable for the handling of the data they use regardless of where the data are located (No. 16). No. 17 requires that data exchange not be restricted if the recipient state either substantially observes the Guidelines or provides sufficient safeguards to ensure continuous protection in accordance with the Guidelines. Restrictions on transborder flows must be proportionate to the risks presented, taking into account the sensitivity of the data and the purpose and context of the processing (No. 18). The internationalisation of processing is one of the most important regulatory areas for the OECD, as is also expressed in the provisions on increased cooperation between member countries and on interoperability (Nos. 20-23).

No. 19 contains measures for the implementation of the Guidelines. These include national privacy strategies, laws protecting privacy, supervisory authorities, self-regulatory mechanisms (particularly codes of conduct), enforceable data subjects’ rights, adequate sanctions and remedies in case of failures to comply with laws protecting privacy, complementary measures (i.e. educational measures and awareness raising), the involvement of actors other than data controllers, and measures to prevent unfair discrimination. In contrast to the model of the Council of Europe, No. 19(d) of the new Guidelines also places the instruments of self-regulation on the same level as legal regulations.[96] Consequently, in the case of transborder flows, legal systems that do not have any processing regulations but where the controllers have recourse to relevant provisions which they have drafted themselves would have to be regarded as equivalent. Such an interpretation, however, would be in clear contradiction to the principles of the Council of Europe and Arts. 44 et seq. GDPR.

The OECD’s primary interest can be seen above all in its repeated efforts to protect transborder data exchange from disruptive restrictions and, at the same time, to firmly advocate self-regulation for both nationally focussed and internationally oriented processing. The guidelines on cryptography[97] and the declaration on data protection in global networks[98] already pointed in this direction.

The efforts supported by the OECD to replace interventions by the supervisory authorities and even more so by the courts with mediation procedures specifically tailored to data protection conflicts are certainly comparable with a tendency towards a maximum autonomy of self-regulation. However, to the extent that binding processing principles are laid down nationally and supranationally, especially against the background of a constitutional safeguarding of data protection, there can be no arbitrary handling of the legal guarantees. Accordingly, mediation procedures may facilitate the overcoming of processing conflicts, but they cannot be seen and treated as a waiver of or even an antidote to the application of the constitutional requirements as well as the legal provisions directly linked to them.

c) United Nations

Historically, the UN’s efforts to develop data protection principles[99] go back just as far as the parallel efforts of the Council of Europe (→ mn. 54 et seq.). On 19 December 1968,[100] the General Assembly requested the Secretary General to examine the impact of scientific and technological development (including automated processing methods) on human rights and to consider what restrictions such methods should be subject to in a democratic society.[101] At the request of the Commission on Human Rights, the Sub-Commission on Prevention of Discrimination and Protection of Minorities (now the Sub-Commission on the Promotion and Protection of Human Rights) submitted draft guidelines on the processing of personal data in automated files on 23 June 1985.[102] A slightly modified version was adopted by the Commission on Human Rights[103] in 1988 and approved by the General Assembly on 14 December 1990.[104]

Like the OECD Guidelines (→ mn. 80 et seq.), the UN Guidelines contain only a series of recommendations that are explicitly limited to a “minimum standard” (Part A – heading) and are addressed to the Member States. Part B then declares that the processing principles formulated for the Member States are also to form the basis for the internal regulation of international organisations. Like the Council of Europe Convention on Data Protection and the OECD Guidelines, the UN Guidelines refrain from proposing a processing regime only for specific entities, and cover both the public and private sectors.

The Guidelines nevertheless allow for an exception: Member States are free not to adhere to the processing principles that would otherwise have to be observed in the case of files that serve “humanitarian aid” or the “protection of human rights and fundamental freedoms”.[105] The UN has thus attempted to take into account the processing conditions that are particularly typical for the International Red Cross, Amnesty International or the UN High Commissioner for Refugees.[106] None of these organisations could in fact help the victims of political persecution or racial discrimination, for example, if they were only allowed to store data with the consent of the data subjects. The Guidelines therefore tolerate a modification of the generally applicable processing principles without, however, specifying how far it may extend in detail. This leaves considerable room for manoeuvre in view of the significant differences between the organisations in question.

In the processing principles, the UN Guidelines closely follow the Council of Europe’s Data Protection Convention and the OECD Guidelines. For example, in addition to the obligation not to collect or process data in unfair or unlawful ways (1) and to process only accurate, relevant, complete and continuously updated data (2), the Guidelines provide for purpose specification and limitation (3), guarantee certain rights to data subjects (4) and require special rules for sensitive data (5). In one respect, however, the Guidelines go further: they are the first international document to explicitly require the establishment of an independent supervisory authority (8). Finally, the Guidelines also take a position on transborder data flows (9). Like the Council of Europe’s Data Protection Convention, they give priority to data protection but avoid engaging in a similarly complicated regulation. Instead, they merely state that an exchange may proceed freely as long as comparable safeguards exist in the recipient state. If this is not the case, only the limitations necessary to protect the data subjects may be imposed.

At UN level, legal developments have also been significantly influenced by the rulings of the Human Rights Committee (→ mn. 54). Moreover, the revelations of the NSA’s surveillance practices have led to further action. Largely at the initiative of Germany and Brazil, the UN General Assembly adopted resolutions on the “right to privacy in the digital age” in 2013 and 2014.[107] In 2015, a Special Rapporteur on the right to privacy was appointed (initially Joseph A. Cannataci, since 2021 Ana Brian Nougrères). As part of the mandate granted by the Human Rights Council,[108] the Rapporteur has significant advisory functions, gathering information on data protection law and practice, but also reporting on violations of Art. 12 UDHR and Art. 17 ICCPR. In early 2018, Cannataci submitted a “Working Draft Legal Instrument on Government-led Surveillance and Privacy”,[109] which has, however, been called “unnecessary” by members of the Human Rights Council.[110]

Under the umbrella of the UN, there is also the International Labour Organisation’s Code of Practice.[111] It is primarily aimed at employers and employees, but can also be seen as a supplement to existing legal provisions or as a predecessor to a legal regulation. Such a non-binding instrument was appropriate because the small number of national data protection laws and the sometimes considerable differences between national labour laws made all efforts to agree on internationally binding rules difficult.

The code of practice focuses on five main issues. First, it specifically includes business and personnel consultants (13.1), to whom important parts of recruitment are increasingly being shifted. Second, it does not see the consent of the data subject as a fully-fledged alternative to legal regulations and ties the permissibility of processing to a legal regulation in a number of explicitly enumerated cases, such as the use of medical (6.7) or genetic data (6.12). Third, it requires a regular review of the processing modalities with the aim of reducing access to personal data as much as possible (5.7). Fourth, it restricts the processing of personal data by employee representatives (10.10). Finally, it emphasises the necessity of collective-law regulations (12.2).

 

 

[…]

 

 

 

[1] GVBl. I 1970, 625; see Reh, Gegenstand und Aufgabe des Datenschutzes in der öffentlichen Verwaltung (1974), especially 23 et seq.; Simitis, Zwanzig Jahre Datenschutz in Hessen – eine kritische Bilanz, Anhang zu Hessischer Datenschutzbeauftragte, 19. Tätigkeitsbericht, 138 et seq.

[2] See also Abel, ‘Geschichte des Datenschutzrecht‘ in Roßnagel (ed), Handbuch Datenschutzrecht. Die neuen Grundlagen für Wirtschaft und Verwaltung (2003), mn. 13 et seq.

[3] Cf. for instance Hessische Zentrale für Datenverarbeitung, Großer Hessenplan. Entwicklungsprogramm für den Ausbau der Datenverarbeitung in Hessen (1970), 21 et seq.; Hessian Parliament, The Explanatory Memorandum for the 1st Hessian Data Protection Act, Hessian Parliament printed matters 6/3065, 7 et seq.

[4] Cf. for instance Simitis, NJW 1971, 673 (676); Bull, NJW 1979, 1177 (1178).

[5] Cf. for instance Seidel, NJW 1970, 1581 (1583); Kamlah, DÖV 1970, 361 (364); Simitis, NJW 1971, 673 (676); Steinmüller et al., Grundfragen des Datenschutzes (1971), 34, 71 et seq.

[6] Geminn/Roßnagel, JZ 2015, 703; see Bygrave, Scandinavian Studies in Law 56 (2010), 166 (167 et seq.); Bieker, The Right to Data Protection (2022), 178 et seq.

[7] As three examples of many, Westin, Privacy and freedom (1967); Simitis, NJW 1971, 673; Simitis, U. Penn. L. Rev. 135 (1987), 707.

[8] Cf. from the early days, e.g. Bull, Verwaltung durch Maschinen (1964); Grimmer, Informationstechnik in öffentlichen Verwaltungen (1986); Reinermann et al., Neue Informationstechniken – Neue Verwaltungsstrukturen (1988).

[9] Hessian Parliament printed matters, 6/3065, 7.

[10] Cf. e.g. the explanatory memorandum of the government draft of a Federal Data Protection Act, Bundestag printed matters 7/1027, 22; Sasse, ‘Persönlichkeitsrecht und Datenschutzgesetzgebung in Deutschland’ in Triffterer/von Zezschwitz (eds), Festschrift für Walter Mallmann (1978), 213; Vogelgesang, Grundrecht auf informationelle Selbstbestimmung? (1987), 39 et seq., each with further references.

[11] See especially Loschelder, Der Staat 1981, 359 (368); in contrast Mallmann, Zielfunktionen (1977), 24 et seq.; Simitis, ‘„Sensitive Daten“ – Zur Geschichte und Wirkung einer Fiktion‘ in Brem/Druey/Kramer/Schwander (eds), Festschrift zum 65. Geburtstag von Mario M. Pedrazzini (1990), 469 (482 et seq.). Concerning the concept of systematic and individual data protection Bieker, The Right to Data Protection (2022), 180 et seq.

[12] Thus as early as Westin, Privacy and freedom (1967), 324 et seq.; Miller, Der Einbruch in die Privatsphäre (1973), 254 et seq. A position later given a new justification by Posner as part of an economic analysis of the law, see Posner, Economic analysis of law (9th Edition, 2014), 46; Posner, Ga. L. Rev. 12 (1978), 393; Posner, Overcoming Law (1995), 532, and further developed, mainly on the basis of his arguments, cf. e.g. Rule/Hunter, ‘Towards Property Rights in Personal Data’ in Bennett/Grant (eds), Visions of privacy: Policy choices for the digital age (1999), 168; Janger, Hastings Law Journal 54 (2003), 899; and Kilian, CR 2002, 921; Ladeur, DuD 2000, 12.

[13] On the current discussion → mn. 184; for a fundamental criticism cf. Simitis, NJW 1984, 398 (400); Vogelgesang, Grundrecht auf informationelle Selbstbestimmung? (1987), 141 et seq.; Rogall, GA 132 (1985), 11; Trute, ‘Verfassungsrechtliche Grundlagen’ in Roßnagel (ed), Handbuch Datenschutzrecht. Die neuen Grundlagen für Wirtschaft und Verwaltung (2003), mn. 19 et seq.; but also Roßnagel, ZRP 1997, 30; Weichert, ‘Wem gehören die privaten Daten?‘ in Taeger/Wiebe (eds), Informatik – Wirtschaft – Recht. Regulierung in der Wissensgesellschaft. Festschrift für Wolfgang Kilian zum 65. Geburtstag (2004), 281; Buchner, Informationelle Selbstbestimmung im Privatrecht (2006), 207 et seq., 230.

[14] BVerfG 15.12.1983 – 1 BvR 209/83 – BVerfGE 65, 1.

[15] See BGE 140 I 2, mn. 9 et seq.; Erdösová, Public Governance, Administration and Finances L. Rev. 4 (2019), 16 (17 et seq.); Thouvenin, Journal of Intellectual Property, Information Technology and Electronic Commerce Law 12 (2021), 246 each with further references; cf. also Hornung/Schnabel, Computer Law & Security Report 25 (2009), 84.

[16] See Simitis, KritV 83 (2000), 359.

[17] BVerfG 15.12.1983 – 1 BvR 209/83 – BVerfGE 65, 1 (41 et seq.).

[18] See e.g. Steinmüller et al., Grundfragen des Datenschutzes (1971); Podlech, Datenschutz im Bereich der öffentlichen Verwaltung (1973); Podlech, DVR 5 (1976), 23; Benda, ‘Privatsphäre und „Per­sön­lich­keits­pro­fil“. Ein Beitrag zur Datenschutzdiskussion’ in Leibholz/Faller/Mikat/Reis (eds), Menschenwürde und freiheitliche Rechtsordnung (1974), 23; Gallwas, Der Staat 18 (1979), 507; Bull, Ziele und Mittel des Datenschutzes (1981); see also Hornung, Grundrechtsinnovationen (2015), 266 et seq.

[19] BVerfG 15.12.1983 – 1 BvR 209/83 – BVerfGE 65, 1 (43).

[20] BVerfG 15.12.1983 – 1 BvR 209/83 – BVerfGE 65, 1 (42).

[21] BVerfG 15.12.1983 – 1 BvR 209/83 – BVerfGE 65, 1 (43); this term was probably first used in Steinmüller et al., Grundfragen des Datenschutzes (1971), 93; on the origins of the term and the history of innovation with respect to the right to self-determination, see Hornung, Grundrechtsinnovationen (2015), 266 et seq.

[22] BVerfG 15.12.1983 – 1 BvR 209/83 – BVerfGE 65, 1 (44), as well as 46.

[23] BVerfG 15.12.1983 – 1 BvR 209/83 – BVerfGE 65, 1 (44).

[24] BVerfG 15.12.1983 – 1 BvR 209/83 – BVerfGE 65, 1 (44).

[25] BVerfG 15.12.1983 – 1 BvR 209/83 – BVerfGE 65, 1 (46).

[26] BVerfG 15.12.1983 – 1 BvR 209/83 – BVerfGE 65, 1 (45).

[27] BVerfG 15.12.1983 – 1 BvR 209/83 – BVerfGE 65, 1 (45).

[28] BVerfG 15.12.1983 – 1 BvR 209/83 – BVerfGE 65, 1 (46).

[29] See only a little more than six months later BVerfG 17.7.1984 – 2 BvE 11/83 – BVerfGE 67, 100 (143 et seq.); for more on the spread in Germany and (to a limited extent) abroad Hornung, Grund­rechtsinnovationen (2015), 273 et seq. with further references.

[30] On the legislative texts cf. Dammann/Mallmann/Simitis (eds), Die Gesetzgebung zum Datenschutz (1977), 129 et seq.

[31] See especially Dohr, ÖVD 11 (1974), 513; Stadler, ‘Das österreichische Datenschutzrecht‘ in Pawlikowsky (ed), Datenschutz. Leitfaden & Materialien (1979), 46.

[32] See Seipel, ABD och juridik (1975); Flaherty, Protecting Privacy in Surveillance Societies (1989), 93 et seq.

[33] Cf. also Bing, Comparative Law Yearbook 2 (1978), 149 (151 et seq., 167).

[34] See especially Westin, Privacy and Freedom (1967); Miller, Der Einbruch in die Privatsphäre (1973); Flaherty, Protecting Privacy in Surveillance Societies (1989), 305 et seq.; Schwartz/Reidenberg, Data Privacy Law (1996), 6 et seq.; Schwartz, American Journal of Comparative Law 37 (1989), 675; Schwartz, RDV 1989, 153.

[35] As one example of many, Privacy Protection Study Commission, Personal Privacy in an Information Society – The Report of the Privacy Protection Study Commission (1977), 41 et seq.

[36] See Schwartz/Reidenberg, Data Privacy Law (1996), 286 et seq.

[37] See Privacy Protection Study Commission, Personal Privacy in an Information Society – The Report of the Privacy Protection Study Commission (1977), 277 et seq., 393 et seq., 445 et seq.

[38] Privacy Protection Study Commission, Personal Privacy in an Information Society – The Report of the Privacy Protection Study Commission (1977), especially 155 et seq., 223 et seq., 445 et seq., 497 et seq. and Appendix 4: The Privacy Act of 1974: An Assessment; see also Schwartz/Reidenberg, Data Privacy Law (1996), 352 et seq.

[39] See especially Lucas, Le droit de l’informatique (1987), 35 et seq.; Delahaie/Paoletti, Informatique et libertés (1987); Huet/Maisl, Droit de l’informatique et des télécommunications (1989), 133 et seq.; and CNIL, Dix ans d’informatique et libertés (1988).

[40] It had a considerable influence on legislative activity in the following years, e.g. the data protection laws of Iceland and Israel (both 1981), the United Kingdom (1984), Finland (1987), Ireland, Australia and Japan (all 1988), Portugal (1991) and Switzerland (1992).

[41] Cf. Panagiotides, Der Data Protection Act 1984 (1998), 151 et seq., 170 et seq.; Savage/Edwards, A Guide to the Data Protection Act 1984 (1984), 8; Niblett, Data Protection Act 1984 (1984), 8.

[42] Cf. also Flaherty, Protecting Privacy in Surveillance Societies (1989), 395.

[43] See CNIL, Dix ans d’informatique et libertés (1988), p. 70 et seq.; Huet/Maisl, Droit de l’informatique et des télécommunications (1989), 171 et seq.; Lucas, Le droit de l’informatique (1987), 48 et seq.

[44] As one example for many, CNIL, 3e rapport d’activité 1982 (1983), 56 et seq.

[45] See Flaherty, Protecting Privacy in Surveillance Societies (1989), 344 et seq.; Strong, Software Law Journal 2 (1988), 391.

[46] Cf. also Schwartz, American Journal of Comparative Law 37 (1989), 697; Schwartz, RDV 1989, 153 (153 et seq.); Schwartz, Hastings Law Journal 43 (1992), 1322.

[47] See Flaherty, Protecting Privacy in Surveillance Societies (1989), 310 et seq.; Rotenberg, Government Information Quarterly 8 (1991), 79; Schwartz, Hastings Law Journal 43 (1992), 1374.

[48] See e.g. Simitis in Simitis, Introduction, mn. 144 et seq. with further references (especially on the Netherlands, Sweden, Finland and Canada).

[49] See also Flaherty, Protecting Privacy in Surveillance Societies (1989), 380 et seq.; Simitis, U. Penn. L. Rev. 135 (1987), 707 (742).

[50] Italy, end of 1996, Greece, beginning of 1997, cf. Mitrou, RDV 1998, 56.

[51] Cf. Simitis, ‘Vom Markt zur Polis. Die EU-Richtlinie zum Datenschutz‘ in Tinnefeld/Philipps/Heil (eds), Informationsgesellschaft und Rechtskultur in Europa (1995), 51 (53 et seq.).

[52] Cf. Dammann/Simitis in Dammann/Simitis, Introduction, mn. 29, Art. 25, mn. 8 et seq.; Simitis, Collected Courses of the Academy of European Law 1997, Vol. VIII – 1 (2001), 95 (118 et seq.).

[53] Thus e.g. OJ 43 (2000) L 215, 1 (Switzerland), OJ 45 (2002) L 2, 13 (Canada), OJ 46 (2003) L 168, 19 (Argentina), OJ 54 (2011) L 27, 39 (Israel), OJ 55 (2012) L 227, 11 (Uruguay), OJ 56 (2013) L 28 (New Zealand).

[54] Case C-39/72, 7.2.1973, Commission v Italy, ECLI:EU:C:1973:13, mn. 16 et seq.; Case C-34/73, 10.10.1973, Fratelli Variola S.p.A. v Amministrazione italiana delle Finanze, ECLI:EU:C:1973:101, mn. 11; see Benecke/Wagner, DVBl. 2016, 600 (604 et seq.); on exceptions → mn. 185.

[55] For more details on the following Simitis in Simitis, Introduction, mn. 151 et seq.

[56] Council of Europe/Committee of Ministers, Resolution (73) 22 (1973).

[57] Council of Europe/Committee of Ministers, Resolution (74) 29 (1974).

[58] On the background, cf. especially Hondius, Emerging data protection in Europe (1975), 63 et seq.; Hondius, Netherlands International L. Rev. 30 (1983), 112; Henke, Die Datenschutzkonvention des Europarates (1986), 44 et seq.

[59] Council of Europe, Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data, European Treaty Series No. 108 (1981).

[60] Council of Europe, Explanatory Report on the Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data (1981), No. 14.

[61] Cf. Schweizer, ‘Die Konvention des Europarates und die Richtlinien der OECD zum internationalen Datenschutz’ in Universités de Berne et al. (eds), Informatique et protection de la personnalité (1981), 255 (278); Schweizer, DuD 1989, 542 (543); but also Garzon Clariana, Revista de Instituciones Europeas (1981), 18.

[62] Council of Europe, Explanatory Report on the Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data, No. 38; cf. also Hondius, Netherlands International L. Rev. 30 (1983), 116; Henke, Die Datenschutzkonvention des Europarates (1986), 60 et seq.; Simitis, RDV 1990, 3 (10); Ellger, Der Datenschutz im grenzüberschreitenden Datenverkehr (1990), 463.

[63] Council of Europe, Explanatory Report on the Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data, No. 1.

[64] See the list at https://www.coe.int/en/web/conventions/full-list/-/conventions/treaty/108/declarations.eng_fn

[65] See Simitis in Simitis, Introduction, mn. 156 with further references (e.g., Austria, Luxemburg and Switzerland).

[66] See Hondius, Netherlands International L. Rev. 30 (1983), 116; Wochner, Persönlichkeitsschutz im grenzüberschreitenden Datenverkehr (1981), 250; Bergmann, Grenzüberschreitender Datenschutz (1985), 155.

[67] See especially Simitis, ‘„Sensitive Daten“ – Zur Geschichte und Wirkung einer Fiktion’ in Brem/Druey/Kramer/Schwander (eds), Festschrift zum 65. Geburtstag von Mario M. Pedrazzini (1990), 469. Thus the Convention corresponded both with a request made by the Council of Ministers and with the majority of data protection laws previously adopted (Sweden, France, Norway, Luxemburg, Denmark), see Simitis in Simitis, Introduction, mn. 159.

[68] See especially Bing, ‘Classification of Personal Information with respect to the Sensitivity Aspect’ in Selmer (ed), Data Banks and Society (1972), 98 et seq., Bing, ‘“Personal Data System” – A Comparative Perspective on a Basic Concept in Privacy Legislation’ in Bing/Selmer (eds), A Decade of Computers and Law (1980), 72 (78 et seq.).

[69] BVerfG 15.12.1983 – 1 BvR 209/83 – BVerfGE 65, 1 (45); Simitis, DVR 1973, 143; Simitis, ‘„Sensitive Daten“ – Zur Geschichte und Wirkung einer Fiktion’ in Brem/Druey/Kramer/Schwander (eds), Festschrift zum 65. Geburtstag von Mario M. Pedrazzini (1990), 469; → mn. 58.

[70] Council of Europe, Explanatory Report on the Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data, No. 48.

[71] Council of Europe, Additional Protocol of 8.11.2001 to Convention No. 108 of 1981, European Treaty Series No. 181, entered into force on 1.7.2004.

[72] Cf. also Council of Europe, Explanatory Report on the Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data, No. 63.

[73] Council of Europe, Explanatory Report on the Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data, No. 67.

[74] Cf. also Council of Europe, Explanatory Report on the Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data, No. 69; Henke, Die Datenschutzkonvention des Europarates (1986), 171 et seq.

[75] See also Henke, Die Datenschutzkonvention des Europarates (1986), 172 et seq.; Ellger, Der Datenschutz im grenzüberschreitenden Datenverkehr (1990), 477 et seq.

[76] For criticism cf. also Henke, Die Datenschutzkonvention des Europarates (1986), 166; Bergmann, Grenzüberschreitender Datenschutz (1985), 177; Ellger, Der Datenschutz im grenzüberschreitenden Datenverkehr (1990), 472.

[77] See also Henke, Die Datenschutzkonvention des Europarates (1986), 197 et seq.; Schweizer, DuD 1989, 542 (544).

[78] See the resolution of the Ministers of Justice on data protection and privacy in the third millennium, 30th Council of Europe Conference of Ministers of Justice (Istanbul, Turkey, 24.–26.11.2010), Resolution No. 3 on data protection and privacy in the third millennium, MJU-30 (2010) RESOL. 3 E.

[79] Council of Europe/The Consultative Committee of the Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data (ETS No. 108), Proposition of a Modernisation, DG I – Human Rights and Rule of Law, T-PD_2012_04_rev4_E.

[80] See for an evaluation e.g. Lauer, Informationshilfe im Rahmen der polizeilichen und justiziellen Zusammenarbeit in Strafsachen (2017), 429.

[81] As of 30.9.2022, there are 19 ratifications and a further 25 signatures without ratification, see https://www.coe.int/en/web/conventions/full-list?module=signatures-by-treaty&treatynum=223.

[82] Council of Europe/Committee of Ministers, On Regulations for Automated Medical Data Banks, Recommendation, No. R(81)1 of 23.1.1981; later replaced by Recommendation No. (97)5 of 13.2.1997.

[83] The recommendations can be found at https://www.coe.int/en/web/data-protection/committee-of-ministers.

[84] See Simitis, CR 1991, 161 (163 et seq.).

[85] See Simitis, CR 1991, 161 (167 et seq.).

[86] Cf. Council of Europe, Rules of procedure for the meetings of the Ministers’ Deputies, 4th rev. ed. 2005 Strasbourg, Art. 9 No. 1 in conjunction with 10 No. 2 c.

[87] This applies to Albania, Andorra, Armenia, Azerbaijan, Bosnia and Herzegovina, Georgia, Monaco, Montenegro, Moldova, Russia, San Marino, Serbia, Macedonia, Turkey, and Ukraine.

[88] So far these are Mauritius, Senegal, Tunisia, and Uruguay.

[89] On the legal implications for data transfers, see Wittershagen, The Transfer of Personal Data from the European Union to the United Kingdom post-Brexit (2023).

[90] The first approaches were drawn up as early as 1974; see OECD, Policy Issues in Data Protection and Privacy, Proceedings of the OECD-Seminar 24th to 26th June, 1974, OECD Informatics Studies 10 (1976); for more details Hondius, Emerging Data Protection in Europe (1975), 57 et seq.; Gassmann, ‘The Activities of the OECD in the Field of Transnational Data Regulation’ in Online Conferences Ltd. (ed), Data Regulation – European and Third World Realities (1978), 177; Kirby, Stanford Journal of International Studies 1980, 19; for more details on the historical development, Simitis in Simitis, Introduction, mn. 184 et seq.

[91] See Bing, Michigan Yearbook of International Legal Studies 5 (1984), 271 (282 et seq.).

[92] Gassmann, ‘The Activities of the OECD in the Field of Transnational Data Regulation’ in Online Conferences Ltd. (ed), Data Regulation – European and Third World Realities (1978), 177; as well as Burkert, ‘Internationale Grundlagen’ in Roßnagel (ed), Handbuch Datenschutzrecht, mn. 23. This objective reoccurs in many OECD regulatory proposals and is also adapted to the specific context and the specific approach.

[93] See OECD, Guidelines governing the Protection of Privacy and Transborder Flows of Personal Data, Document C (80) 58 (Final) of 1.10.1980. Cf. also OECD, Recommendation on Cross-border Cooperation in the Enforcement of Laws Protecting Privacy (2007).

[94] See also Schweizer, ‘Die Konvention des Europarates und die Richtlinien der OECD zum internationalen Datenschutz’ in Universités de Berne et al. (eds), Informatique et protection de la personnalité (1981), 255 (274); Patrick, Jurimetrics Journal 21 (1981), 407.

[95] OECD, Recommendation of the Council concerning Guidelines governing the Protection of Privacy and Transborder Flows of Personal Data, C(80)58/FINAL, as amended on 11.6.2013 by C(2013)79.

[96] Council of Europe, Explanatory Report on the Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data, No. 39.

[97] OECD, Guidelines for Cryptography Policy (1997); but see also OECD, Recommendation concerning Guidelines for the Security of Information Systems, OECD Document C (92) 188 (Final) of 26.1.1992; and OECD, Working Party on Information Security and Privacy, Radio Frequency Identification (RFID): A Focus on Information and Privacy, DSTI/ICCP/REG (2007) 9/Final.

[98] OECD, Transborder Data Flow Contracts in the wider framework of mechanisms for privacy protection on global networks, DSTI/ICCP/REG(99)15/Final of 21.9.2000.

[99] See especially Hondius, Emerging Data Protection in Europe (1975), 59 et seq.; see also Simitis in Simitis, Introduction, mn. 192 et seq. with further references.

[100] UN, Resolution 2450 (XXIII) adopted by the General Assembly: Human Rights and Scientific Technological Developments, 1748th Plenary Meeting, 19.12.1968, General Assembly, Twenty-Third Session, 54.

[101] On the historical details, see Simitis in Simitis, Introduction, mn. 193 et seq. with further references; see especially UN, Economic and Social Council, Human Rights and Scientific and Technological Developments, Uses of electronics which may affect the rights of the person and the limits which should be placed on such uses in a democratic society, Report of the Secretary General, 31.1.1974, UN-Doc. E/CN.4/1192.

[102] UN, Economic and Social Council, Commission on Human Rights, Sub-Commission on Prevention of Discrimination and Protection of Minorities, Thirty-Eighth Session, Item 10 of the Provisional Agenda: Human Rights and Scientific and Technology Developments, Draft Guidelines for the Regulation of Computerized Personal Data Files. Report Submitted by Mr. Louis Joinet, 23.6.1985, UN-Doc. E/CN.4/Sub.2/1985/21.

[103] UN, Economic and Social Council, Commission on Human Rights, Sub-Commission on Prevention of Discrimination and Protection of Minorities, Fortieth Session, Item 11 of the Provisional Agenda: Human Rights and Scientific and Technology Developments, Guidelines for the Regulation of Computerized Personal Data Files. Final Report Submitted by Mr. Louis Joinet, Special Rapporteur, 21.7.1988, UN-Doc. E/CN.4/Sub.2/1988/22.

[104] UN, Guidelines on the Use of Computerized Personal Data Flow, Resolution 44/132, 14.12.1990, UN-Doc. E/CN.4/Sub.2/1988/22.

[105] UN, Guidelines on the Use of Computerized Personal Data Flow, Resolution 44/132, 14.12.1990, UN-Doc. E/CN.4/Sub.2/1988/22, Part B, mn. 3.

[106] This was expressly pointed out by the rapporteur Joinet, inter alia in a lecture to the 12th International Conference of Data Protection Officers, published in CNIL, 11e rapport d’activité 1990 (1991), 77.

[107] UN Resolutions 68/167 and 69/166; on the activities of the UN see Hullmann/Masloch/Niemann/Özbek, VN 2015, 125.

[108] Human Rights Council Resolution 28/16; for more details http://www.ohchr.org/EN/Issues/Privacy/SR/Pages/SRPrivacyIndex.aspx.

[109] See https://www.ohchr.org/Documents/Issues/Privacy/SR_Privacy/2018AnnualReportAppendix7.pdf.

[110] https://gpil.jura.uni-bonn.de/2018/06/no-need-legal-instrument-electronic-surveillance-privacy/; https://www.ip-watch.org/2018/03/07/un-rapporteur-privacy-rebuffed-surveillance-oversight-negotiations/.

[111] ILO, Protection of worker’s personal data. An ILO Code of practice (1997).
