Data Protection: An Overview of General Data Protection Regulation Developments in June
In June we saw several developments that could notably change the data protection landscape in Europe. European institutions voiced concerns over the adequacy of data transfers to the UK in the post-Brexit era. Meanwhile, the EDPB introduced a new register for decisions of a cross-border nature. Lastly, Google lost an appeal in a significant case before France's highest administrative court, and tech giants withdrew from offering facial recognition software to law enforcement.
EU-UK adequacy decision potentially at risk due to the UK-US data access agreement
In order to ensure safe and efficient cross-border data transfers from the European Union to the United Kingdom in the post-Brexit era, safeguards, such as a data adequacy decision, must be put in place. However, the UK might lose the possibility of obtaining a data adequacy decision from the EU because of its data access agreement with the United States. That agreement, concluded in October 2019 to facilitate access to electronic evidence in criminal investigations, has in the context of Brexit raised the European Commission's concerns that personal data of persons in the EU could be sent on to the United States.
On 15 June 2020, the European Data Protection Board (EDPB) informed the European Parliament of its concerns that the UK-US data access agreement may not fully comply with EU law. EDPB Chair Andrea Jelinek stated that "the EDPB has doubts as to whether the safeguards in the agreement would apply in case of disclosure obligations applicable to providers of electronic communication service or remote computing service under the jurisdiction of the United States, regardless of whether the data requested is located within or outside of the United States."
The outcome of this matter is of significant importance, as the free flow of data between the EU and the UK enables data-related service exports, estimated to be worth over £100 billion in 2018, as well as the movement of other goods and services that nevertheless depend on transfers of personal data. Moreover, a hindered data flow with the EU would significantly limit the ability of UK businesses to participate in the EU market at all. Considering the financial stakes involved, a data adequacy decision is badly needed by both the EU and the UK.
The European Commission will need to take these aspects into consideration when assessing the adequacy of the UK's legal environment for data protection; however, it is not yet known to what extent the US-UK arrangement will be taken into account.
EDPB introduces a new register for publishing decisions taken by national supervisory authorities
Pursuant to Article 60 of the General Data Protection Regulation, which envisages the so-called One-Stop-Shop procedure, the EDPB has introduced a new register. The register contains the decisions taken by national supervisory authorities as well as summaries of those decisions in English prepared by the EDPB Secretariat.
The aim of the register is to ensure consistent application of the GDPR in cases with a cross-border component: where the controller or processor is established in more than one Member State, or where the processing in question takes place in a single Member State of establishment but substantially affects, or is likely to substantially affect, individuals in more than one Member State.
The register will be useful not only for implementing the One-Stop-Shop arrangement, but also for those interested in learning how the supervisory authorities cooperate in such cases.
Google loses appeal in the French highest administrative court
Google has lost an appeal in a case against France's data protection authority, the CNIL, before the country's highest administrative court, the Conseil d'État (Council of State), and must now pay a penalty of 50 million euros. The fine was imposed mainly for two GDPR violations. The first was a lack of transparency in the process of obtaining consent: failing to provide individuals with sufficient information when they give consent. This breach covered both the provision of insufficient information and the lack of clarity and accessibility of the information provided. The second violation was Google's lack of a legal basis for data processing carried out for advertising purposes (ad personalization).
The story began with two non-governmental organisations – None of Your Business (NOYB, led by data protection activist Max Schrems) and La Quadrature du Net (LQDN) – filing complaints about Google's data processing activities. In January 2019, the complaints resulted in a 50 million euro fine – the largest fine for a GDPR violation at the time it was imposed.
As expected, Google disagreed with the fine and appealed, arguing that the French data protection authority did not have jurisdiction over Google in Europe and that the Irish Data Protection Commission (Ireland being the Member State where Google's European headquarters are located) was the only authority that could handle such matters. Google further disputed that the French authority's findings on its breaches were well grounded, and considered the large fine disproportionate and calculated without regard to the criteria set out in Article 83 of the GDPR. The court replied that the law did not require the CNIL to explain the amounts imposed or the criteria applied.
Consequently, the French highest court dismissed Google's appeal. It also dismissed Google's requests to refer the matter to the Court of Justice of the European Union for a preliminary ruling, in particular on the validity criteria for consent and the jurisdiction of Member State data protection authorities.
Tech giants refuse to offer police their facial recognition technology
In June, Amazon announced a one-year moratorium on the use of its facial recognition software by the police. Amazon stated that governments should implement stronger regulations to govern the ethical use of facial recognition technology, and that the company hopes progress will be made towards such regulations within the year.
The announcement followed IBM's statement that it will no longer participate in the development or marketing of facial recognition technology at all, and its call for a national dialogue on the use of facial recognition by law enforcement.
Furthermore, Microsoft has joined IBM and Amazon, announcing that it will not sell facial recognition software to law enforcement until a federal law regulating the technology is enacted.
These decisions were sparked by the growing pressure placed on the industry by the George Floyd protests against police brutality and racial profiling.
While today's facial recognition technology has improved significantly since its early days, it still has many significant flaws. Studies of facial recognition accuracy and reliability – in particular research by MIT Media Lab researcher Joy Buolamwini and Microsoft researcher Timnit Gebru – show that error rates in the big tech companies' facial recognition software are far higher for dark-skinned individuals than for their light-skinned counterparts. For example, one of their studies showed that the systems recognized light-skinned people almost without error, but in a substantial number of cases misidentified the gender of dark-skinned people.
One of the reasons for such discrepancies is the wide use of datasets dominated by white and male faces when training the software to recognize facial patterns.
Matiss Liepins is Compliance Officer at Erremme Business Advisors Ltd and may be contacted on firstname.lastname@example.org