Data Protection: An Overview of General Data Protection Regulation Developments in March

The month of March surprised the world with an unusual team-up of tech giants and the Pope in a call for the ethical development of Artificial Intelligence, while Google experienced ups and downs in right-to-be-forgotten cases in Europe. Furthermore, the EDPB Chair issued a statement clarifying data protection issues in times of COVID-19.

Vatican joins IBM, Microsoft in a call for ethical application of facial recognition

The Vatican has taken its place in the world of data and technology by promoting, alongside Microsoft and IBM, the ethical use of Artificial Intelligence (AI), facial recognition in particular, and by calling for stronger regulations that would ensure AI respects privacy, works reliably and without bias, considers human rights and operates transparently.

A pledge titled "The Rome Call for AI Ethics" was issued and signed by Microsoft president Brad Smith, IBM executive vice-president John Kelly and the president of the Pontifical Academy for Life, Archbishop Vincenzo Paglia. The document sets out a vision of the requirements for AI as an emerging technology and puts humans at the centre of new technologies, asking for AI to be designed with a focus on the good of the environment and "our common and shared home and of its human inhabitants".

The document strives to establish a duty for AI to protect the rights of all humankind, particularly the weak and underprivileged, and calls for the adoption of regulations promoting transparency and compliance with ethical principles, especially for advanced technologies that carry a higher risk of impacting human rights, such as facial recognition. The pledge calls for a "duty of explanation", implying an obligation to make not only the decision-making criteria of AI-based algorithmic agents understandable, but also their purpose and objectives.

The document coins the term "algor-ethics", denoting the use of AI guided by the principles of transparency, inclusion (of everyone's interests), responsibility (of developers), impartiality and reliability, as well as security and privacy.

French court cancels the 100,000 euro fine imposed on Google in a right-to-be-forgotten case

The French administrative supreme court, the Conseil d'État, has annulled a fine of 100,000 euros imposed on Google in 2016 for failure to delete search results worldwide. The court found that the French regulator may demand the removal of search results only within Europe and that its orders do not have global reach.

The background is a dispute between the French data protection regulator, the Commission nationale de l'informatique et des libertés, and Google, in which the regulator demanded in 2015 that the tech giant remove search result listings pointing to pages that showed undesirable or false information about an individual. Google subsequently implemented a geo-blocking arrangement that made specific delisted links, which might harm the reputation of data subjects, unavailable to persons using the search engine in Europe.

Following the ruling, the French data protection regulator said that it will update its policies accordingly.

Swedish data protection authority imposes a 7 million euro fine on Google

While Google may have seen success in one country this month, it has stumbled in another. The Swedish Data Protection Authority has issued a fine of approximately 7 million euros (75 million Swedish kronor) to Google LLC for failure to comply with its obligations regarding the right to request delisting.

The dispute began in 2017, when the data protection authority concluded an audit of how Google handles individuals' right to have search result listings for searches that include their name removed from Google's search engine. The authority concluded that certain links must be removed and ordered Google to do so. It later found that Google had not fully complied with the order and therefore issued the fine. The focus was on two of the search result listings that had to be removed. In one instance, Google interpreted too narrowly which web addresses required removal from the search result listing; in the other, the removal was carried out with undue delay.

Furthermore, the data protection authority stated that when Google removes a search result listing, it notifies the website to which the link directs in a way that lets the site owner know which webpage link was removed and who requested the delisting. This allows the website owner to re-publish the respective webpage at another web address that would remain available in Google search, even though Google lacks a legal basis for providing such information. This practice undermines the purpose of delisting, as it might deter data subjects from lodging delisting requests.

Google has announced that it will appeal the fine.

EDPB Chair issues COVID-19 Statement on the processing of personal data

On 16 March 2020, the Chair of the European Data Protection Board (EDPB), Andrea Jelinek, issued a statement regarding data protection in the time of COVID-19. In the statement she said that "data protection rules do not hinder measures taken in the fight against the coronavirus pandemic, however, even in these exceptional circumstances, the data controller must ensure the protection of the personal data of the data subjects. Therefore, a number of considerations should be taken into account to guarantee the lawful processing of personal data".

The statement emphasized that the GDPR, as a broadly framed regulation, also provides rules that apply to situations such as the COVID-19 outbreak. The GDPR offers legal grounds that allow employers and competent public health authorities to process personal data in the context of epidemics without obtaining the consent of data subjects. This applies, in particular, where the employer must process personal data for reasons of public interest in the area of public health, to protect vital interests (Articles 6 and 9 of the GDPR) or to fulfil other legal obligations.

The statement also addresses the processing of electronic communication data, in particular mobile location data, to which additional rules may apply. The ePrivacy Directive and the national laws implementing it provide that location data may only be used by the operator when the data is anonymised or the data subject has given consent. Furthermore, public authorities wishing to process location data should first seek to process it in an anonymous manner, aggregating the data so that individuals cannot be identified. This approach could, for example, be used to generate reports on the concentration of mobile devices in specific locations. The EDPB Chair further stated that where processing cannot be limited to anonymised data, Article 15 of the ePrivacy Directive allows Member States to introduce legislative measures addressing national and public security as an emergency measure. Such emergency measures must satisfy the principles of necessity, appropriateness and proportionality within a democratic society, and must be accompanied by adequate safeguards, such as granting individuals the right to a judicial remedy.
