ChatGPT: the Italian Data Protection Authority leads the way and imposes GDPR compliance.

28.11.2023

The Italian Data Protection Authority recently gained international attention for being the first to address the privacy risks of generative artificial intelligence, ordering OpenAI to temporarily restrict the processing of Italian users' data due to the violation of several GDPR rules. Following the implementation of several measures to improve the system's compliance with the GDPR, ChatGPT is back online in Italy.

The privacy risks of Generative Artificial Intelligence

The rapid spread of Generative Artificial Intelligence (“GAI”) is giving rise to a heated debate on its privacy implications, particularly with regard to the most popular of these systems: ChatGPT, developed by the US company OpenAI.

GAI, the technology upon which ChatGPT is based, is capable of creating content and simulating human conversations by training its algorithms on large datasets. Its compatibility with data protection laws is therefore in doubt, particularly as regards the adequate handling of the data it collects.


The intervention of the Italian Data Protection Authority

On March 30th, 2023, the Italian Data Protection Authority (“Italian DPA”) ordered the temporary limitation of the processing of Italian users' data by OpenAI, following an inquiry that revealed ChatGPT's non-compliance with the GDPR. This measure led OpenAI to deactivate its service throughout the Italian territory.

The main breaches identified by the Italian DPA relate, in particular, to: (i) the absence of a privacy notice for users and the other data subjects whose data is collected; (ii) the lack of an adequate legal basis to justify the collection and storage of the data used to train the chatbot; (iii) inaccurate data processing, as the information provided by ChatGPT is not always factually accurate; and (iv) the lack of an age verification mechanism.

The measures proposed by OpenAI

After several exchanges between OpenAI and the Italian DPA, the company implemented a set of measures aimed at complying with the Authority's directives and ensuring the privacy of ChatGPT users. More specifically, OpenAI: (i) identified the performance of a contract as the legal basis for processing the data necessary for the functioning of the service, and legitimate interest as the legal basis for algorithm training purposes; (ii) published a privacy notice on its website to inform data subjects of the categories of data collected, the methods of data processing for the chatbot's training, and their right to object to such processing. Additionally, OpenAI introduced a form allowing data subjects to exclude their conversations and the related history from being used in algorithm training; (iii) implemented mechanisms enabling data subjects to request the deletion of information they deem incorrect; and (iv) introduced a mandatory date of birth field on the service's registration page, blocking registration for users under the age of 13.

The Italian DPA has accepted the measures implemented by OpenAI, recognizing the challenge of reconciling technological advancement with respect for individuals' rights, and has consequently suspended the temporary limitation. However, the substantive proceeding is still pending to ascertain whether, and to what extent, OpenAI has violated the privacy rights of individuals residing in Italy.


Conclusions

The issue of the privacy risks of GAI is more relevant than ever. Following the ChatGPT case, the EDPB has established a task force to define a common set of principles that DPAs should adhere to when handling such cases. For its part, the Italian DPA has expressed its intention to continue monitoring GAI applications available online, emphasizing however its commitment to promoting technological innovation without hindering its development.


Article provided by INPLP member: Chiara Agostini (RP Legal & Tax, Italy)


Discover more about the INPLP and the INPLP-Members

Dr. Tobias Höllwarth (Managing Director INPLP)

What is the INPLP?

INPLP is a not-for-profit international network of qualified professionals providing expert counsel on legal and compliance issues relating to data privacy and associated matters. INPLP provides targeted and concise guidance, multi-jurisdictional views and practical information to address the ever-increasing and intensifying field of data protection challenges. INPLP fulfils its mission by sharing know-how, conducting joint research into data processing practices and engaging proactively in international cooperation in both the private and public sectors.