The Italian Garante della privacy, in a controversial decision, ruled that ChatGPT should be blocked in Italy.
The system developed by the startup OpenAI has not, at least for the time being, passed the scrutiny of the authority that protects personal data in Italy.
The temporary halt follows an incident on 20 March, when the artificial intelligence system suffered a massive breach exposing data on users’ conversations and, above all, the payment information of subscribers to the paid service. Despite the seriousness of the problem, OpenAI did not inform users of what had happened, let alone of how their personal data was being handled.
Hence the decision of the Italian authority to proceed against the American company.
In essence, the issue raised by the Garante, beyond the data breach mentioned above, concerns the absence of ‘a legal basis that justifies the massive collection and storage of personal data for the purpose of “training” the algorithms underlying the operation of the platform’.
The Garante also raised a further worrying issue: the service’s handling of users under 13. ChatGPT has no form of age control or verification, so its answers are delivered unfiltered and are often unsuitable for such a young audience.
The Italian Garante is the first supervisory authority in the world to block the use of ChatGPT on the basis of privacy legislation, legislation that, it should be emphasised, follows the European General Data Protection Regulation.
In light of these issues, the Italian GDPR action underlines the need to establish whether technological experimentation requires specific privacy rules, especially given that artificial intelligence uses input data to ‘learn’, to answer users’ questions and to improve its algorithms.
All that remains is to wait for a formal reply from OpenAI’s European representatives, at least on the issues raised.
But it is certain that the world of artificial intelligence is shaping up to be a new, difficult chapter in the protection of users and minors.