
OpenAI Facing Privacy Complaint in Austria Over ChatGPT-Generated Misinformation

TL;DR: NOYB has filed a privacy complaint against OpenAI in Austria over ChatGPT’s inability to correct misinformation it generates about individuals. The complaint cites transparency concerns and potential GDPR violations, which carry fines of up to 4% of a company’s global annual turnover.

Non-profit privacy rights organisation NOYB has asked the Austrian regulator to examine ChatGPT’s data processing and to fine OpenAI over the misinformation.

ChatGPT maker OpenAI is in legal trouble in the European Union again, this time in Austria, over the generative AI tool’s inability to correct misinformation it generates about individuals, according to a TechCrunch report.

It’s a big problem.

An unnamed public figure sought the help of the non-profit privacy rights organisation NOYB (None of Your Business) after ChatGPT repeatedly generated an incorrect birth date in response to queries about them. NOYB has raised the issue with the Austrian data protection authority under the EU’s General Data Protection Regulation (GDPR), which gives people rights over information about themselves, including the right to have incorrect personal data corrected.

The complaint, filed by the organisation on behalf of the public figure, alleges that the AI company refused to correct the wrong birth date, stating it was technically impossible to do so, and instead offered to filter or block responses to prompts mentioning the public figure’s name.

Earlier, in 2023, ChatGPT was temporarily banned by the Italian data protection authority over a lack of transparency and data breach concerns; the regulator wanted the company to reveal how it uses users’ data to generate AI responses. In March this year, Italy again asked OpenAI to explain how user data is used to train its Sora model.

What Does This Mean for OpenAI?

The complaint also raises transparency concerns, with NOYB contending that OpenAI cannot say where the data it generates about people comes from or what personal data ChatGPT stores about them. Under the GDPR, individuals have the right to request such information via a subject access request (SAR), and NOYB claims OpenAI failed to adequately respond to the complainant’s SAR.

Speaking about the matter, Maartje de Graaf, a data protection lawyer at NOYB, said: “If a system cannot produce accurate and transparent results, it cannot be used to generate data about individuals. The technology has to follow the legal requirements, not the other way around.”

Reuters also quoted the lawyer as saying: “It’s clear that companies are currently unable to make chatbots like ChatGPT comply with EU law when processing data about individuals.”

The privacy rights organisation has asked the Austrian data protection authority to investigate ChatGPT’s data processing practices and bring them into compliance with EU law. It has also suggested imposing a fine for non-compliance. At present, GDPR penalties can reach up to 4% of a company’s global annual turnover.

Vik
