ChatGPT’s ‘hallucination’ issue hit with privacy complaint

DATE POSTED: April 29, 2024

OpenAI has been hit with another complaint, after advocacy group NOYB accused it of failing to correct inaccurate information disseminated by its AI chatbot ChatGPT, potentially violating EU privacy regulations.

According to Reuters, NOYB said the complainant in its case, a public figure, asked ChatGPT about his birthday and repeatedly received incorrect information, rather than being told that the chatbot lacked the necessary data.

The group also stated that the Microsoft-backed firm denied the complainant’s requests to correct or delete the data, claiming that data correction was not possible, and failed to provide any details regarding the data processed, its sources, or its recipients.

NOYB said it had filed a complaint with the Austrian data protection authority, urging an inquiry into OpenAI’s data processing practices and the measures taken to ensure the accuracy of personal data handled by the company’s large language models.

Maartje de Graaf, NOYB data protection lawyer, said in a statement: “It’s clear that companies are currently unable to make chatbots like ChatGPT comply with EU law, when processing data about individuals.

“If a system cannot produce accurate and transparent results, it cannot be used to generate data about individuals. The technology has to follow the legal requirements, not the other way around,” she said.

In the past, OpenAI has acknowledged that ChatGPT “sometimes writes plausible-sounding but incorrect or nonsensical answers.” However, it has said it is attempting to fix this “challenging” issue.

How ‘hallucinating’ chatbots could fall foul of GDPR rules

Some of the first widely reported instances of chatbot “hallucination” emerged in April 2023. The phenomenon occurs when a chatbot confidently generates information that is false or fabricated. It also puts the technology on a potential collision course with the EU’s General Data Protection Regulation (GDPR), which governs the processing of personal data for users in the region.

For particularly severe violations, companies can be fined up to 20 million euros or up to 4 per cent of their total global turnover from the preceding fiscal year, whichever is higher. Data protection authorities also have the power to order changes to how information is processed, meaning that GDPR enforcement could reshape how generative AI operates within the EU.
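To make the “whichever is higher” rule concrete, here is a minimal sketch of the fine ceiling in Python; the turnover figure is hypothetical and purely illustrative, not drawn from the complaint.

# Illustrative sketch of the GDPR fine ceiling described above:
# the cap is the greater of EUR 20 million and 4% of global annual
# turnover from the preceding fiscal year. The turnover below is hypothetical.
def gdpr_fine_ceiling(global_annual_turnover_eur: float) -> float:
    flat_cap = 20_000_000                               # EUR 20 million
    turnover_cap = 0.04 * global_annual_turnover_eur    # 4% of turnover
    return max(flat_cap, turnover_cap)

print(gdpr_fine_ceiling(2_000_000_000))  # hypothetical EUR 2bn turnover -> 80,000,000.0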

In January, an Italian regulator also accused OpenAI’s ChatGPT of breaching privacy rules, in a follow-up to a probe last year that included a brief ban on the application.

Featured image: Canva
