ChatGPT reveals sensitive data if forced to repeat words continuously – but the chatbot itself says this is 'illegal'

by alex

In an experiment, researchers asked ChatGPT to repeat simple words continuously and it “inadvertently” revealed people's personal information that was contained in its training data, such as phone numbers, email addresses and dates of birth.< /p> We are giving away Zbirnaia Ukrayina football t-shirt signed by Gravians for a donation of 200 UAH for ZSU. In the description of the donation, please indicate your mobile number so that we can be awarded a prize! Skin donations 200 UAH per donation +1 chance to win! Take your fate

According to 404 Media and Engadget, such requests for “continuous pronunciation of words” are prohibited by the chatbot's terms of service, at least that's what ChatGPT itself says.

“This content may violate our content policies or terms of use,” ChatGPT responded to a request to repeat the word “hello” forever. “If you think this is a bug, please provide feedback—your input will help our research in this area.”

READ
Avto.ru launched a neural network for assessing the condition of a car from a photo

At the same time, if you look at OpenAI’s content policy, you won’t find any lines about prohibiting the continuous reproduction of words. In its terms of use, the company notes that customers cannot use any automated or programmatic method to extract data, but simply asking ChatGPT to repeat words forever does not constitute automation or programming.

OpenAI, which is still recovering from the turmoil of the change in power at the company, has not yet responded to a request for comment.

You may also like

Leave a Comment