OpenAI has introduced a new setting that lets users turn off their chat history on ChatGPT, a move that can limit the data the company accesses to train its AI models. “Conversations that are started when chat history is disabled won’t be used to train and improve our models, and won’t appear in the history sidebar. These controls, which are rolling out to all users starting today, can be found in ChatGPT’s settings and can be changed at any time,” the company stated in a blog post.

The company also said that when chat history is disabled, new conversations will be retained for 30 days and reviewed only when needed to monitor for abuse, before being permanently deleted. Currently, users must fill out an opt-out form if they do not want their data used to improve the model’s performance.

OpenAI has also added a new 'Export' option that allows users to export their chat data and “understand what information ChatGPT stores”. Users will receive a file with their conversations by email.

Why it matters: OpenAI’s new privacy settings add to user choice and provide an easy way to opt out of the AI training process. The fact that the company will review such chats for abuse, without using them for training when the user does not consent, indicates a move to allay growing criticism that ChatGPT violates user privacy. Following ChatGPT’s release, the unlicensed use of copyrighted works also…
