- A federal judge rejected a ChatGPT user's petition against her order that OpenAI preserve all ChatGPT chats
- The order followed a request from The New York Times as part of its lawsuit against OpenAI and Microsoft
- OpenAI plans to continue arguing against the order
OpenAI is holding on to all your conversations with ChatGPT, including those you thought you'd deleted, and may end up sharing them with a lot of lawyers. That's the result of an order from the federal judge overseeing a copyright infringement lawsuit brought against OpenAI by The New York Times. Judge Ona Wang upheld her previous order to preserve all ChatGPT conversations as evidence after rejecting a petition from ChatGPT user Aidan Hunt, one of several filed by ChatGPT users asking her to rescind the order over privacy and other concerns.
Judge Wang told OpenAI to "indefinitely" preserve ChatGPT's output after the Times pointed out that doing so would be a way to tell whether the chatbot has illegally reproduced articles without paying the original publishers. But finding those examples means holding on to every intimate, awkward, or otherwise private exchange anyone has had with the chatbot. Although what users write is not part of the order, it's not hard to imagine working out who was talking to ChatGPT about which personal topic based on what the AI wrote back. In fact, the more personal the discussion, the easier it would be to identify the user.
Hunt pointed out that he had no warning this could happen until he saw a report about the order in an online forum, and he is now concerned that his conversations with ChatGPT could be disclosed, including "highly sensitive personal and commercial information." He asked the judge to vacate the order or modify it to exclude especially private content, such as conversations conducted in private mode or those involving medical or legal matters.
According to Hunt, this amounts to the judge overstepping her authority by, in effect, "[introducing] a nationwide mass surveillance program using a discovery order in a civil case."
Judge Wang rejected his request because it raised issues unrelated to the copyright case at hand. She emphasized that the order is about preservation, not disclosure, and that it is hardly unique or uncommon for courts to tell a private company to hold on to certain records for litigation. That is technically correct, but understandably, an everyday ChatGPT user may not see it that way.
She also seemed to particularly dislike the mass surveillance claim, quoting that part of Hunt's petition and hitting it with the legal-language equivalent of a diss track. Judge Wang added a "[sic]" to the quote from Hunt's filing, along with a footnote pointing out that the petition "does not explain how a court's document retention order that directs the preservation, segregation, and retention of certain privately held data by a private company for the limited purposes of litigation is, or could be, a 'nationwide mass surveillance program.' It is not. The judiciary is not a law enforcement agency."
That burn aside, there is still a chance the order will be lifted or changed after OpenAI goes to court this week to push back against it as part of the broader paper war in the lawsuit.
Deleted but not gone
Hunt's other concern is that no matter how this case goes, OpenAI will now have the ability to retain chats users thought were deleted and could use them in the future. There is some doubt about whether OpenAI will lean toward protecting user privacy over legal expediency. So far, OpenAI has argued for that privacy, asking the court for oral arguments to challenge the retention order, which are taking place this week. The company has said it wants to push back hard on behalf of its users. But in the meantime, your chat logs are in limbo.
Many may have felt that typing to ChatGPT is like talking to a friend who can keep a secret. Perhaps more people will now understand that it is still a computer program, and that, much like your browser history and Google search queries, your words are still in there somewhere. At the very least, there will hopefully be more transparency. Even when it is the courts requiring AI companies to retain sensitive data, users should be notified by those companies. We shouldn't have to discover it by chance on a web forum.
And if OpenAI really wants to protect its users, it could start by offering more granular controls: clearly labeled anonymous modes, stronger deletion guarantees, and warnings when conversations are being preserved for legal reasons. Until then, it may be wise to treat ChatGPT a little less like a therapist and a little more like a coworker who might be wearing a wire.