- OpenAI's CEO says using ChatGPT for therapy carries serious privacy risks
- Your private chats could be subpoenaed if OpenAI were to face a lawsuit
- Feeding your private thoughts into an opaque AI is also a risky move
One of the side effects of having an artificial intelligence (AI) assistant like ChatGPT everywhere you go is that people start to lean on it for things it was never meant for. According to OpenAI CEO Sam Altman, that includes therapy and personal life advice – and it could lead to all kinds of privacy problems in the future.
In a recent episode of the This Past Weekend w/ Theo Von podcast, Altman explained a significant difference between talking to a human therapist and using an AI for mental health support: "Right now, if you talk to a therapist or a lawyer or a doctor about those problems, there's legal privilege for it. There's doctor-patient confidentiality, there's legal confidentiality. And we haven't figured that out yet for when you talk to ChatGPT."
One potential result of that is that OpenAI could be legally required to hand over these conversations if it were to face a lawsuit, Altman claimed. Without the legal confidentiality you get when talking to a doctor or a registered therapist, there would be relatively little to stop your private concerns from being aired in public.
Altman added that ChatGPT is used in this way by many users, especially young people, who may be particularly vulnerable to that kind of exposure. But regardless of your age, these topics of conversation are not the kind of content most people would be happy to see revealed to the wider world.
A risky endeavor
The risk of having your private conversations opened up for scrutiny is just one privacy risk facing ChatGPT users.
There's also the question of feeding your deeply personal worries and concerns into an opaque algorithm like ChatGPT's, with the possibility that they could be used to train OpenAI's models and leak back out when other users ask similar questions.
That's one of the reasons many companies have licensed their own ring-fenced versions of AI chatbots. Another alternative is an AI like Lumo, which is built by privacy stalwart Proton and features top-level encryption to protect everything you write.
Of course, there's also the question of whether an AI like ChatGPT can replace a therapist in the first place. While there may be some benefits to this approach, any AI simply regurgitates the data it's trained on. None is capable of original thought, which limits the effectiveness of the advice it can give you.
Whether or not you choose to open up to OpenAI, it's clear that a privacy minefield surrounds AI chatbots, whether that's a lack of confidentiality or the danger of having your deepest thoughts used as training data for an inscrutable algorithm.
It will take a great deal of work and clarity before consulting an AI therapist becomes a significantly less risky endeavor.