Just how private are your ChatGPT conversations? Perhaps not as private as you expect

Encryption-level privacy could come to your AI chats soon

Quick Summary

An update to ChatGPT could see temporary chats with the AI locked behind a layer of secure encryption.

This is a potential win for privacy but could also pose challenges for law enforcement authorities seeking access.

Current temporary chats with OpenAI's ChatGPT are not as private as they could be – but that could be about to change.

In a recent podcast appearance, OpenAI's CEO, Sam Altman, announced plans to add a layer of encryption to temporary chats, which could offer even more privacy to users.

That's a win for privacy, but it could prove problematic if law enforcement needs access to any useful data. Currently, ChatGPT stores even temporary conversations for 30 days before deleting them.

When you combine that with the fact that the company is legally bound, in some jurisdictions at least, to hand over chat records on request, that leaves a pretty sizeable privacy hole.

How is ChatGPT privacy changing?

The plan to fill that privacy hole, according to Altman, is to add a layer of encryption to temporary chats. That means, should OpenAI be forced by law to hand over chat records from its ChatGPT database, those records would be encrypted and, theoretically, unreadable.

While current chats can be quite personal and private, involving medical details, legal challenges, work-related information and more, they're not as secure as people might think.

Using temporary chat mode, where the AI does not remember the information or use it for machine learning, might seem private – a bit like using an incognito tab in a browser. But in reality, those conversations are far more exposed than that.


Speaking on the podcast about this issue, Altman added: "So, if you go talk to ChatGPT about your most sensitive stuff and then there’s like a lawsuit or whatever, like we could be required to produce that. And I think that’s very screwed up."

Talking about using ChatGPT for private conversations, Altman previously pointed out the struggle OpenAI faces: "Right now, if you talk to a therapist or a lawyer or a doctor about those problems, there’s legal privilege for it. There’s doctor-patient confidentiality, there’s legal confidentiality, whatever. And we haven’t figured that out yet for when you talk to ChatGPT."

This is one reason several companies have ring-fenced their own AI chatbots. Lumo, for example, uses Proton's privacy and encryption features to ensure conversations stay protected.


On the subject, Axios reports that Altman told reporters: "We’re, like, very serious about it."

It's worth noting that the 30-day storage of chats appears to be OpenAI policy, cited simply as part of the company's "safety" requirements.

Locking temporary chats behind a layer of encryption may not change that 30-day policy, but it should mean privacy is maintained while the data sits in that locked state.

Encrypting chat data means only the person with the key to decrypt it can access it. Who holds that key, and what could compel them to unlock and share it, are less clear at this early stage.
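The principle is simple to sketch. Here's a toy illustration in Python (illustrative only – this is not real-world cryptography, and nothing here reflects how OpenAI would actually implement encryption): data combined with a secret key becomes unreadable in storage, and only someone holding that same key can recover it.

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    # Repeat the key across the message and XOR byte-by-byte.
    # Toy example only; real systems use vetted ciphers like AES.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

message = b"temporary chat contents"
key = secrets.token_bytes(32)  # only the key holder can reverse this

ciphertext = xor_bytes(message, key)
assert ciphertext != message                   # stored form is unreadable
assert xor_bytes(ciphertext, key) == message   # same key recovers it
```

The open question the article raises maps directly onto this sketch: whoever holds `key` – the user, OpenAI, or both – determines who can be compelled to unlock the data.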

It's also worth noting that no timeline or solid plans have been revealed at this point.

Luke is a freelance writer for T3 with over two decades of experience covering tech, science and health. Among many things, Luke writes about health tech, software and apps, VPNs, TV, audio, smart home, antivirus, broadband, smartphones and cars. In his free time, Luke climbs mountains, swims outside and contorts his body into silly positions while breathing as calmly as possible.
