
Last week, OpenAI CEO Sam Altman said in a podcast that ChatGPT does not provide legal privilege or ‘legal confidentiality’, and that ‘we haven’t figured that out yet’. Understandably, that has freaked out some lawyers, both in terms of privacy and the potential exposure to subpoenas for information shared with the LLM. How big a problem is this?
First, let’s look at what he said.
In the transcript (via TechCrunch – see link) Altman said: ‘People talk about the most personal [stuff] in their lives to ChatGPT. People use it – young people, especially, use it – as a therapist, a life coach; having these relationship problems and [asking] ‘what should I do?’
‘And right now, if you talk to a therapist or a lawyer or a doctor about those problems, there’s legal privilege for it. There’s doctor-patient confidentiality, there’s legal confidentiality, whatever. And we haven’t figured that out yet for when you talk to ChatGPT.’
‘I think that’s very screwed up. I think we should have the same concept of privacy for your conversations with AI that we do with a therapist or whatever – and no one had to think about that even a year ago.’
–
Is This a Problem?
Well, who better to ask than ChatGPT itself? First, the subpoena issue.
This is what it told AL. (Note: the following are unedited, verbatim responses from ChatGPT – and naturally this should not be seen as legal advice 🙂 )
‘Yes, information shared with ChatGPT can potentially be subpoenaed, but only under very specific circumstances. Here’s a breakdown of what that means in practice:
1. What OpenAI Says About Data Use
OpenAI, the company behind ChatGPT, has a strong privacy policy. According to its privacy policy, OpenAI does not store or use your conversations to train its models if you’re using ChatGPT with a paid subscription and chat history disabled.
2. Legal Subpoenas
A subpoena is a legal document that compels an individual or organization to provide evidence or testify. In theory:
If OpenAI stores your data, it could be compelled to hand it over under a subpoena or court order.
If data is not stored, there is nothing to subpoena.
ChatGPT chats (especially with chat history off) are not publicly available, and OpenAI does not sell your data. But stored conversations (with history on) could be legally accessible in very rare and serious cases (e.g., criminal investigations).
3. Best Practices for Confidentiality
If you’re sharing sensitive, personal, or legally privileged information:
Turn off chat history before starting the conversation.
Do not treat ChatGPT like a lawyer, doctor, or therapist—it does not have legal privilege or confidentiality protections like those professions do.
If you’re handling regulated or confidential data (e.g. client legal info), do not input it into ChatGPT without safeguards.
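To make the ‘safeguards’ point concrete, here is a minimal, hypothetical sketch of one such safeguard: pseudonymising client-identifying details before any text leaves your own environment. The patterns and example text below are invented for illustration – real matter data would need far more robust, professionally reviewed tooling, and this is not legal or security advice.

```python
import re

def redact(text: str) -> str:
    """Crude pseudonymisation sketch: mask email addresses and
    capitalised name pairs before text is sent to an LLM.
    Illustrative only - not a substitute for proper redaction tools."""
    # Replace anything shaped like an email address first.
    text = re.sub(r"[\w.+-]+@[\w-]+\.[\w.]+", "[EMAIL]", text)
    # Then replace simple "Firstname Lastname" patterns.
    text = re.sub(r"\b[A-Z][a-z]+ [A-Z][a-z]+\b", "[NAME]", text)
    return text

prompt = "Our client, Jane Doe (jane.doe@example.com), disputes clause 4."
print(redact(prompt))
# → Our client, [NAME] ([EMAIL]), disputes clause 4.
```

Even a rough filter like this reduces what an outside service ever sees, though it plainly will not catch everything – which is exactly why the ‘check with someone who knows’ caveat applies.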
TL;DR:
Can a court subpoena ChatGPT data? Technically yes, but only if it exists and is relevant.
Does OpenAI store your chats? Not if chat history is off or you’re using certain paid plans with data controls.
Should you put confidential legal info in ChatGPT? No, unless you’re confident in the data protection settings and are not risking client confidentiality.
—
So, basically it’s saying: yes, it could be a risk, but only if the data is actually still there, which it may well not be.
—
And one more question: is the lack of legal privilege also a wider risk for lawyers using ChatGPT? Here’s what it told AL. (Note: the following are unedited, verbatim responses from ChatGPT – and naturally this should not be seen as legal advice 🙂 )
‘Great question – the short answer is: no, you should not put truly confidential or sensitive information into ChatGPT, unless you’re using a version of the service designed specifically for that level of data privacy (e.g. OpenAI’s enterprise-grade offerings).
Here’s the breakdown:
Standard ChatGPT (Free or Plus)
Not secure for confidential legal, personal, or client data.
Your data is not used to train the model (if chat history & training is off), but it can still be stored temporarily and reviewed in rare cases for abuse monitoring or quality assurance.
No encryption at rest specifically designed for sensitive data.
No legal privilege or compliance with data protection regulations like GDPR/CCPA for sensitive information.
So: Do not share names, addresses, contracts, or client details in the normal consumer version of ChatGPT.
ChatGPT Enterprise or API (Business Use)
If you’re building a legal tech product or using ChatGPT in a professional legal environment, OpenAI offers:
Enterprise ChatGPT or API access
No data ever stored or used for training
Data encryption in transit and at rest
SOC 2 compliance and better audit controls
Can be integrated into your own secure environment
This setup can be safe for confidential information — if your overall system is secure and compliant.’
—
So, as we’ve explored before, using a ‘raw’ LLM is a bad idea, even if OpenAI will most likely never bother to look at your specific inputs. But if you are using an enterprise approach, then you appear to be on safer ground. Even so…always check with someone who knows…!
As to the subpoena point, even ChatGPT says that it’s possible, but that the circumstances in which that would actually happen are rare. Or so it claims.
Of course – these points are made by the LLM in question, and thus should A) not be seen as legal advice, and B) be taken with a pinch of salt, given that this is the ‘accused’ defending itself.
–
What do you think?
—
(Main pic: Sam Altman, via Wikimedia Commons – CC credit: By TechCrunch – TechCrunch Disrupt San Francisco 2019 – Day 2, CC BY 2.0.)