
Sam Altman Warns: ChatGPT Chats Lack Legal Privacy Protection

ChatGPT is not a therapist, doctor, or lawyer, so the conversations you have with it do not enjoy the privacy protections extended to medical, legal, or psychotherapeutic communications. The fact that sensitive chats can be accessed in lawsuits raises serious concerns about user data safety and the growing overdependence on AI for personal advice.

Sam Altman, CEO of OpenAI, has warned about a potentially harmful use of ChatGPT: conversations about emotional, medical, or legal matters. In a recent interview with Theo Von on the This Past Weekend podcast, Altman said something that may concern you: messages to ChatGPT are not legally confidential.

At present, conversations with a therapist or a lawyer are covered by legal privilege. According to Altman, no equivalent protection yet exists for AI tools like ChatGPT. This means anything you share with ChatGPT, even your most sensitive information, can be subpoenaed and produced in court.

Altman noted that OpenAI is already facing a demand to retain and potentially hand over chat logs in its ongoing legal dispute with The New York Times. Although the company resists such requests where it can, no legal protection currently prevents it from turning over private conversations when a court orders it to.

As more users, particularly younger generations, turn to ChatGPT for emotional support and decision-making, Altman is growing worried. Some people describe ChatGPT as their best friend: they tell it everything and do whatever it suggests, as they would with a close friend. At a recent Federal Reserve event, he remarked, “That feels very unpleasant to me.”

He said that although AI models may sound helpful, they are not trained or certified mental health professionals. Real doctors, lawyers, and therapists are licensed, regulated, and accountable to professional bodies. AI models offer none of that accountability; they are built to respond like humans and keep users engaged. This can be dangerous if people start trusting them too much.

Beyond privacy, Altman worried that AI could expand government surveillance. Although he accepts some limits on privacy for the public good, he is concerned about abuse.

Conclusion

As ChatGPT becomes a bigger part of daily life, users should watch what they share. Chats with AI are not legally confidential and could be used against you in court. Sam Altman is clear that ChatGPT is a powerful technology, but he warns that it should not be treated as a substitute for trained professionals on mental health, legal, or personal matters. Until stronger privacy laws emerge, think twice before sharing secrets with a chatbot.

Satpal S
Satpal is an Editor and Author at 4C Media Co, specializing in all stories and news related to crypto and finance.

