ChatGPT macOS Users Surprised by Unencrypted Chat Storage

For anyone who values privacy and security, the recent revelation about ChatGPT's unencrypted chat logs on macOS is deeply concerning. The possibility that anyone with access to a machine could read a user's conversation history without consent is a serious breach of trust, with potentially far-reaching consequences in fields such as cryptocurrency, where confidentiality is paramount.


On macOS, users discovered that their conversation records from ChatGPT were stored as plain-text files, sparking significant privacy worries despite Apple's stringent privacy guidelines.

macOS users raised concerns about how the ChatGPT app stored its conversation logs: rather than being encrypted, the logs were saved as plain-text documents on users' computers, posing a clear security risk.

Pedro Jose Pereira Vieito first brought attention to the issue in a post on Meta's Threads. Because the stored files were unencrypted, users' chat histories were accessible to anyone who gained access to the computer.

According to him, OpenAI chose not to adopt macOS's app sandboxing, and stored the logs in a location not covered by the additional privacy protections Apple introduced in macOS Mojave 10.14, which require explicit user approval before apps can access private information. By disregarding these safeguards, OpenAI bypassed protections intended to shield users' data from unauthorized third-party app access.

The ChatGPT app for macOS became available to subscribers in May, and on June 25 it was opened to non-subscribers as well. Initially, all chat logs were stored as unencrypted plain-text files directly on users' hard drives. By July 5, after the issue came to light, this practice had been changed and the data was secured with encryption.

Every interaction a user has with ChatGPT on their computer could potentially be accessed by others, whether through direct access to the device or indirectly via malware or phishing attacks.
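To illustrate the risk, here is a minimal sketch (illustrative only; the file names and paths below are invented, not the real app's) showing that a chat log saved as plain text can be read back by any code running under the same user account, with no key or permission prompt involved:

```python
import tempfile
from pathlib import Path

# Simulate an app saving a chat log as unencrypted plain text.
log_dir = Path(tempfile.mkdtemp())
log_file = log_dir / "conversation.json"  # hypothetical file name
log_file.write_text('{"role": "user", "content": "my secret passphrase is ..."}')

# Any other process running as the same user -- including malware
# delivered by phishing -- can read the file directly; no decryption
# step is required because none was ever applied.
leaked = log_file.read_text()
print("secret" in leaked)  # → True
```

Encryption at rest, or OS-level isolation such as sandboxing, is precisely what prevents this trivial read.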

"Sandboxing" is a security mechanism in macOS, enforced at the kernel level, that confines an application to its own container so it can access only the data and resources it has been explicitly granted; apps distributed through the App Store are required to adopt it. Sandboxing does not itself encrypt data, but it isolates each app's files so that other applications cannot read them without user approval, ensuring sensitive information is not inadvertently exposed.

It is currently unclear whether any users were actually affected, but the discovery prompted widespread astonishment among social media users and commentators.

In the comments section of an article on The Verge, a user named GeneralLex described dumping the ChatGPT executable from their computer's memory using Activity Monitor. To their shock, the chat logs sat in memory as plain, unencrypted text.
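Finding readable text in a memory dump works much like the Unix `strings` utility: scan the raw bytes for runs of printable characters. A minimal sketch of that idea (not the commenter's actual procedure) is:

```python
def extract_strings(data: bytes, min_len: int = 4) -> list[str]:
    """Return runs of printable ASCII characters of at least min_len,
    similar to what the Unix `strings` utility does."""
    results, current = [], []
    for byte in data:
        if 32 <= byte < 127:          # printable ASCII range
            current.append(chr(byte))
        else:
            if len(current) >= min_len:
                results.append("".join(current))
            current = []
    if len(current) >= min_len:
        results.append("".join(current))
    return results

# A mock memory dump: binary noise with a plain-text chat fragment inside.
dump = b"\x00\x01\xffUser: hello, ChatGPT\x00\x02\x7f\x90ok\x00"
print(extract_strings(dump))  # → ['User: hello, ChatGPT']
```

If the logs had been encrypted in memory or on disk, such a scan would turn up only unreadable ciphertext rather than the conversation itself.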

This incident underscores the importance of strong privacy safeguards in AI applications and third-party integrations on macOS. Although steps have been taken to resolve the problem, questions remain about OpenAI's data-security practices and how user privacy will be protected in the long run.

2024-07-05 23:32