ChatGPT has shifted the paradigm of what artificial intelligence can achieve and is extremely useful for a wide variety of tasks. However, many of its millions of users forget that when they ask the chatbot to summarize important notes or check their work for errors, it can use that information to train its models and may even surface it in other users’ responses. This risk has come to light in a recent case involving workers at Samsung.
As The Economist Korea reports (via Mashable), several Samsung employees were unaware of this detail before sharing confidential company information with ChatGPT. Shortly after the company’s semiconductor division allowed its engineers to use the chatbot, workers leaked secret data to it on at least three occasions.
Problems with ChatGPT and corporate confidentiality
Apparently, one employee asked ChatGPT, OpenAI’s chatbot, to check for errors in the source code of a confidential database; another requested the optimization of some code; and a third fed a recording of a confidential meeting into the platform and asked it to generate a document based on it.
Reports suggest that, after learning of these leaks, Samsung attempted to limit the scope of future incidents by restricting the length of employees’ requests to ChatGPT to one kilobyte, or 1,024 characters of text. The company is also said to be investigating the three employees in question and has even decided to build its own chatbot to avoid mishaps of this magnitude.
A delicate situation involving ChatGPT
ChatGPT’s data policy states that, unless users explicitly opt out, the system can use their messages to train its language models. OpenAI, the company behind the chatbot, urges users not to share sensitive information with ChatGPT in conversations, as it “cannot remove specific messages from the history”.
The platform’s support documentation indicates that the only way to remove personal information from ChatGPT is to delete the account, a process that can take up to four weeks. The case of the Samsung employees is yet another example of why it is important to be cautious when using chatbots, a caution every user should extend to all their online activity, since one never really knows where this data will end up.