Samsung bans employees from using ChatGPT

It’s not clear what code is in question.

Samsung Electronics is banning employees from using popular generative AI tools such as ChatGPT after employees were found to have uploaded confidential source code to the platform in violation of regulations. A Samsung spokesperson declined to comment on the code in question.

The company is concerned that data sent to such AI platforms is stored on external servers, making it difficult to retrieve or delete and raising the risk that it could eventually be exposed to other users.

Last month, Samsung surveyed employees on their use of artificial intelligence tools; 65% of respondents said such services pose a security risk.

Interest in generative AI platforms like ChatGPT is growing both inside and outside the company. While this interest is focused on the usefulness and effectiveness of these platforms, there is also growing concern about the security risks associated with generative AI.

Samsung’s new rules prohibit the use of generative AI systems on company-owned computers, tablets, and phones, as well as on internal networks. They do not affect the company’s devices sold to consumers, such as Android smartphones and Windows laptops.

Samsung has asked employees who use ChatGPT and other similar tools on personal devices not to provide any company-related information or personal data that could reveal its intellectual property. The company warned employees that violating the new policy could result in dismissal.

“We ask that you strictly adhere to our security guidelines. Failure to do so may result in the leakage or compromise of company information and could lead to disciplinary action, up to and including termination of employment.”
