Sharing sensitive business data with ChatGPT could be risky

The furor surrounding ChatGPT remains at a fever pitch as the ins and outs of the AI chatbot’s potential continue to make headlines. One issue that has caught the attention of many in the security field is whether the technology’s ingestion of sensitive business data puts organizations at risk. There is some fear that if someone inputs sensitive information — quarterly reports, materials for an internal presentation, sales numbers, or the like — and asks ChatGPT to write text around it, anyone could later gain information about that company simply by asking ChatGPT.

The implications could be far-reaching: Imagine working on an internal presentation containing new corporate data that reveals a problem to be discussed at a board meeting. Letting that proprietary information out into the wild could undermine the stock price, consumer attitudes, and client confidence. Even worse, a leaked legal item on the agenda could expose the company to real liability. But could any of this actually happen just from material typed into a chatbot?
