Microsoft on Tuesday announced a more secure version of its AI-powered Bing designed specifically for businesses, aiming to assure professionals they can safely share potentially sensitive information with a chatbot.

With Bing Chat Enterprise, the user's chat data will not be saved, sent to Microsoft's servers or used to train the company's AI models.

"What this [update] means is your data doesn't leak outside the organization," Yusuf Mehdi, Microsoft's vice president and consumer chief marketing officer, told CNN in an interview. "We don't co-mingle your data with web data, and we don't save it without your permission. So no data gets saved on the servers, and we don't use any of your data chats to train the AI models."

Since ChatGPT launched late last year, a new crop of powerful AI tools has offered the promise of making workers more productive. But in recent months, some businesses have restricted the use of such tools among their employees, citing security and privacy concerns. Other large companies have taken similar steps over concerns around sharing confidential information with AI chatbots.

In April, regulators in Italy issued a temporary ban on ChatGPT in the country after OpenAI disclosed a bug that allowed some users to see the subject lines from other users' chat histories. The same bug, now fixed, also made it possible "for some users to see another active user's first and last name, email address, payment address, the last four digits (only) of a credit card number, and credit card expiration date," OpenAI said in a blog post at the time.

Like other tech companies, Microsoft is racing to develop and deploy a range of AI-powered tools for consumers and professionals amid widespread investor enthusiasm for the new technology. Microsoft also said Tuesday that it will add visual searches to its existing AI-powered Bing Chat tool. And the company said Microsoft 365 Copilot, its previously announced AI-powered tool that helps edit, summarize, create and compare documents across its various products, will cost $30 a month for each user.

Bing Chat Enterprise will be free for all of its 160 million Microsoft 365 subscribers starting on Tuesday, if a company's IT department manually turns on the tool. After 30 days, however, Microsoft will roll out access to all users by default; subscribed businesses can disable the tool if they so choose.

RETHINKING AI CHATBOTS FOR THE WORKPLACE

Current conversational AI tools, such as the consumer version of Bing Chat, send data from personal chats to their servers to train and improve their AI models.

Microsoft's new enterprise option is identical to the consumer version of Bing Chat, but it will not recall conversations with users, so they'll need to start from scratch each time. (Bing recently started to enable saved chats on its consumer chat model.)

With these changes, Microsoft, which uses OpenAI's technology to power its Bing chat tool, said workers can have "complete confidence" their data "won't be leaked outside of the organization."

To access the tool, a user will sign in to Bing with their work credentials, and the system will automatically detect the account and put it into a protected mode, according to Microsoft. Above the "ask me anything" bar, a message reads: "Your personal and company data are protected in this chat."

In a demo video shown to CNN ahead of its launch, Microsoft showed how a user could type confidential details into Bing Chat Enterprise, such as someone sharing financial information as part of preparing a bid to buy a building. With the new tool, the user could ask Bing Chat to create a table comparing the property to other neighboring buildings and write an analysis highlighting the strengths and weaknesses of their bid relative to other local bids.

In addition to trying to ease privacy and security concerns around AI in the workplace, Mehdi also addressed the problem of factual errors. To reduce the possibility of inaccuracies, or "hallucinations," as some in the industry call them, he suggested users write clearer, more specific prompts and check the included citations.