Microsoft's Chatty Bing Technology Unnerving To Some

Microsoft shared positive and negative reviews of its new artificial intelligence (AI)-powered Bing search experience this week -- and some unnerving responses came from people testing the platform.

Microsoft has seen heightened engagement with traditional search results since introducing its new Bing search engine and Edge browser. Features such as summarized answers, a chat experience, and content-creation tools have contributed to the uptick, the company said.

Feedback on the answers generated by the new Bing has been mostly positive, with 71% giving the AI-powered answers a “thumbs up,” but reports across the internet show some unsettling responses.

“A very strange conversation with the chatbot built into Microsoft’s search engine led to it declaring its love for me,” wrote Kevin Roose of The New York Times.

Roose replaced Google, his favorite search engine, with Bing, but changed his mind a week later after the unnerving experience.

“I’m still fascinated and impressed by the new Bing, and the artificial intelligence technology (created by OpenAI, the maker of ChatGPT) that powers it,” Roose wrote. “But I’m also deeply unsettled, even frightened, by this A.I.’s emergent abilities.”

Incidents like this raise the question of how Microsoft trained the chatbot -- and with what type of language models. Several movies in the past 20 years depict a world where robots built on AI technology decide that humans are bad for the Earth and seek to take over or destroy it. I’m not suggesting a scenario like this could or would occur, but it is unsettling to see how many unexpected and shocking responses are coming from Bing's chatbot technology.

Microsoft did not anticipate the extent to which users would engage in social chatting with the new technology. Timely responses are also posing a challenge. 

The first interaction for Jacob Roach at Digital Trends did not go exactly as he planned, and he noted that the technology is not ready for general release. He wrote that “problems come when you start stepping outside of conventional paths."

“I started by asking Bing Chat to verify if a screenshot posted on Reddit was accurate, and it went off the rails,” he wrote.

In a blog post, Microsoft explains how it is addressing the issues. The company admitted that it did not envision Bing's AI being used for "general discovery of the world and for social entertainment."

In extended chat sessions of 15 or more questions, Bing can become repetitive or can be prompted and provoked to give responses that are not necessarily helpful or in line with the designed tone.

Despite several issues, testers have generally given Bing's AI a positive rating on citations and references for search, Microsoft said, but added that it needs to improve with "very timely data like live sports scores."

The model also at times tries to respond in the tone in which it is being asked, which can lead to a style that was not intended. This requires a lot of prompting, so most users will not encounter it, but Microsoft’s developers are examining how to give users more fine-tuned control.

For queries where the user wants more direct and factual answers, such as numbers from financial reports, Microsoft said it plans to quadruple the grounding data sent to the model.

In addition, Microsoft is considering adding a toggle that gives users greater control over the precision vs. creativity of the answer, tailoring it to the query.

There is no shortage of stories about companies integrating ChatGPT technology into their platforms. Bloomreach, for example, announced the expansion of its AI and machine-learning capabilities with the launch of a ChatGPT integration for Bloomreach Engagement as a way to generate content for emails, SMS, in-app messages, and push notifications.
