Microsoft Reportedly Bringing ChatGPT Features Into Bing

Microsoft’s $1 billion investment in OpenAI, the company behind ChatGPT, could pay off sooner than expected.

Reports suggest that Microsoft is developing a version of its Bing search engine that uses the artificial intelligence behind ChatGPT, which gives human-like text answers to questions. The goal is to answer some search queries directly rather than show a list of links.

Microsoft has an exclusive relationship with OpenAI. In addition to the billion-dollar investment, it holds a license to GPT-3, OpenAI’s text-generating language model.

OpenAI CEO Sam Altman cautioned in early December 2022 against using ChatGPT for anything important in its early stages. He tweeted that there are concerns about it creating a “misleading impression of greatness,” called it a “preview of progress,” and wrote that “we have lots of work to do on robustness and truthfulness.”

The two companies may be working on ways to reduce or eliminate bias and improve truthfulness. Search & Performance Marketing Daily reached out to Microsoft, but the company has not yet commented.

An older version of GPT is already used for search-query suggestions as someone types. Microsoft announced plans in October to integrate DALL-E 2 into Bing Image Creator. Brands such as Stitch Fix, Nestlé and Heinz are said to have piloted DALL-E 2 for ad campaigns and other commercial uses, while some architectural firms have used DALL-E 2 and similar tools to conceptualize new buildings.

The rise of large language models, the AI technology that powers ChatGPT, could reshape a search industry now dominated by Google. One report suggests that Google has been working on similar technology. Google CEO Sundar Pichai has reportedly grown concerned in recent weeks, declaring ChatGPT a "code red" moment for the company.

ChatGPT can quickly deliver direct answers rather than pages of links, which is reportedly a concern for Google, according to CNET, citing The New York Times.

Part of the concern is that those answers are based on human-created data available online. That means bias and misinformation can creep into a chatbot's learning model, producing answers no advertiser wants to see.

The technology could be useful, but expensive to run, according to one estimate.

“I estimate the cost of running ChatGPT is $100K per day, or $3M per month,” Tom Goldstein, associate professor at the University of Maryland, tweeted in early December. “This is a back-of-the-envelope calculation. I assume nodes are always in use with a batch size of 1. In reality they probably batch during high volume, but have GPUs sitting fallow during low volume.”
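Goldstein's figure comes from simple scaling: take an hourly price for GPU hardware, assume it runs around the clock, and multiply out to a day and a month. The short Python sketch below reproduces that arithmetic; the instance count and hourly rate are hypothetical placeholders chosen only to show how numbers of that size arise, not values from his thread.

```python
# Illustrative back-of-the-envelope estimate of serving costs.
# The instance count and hourly rate are hypothetical placeholders,
# not figures from Goldstein's calculation.

HOURLY_RATE_PER_INSTANCE = 25.0  # assumed cloud price for one multi-GPU instance (USD/hour)
NUM_INSTANCES = 170              # assumed number of instances running around the clock

daily_cost = HOURLY_RATE_PER_INSTANCE * NUM_INSTANCES * 24
monthly_cost = daily_cost * 30

print(f"Daily cost:   ${daily_cost:,.0f}")    # ~$102,000 with these placeholder numbers
print(f"Monthly cost: ${monthly_cost:,.0f}")  # ~$3.1 million
```

With any comparable set of assumptions, a fleet of GPU instances kept busy 24 hours a day lands in the range Goldstein cites: roughly $100K per day, or about $3M per month.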
