There has been little written about a search query's cost to the environment -- and even less about the cost of running a query through an AI-powered chatbot.
Initial fears of high costs were sparked when Alphabet Chairman John Hennessy told Reuters that “having an exchange with AI known as a large language model likely cost 10 times more than a standard keyword search, though fine-tuning will help reduce the expense quickly.”
Analysts at UBS, after speaking with search and AI experts across the industry, have seen evidence in recent weeks that costs are falling rapidly. OpenAI, for instance, said it had reduced costs for ChatGPT by 90% between December and March.
UBS estimates that for every search query on which Google triggers its AI chatbot Bard, Alphabet will incur $0.003 to $0.028 in incremental costs on top of its base search cost of roughly $0.003 per query.
“We expect Bard to trigger on 10-20% of queries, based on GOOG's comments that it intends to use Bard to answer ‘Not Only Right Answer’ questions and close to the current trigger rate on Featured Snippets - a proxy to a chat-like response box,” UBS analysts wrote. “The implied annualized '23 incremental cost bill is $1.2B-12.6B … , with the largest determining factor being model size … . Our expert checks suggest that over time, the cost to serve AI chatbots in search is likely to be a fraction of the current base cost per search query.”
The cost, which will in part be reflected in future stock prices, will depend on how far Google is able to reduce the model's size.
UBS created three scenarios. One, for example, analyzes the costs if Google reduces the AI model size “to 47% of its existing 137B parameter for full-size LaMDA to 65B parameters.”
That 65-billion-parameter figure matches the largest version of Meta's recently released LLaMA model, which the company is releasing in a range of sizes.
In this scenario, UBS estimates an incremental cost to Alphabet of $0.028 per query, with a 2023 annualized compute bill of $12.6 billion.
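The arithmetic behind these figures can be sketched in a few lines. Note that the annual query volume used below is an assumption: UBS does not publish it, so it is back-solved here from the high-end numbers quoted above (a 20% trigger rate at $0.028 per query yielding roughly $12.6 billion).

```python
# Back-of-the-envelope sketch of the UBS incremental-cost estimate.
# Only queries that trigger Bard incur the AI surcharge, so:
#   annual incremental cost = queries/year x trigger rate x cost/query

def annual_incremental_cost(queries_per_year: float,
                            trigger_rate: float,
                            cost_per_query: float) -> float:
    """Incremental annual cost of serving AI answers on triggered queries."""
    return queries_per_year * trigger_rate * cost_per_query

# ASSUMPTION: hypothetical query volume implied by the high-end estimate;
# not a figure stated in the article.
ASSUMED_QUERIES_PER_YEAR = 2.25e12

# High-end scenario: Bard triggers on 20% of queries at $0.028/query.
high = annual_incremental_cost(ASSUMED_QUERIES_PER_YEAR, 0.20, 0.028)
print(f"${high / 1e9:.1f}B")  # prints "$12.6B"
```

Varying the trigger rate (10-20%) and the per-query cost ($0.003-$0.028) in this formula is what produces the wide $1.2B-$12.6B range, with model size driving the per-query cost.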