Worldwide desktop and mobile web traffic to the ChatGPT website, chat.openai.com, fell in June by 9.7% from
May, according to preliminary Similarweb data estimates. In the U.S., the month-over-month decline was 10.3%.
The amount of time visitors spent on the website fell 8.5%, but ChatGPT still attracted more worldwide visitors than bing.com, Microsoft’s search engine, or Character.AI, the second-most popular stand-alone AI chatbot site, wrote David Carr, senior insights manager at Similarweb.
"Whether OpenAI management is brokenhearted about the dip in traffic is debatable," Carr wrote. "Initially launched as a technology demo, the ChatGPT website primarily serves as a loss leader generating sales leads for OpenAI, which makes its technology available for other companies to embed in their applications."
Microsoft, as Carr points out, is a major backer and has embedded OpenAI’s GPT-4 algorithm into the Bing chat service offered as part of the search engine.
Unlike Microsoft with Bing, OpenAI is not necessarily set up to run a mass-market, ad-supported website.
ChatGPT's direct revenue comes from subscriptions sold to companies wanting access to the latest version.
Subscribers get GPT-4, for example, while free users get an older version. The sustainability of that as a business model is questionable, given that people can get GPT-4 for free as part of Bing, according to Carr.

Worldwide visits to Character.AI declined 32% month-over-month, although traffic is still up from June 2022, when the company, founded by former Google engineers, was just getting started.
Character.AI, in a "playful" way, takes on the personalities of celebrities, historical figures, and fictional characters.
The website began to grow much more rapidly after ChatGPT attracted more attention to AI chatbots, but its traffic volume also fell in June.
ChatGPT has not been without challenges, from a loophole that allowed users to access publisher content behind paywalls without paying, to lawsuits over the use of copyrighted content to train its models without permission.
Then there is the energy-intensive nature of the technology, which requires massive server farms to train the powerful models. Cooling the servers in those data centers requires large amounts of water.
Research suggests training for GPT-3 consumed 185,000 gallons of water. An average user’s conversational exchange with ChatGPT basically amounts to dumping a large bottle of fresh water out on the ground, according to a new study from researchers at the University of California Riverside and the University of Texas at Arlington.
The artificial intelligence (AI) water-consumption estimates for ChatGPT appear in a paper titled "Making AI Less Thirsty."
The authors found that the amount of clean freshwater required to train GPT-3 is equivalent to the amount needed to fill a nuclear reactor’s cooling tower. The study was first reported by Gizmodo in May.