Microsoft will reportedly introduce a version of Bing that leverages ChatGPT to respond to user queries. According to The Information, the company plans to release the new feature before the end of March to make Bing more competitive with Google.
Microsoft chief Satya Nadella, at the company's Leadership Summit held yesterday, said that the company is investing massively, backed by its infrastructure spread across more than 60 regions and over 200 data centres worldwide. On OpenAI, Nadella said that it has been able to achieve amazing results thanks to the training and inference infrastructure provided by Microsoft Azure.
By utilising ChatGPT technology, developed by the AI company OpenAI, Bing could answer queries with more humanlike responses as opposed to just links to information. Although both Google and Bing already surface relevant information via links at the top of many search queries, Google’s knowledge panels are especially popular when looking up information about individuals, locations, businesses, and other entities.
While many accessed ChatGPT out of sheer curiosity, many developers started playing with it, and many side projects were born even though the official API for ChatGPT is not available yet. Soon, they found ways to integrate ChatGPT with WhatsApp, Telegram and other messaging platforms, and even to embed it in the macOS menu bar.
“ChatGPT reflects the emergence of a new reasoning engine, and the ways it can be augmented,” said Nadella, pointing at knowledge workers using it to be more creative, expressive and productive. He also said that frontline workers will be able to do more knowledge work with the help of Copilot. “We also have to consider facets of its responsible use and what displacement it may cause,” added Nadella.
As a potential search engine, ChatGPT doesn’t have a business model yet, and it is very expensive to run. A back-of-the-envelope estimation shows that with one million users, ChatGPT costs around $100,000 per day, or roughly $3 million per month.
Now imagine what happens when people run 8 billion search queries per day. Then add the costs of regularly retraining the model and the manual labor needed to fine-tune it through reinforcement learning from human feedback.
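To make the scale problem concrete, here is a rough extrapolation using the article's $100,000-per-day figure for one million users. The queries-per-user number is a pure assumption for illustration; the real per-query cost is not public.

```python
# Back-of-the-envelope extrapolation of ChatGPT serving costs at search scale.
# Figures from the article: ~$100,000/day for ~1 million users.
DAILY_COST_1M_USERS = 100_000          # USD per day, per the article's estimate
USERS = 1_000_000
QUERIES_PER_USER_PER_DAY = 10          # assumed for illustration, not from the article

# Implied cost per query under these assumptions
cost_per_query = DAILY_COST_1M_USERS / (USERS * QUERIES_PER_USER_PER_DAY)
print(f"Cost per query: ${cost_per_query:.3f}")            # $0.010

# Scale to the ~8 billion daily search queries cited above
SEARCH_QUERIES_PER_DAY = 8_000_000_000
daily_cost_at_search_scale = cost_per_query * SEARCH_QUERIES_PER_DAY
print(f"Daily cost at search scale: ${daily_cost_at_search_scale:,.0f}")   # $80,000,000
```

Even at an assumed one cent per query, serving search-engine volumes would run into tens of millions of dollars per day, before any training or fine-tuning costs.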
The costs of training and running a large language model like ChatGPT are so prohibitive that making it work will be exclusive to big tech companies that can spend large amounts on the growth of unprofitable products with no definite business model.
One possible path to profitability would be to deliver the LLM as a paid API, like Codex and GPT-3. But that is not the traditional business model of search engines, and I’m not sure how they could make it work. Another path would be to integrate it into Microsoft Bing as a question-answering feature, but that would put it on par with Google Search instead of delivering a different system that disrupts the search market.
Is ChatGPT a search engine?
There has been a lot of talk about ChatGPT becoming the omnipotent assistant that can answer any question, which logically leads to the train of thought that it will replace Google Search.
But while having an AI system that can answer questions is very useful (provided that OpenAI solves its problems), it is not all that online search represents. Google Search is flawed. It shows a lot of useless ads. It also often returns a lot of useless results. But it is an invaluable tool.
Most often, when I turn to Google, I don’t even know what the right question is. I just mash together a bunch of keywords, look at the results, do a bit of research, and then narrow down or modify my search. That kind of exploratory search, in my opinion, will not be replaced by even a very effective question-answering model anytime soon.
From the looks of it, ChatGPT and similar LLMs will become complementary to online search engines. And they will probably end up strengthening the position of the current search giants, who have the money, infrastructure, and data to train and run them.