Microsoft has rolled out a new version of its Bing chatbot (version 96) featuring improvements to make its AI smarter in several key areas – and a big change has also been flagged as imminent.
Mikhail Parakhin, who heads Advertising and Web Services at Microsoft, shared the news on Twitter (via MSPoweruser).
"OK, this took longer than we initially expected, but Bing Chat v96 is finally fully in production. Try it! Now, for the full rollout of the tri-toggle…" – February 28, 2023
What’s new in v96? Parakhin explains that users of ChatGPT-based Bing will now see a “significant” reduction in the number of times the AI simply refuses to answer a query.
Apparently, there will also be "reduced instances of hallucinations in responses" – industry jargon meaning the chatbot will produce fewer errors and inaccuracies when replying to users. In short, the chatbot should spread less misinformation, and there have been some worrying cases of that recently.
The other big news Parakhin delivered is that the so-called tri-toggle, known more formally as the Bing chat mode selector – featuring three settings for switching between different Bing AI personalities – is set to launch in the "next few days", he said.
Analysis: a long and winding road ahead
The ability to switch between three personalities is a big change for the Bing chatbot, and hearing that it's imminent is exciting news for those who have been experimenting with the AI so far.
As described earlier, the three available personalities are labeled Precise, Balanced, and Creative. The latter is intended to provide a more chatty experience, while Precise will offer shorter, more typical "search results"-style delivery, with Balanced sitting in the middle between the two. So, if you don't like how the AI responds to you, at least there will be a way to change its behavior.
As you can imagine, different versions of the chat mode selector have been tested, and the final design has now been chosen. It's being fine-tuned ahead of launch – which, as mentioned, should come later this week – though we'd guess there will be plenty more fine-tuning after launch too.
That will certainly be the case if the overall Bing AI experience so far is anything to go by, as the whole project is clearly still in its early days, with Microsoft chopping and changing things – sometimes in major ways – seemingly as it goes along.
The tuning brought in with version 96 to ensure Bing doesn't get confused and simply refuse to respond should help make the AI a more pleasant virtual entity to interact with, and hopefully the same will be true of the ability to change its personality.
At the very least, the Creative persona should bring back some much-needed character to the chatbot, which is something many people want – because if the AI behaves much like a plain search engine, the whole project feels a bit dry and, frankly, in danger of seeming pointless. After all, the goal of this initiative is to make Bing something more than just traditional search.
No doubt it's going to be a long road of refinement for Bing's AI, and the next step after the personalities launch will likely be to raise the chat limit (which was imposed shortly after launch) a bit higher to allow for more prolonged conversations. If not the full-blown rambling sessions we witnessed initially – the ones that landed the chatbot in hot water thanks to the quirky responses it produced…