Built on ChatGPT technology, Bing Chat has fallen into the same pitfalls, tending to veer off topic and slip into adversarial exchanges. With no immediate fix available, Microsoft introduced automatic filters to detect and block Bing Chat queries touching on controversial terms or topics, frustrating many users who were not deliberately probing the limits of the current technology.
The latest update, Bing Chat v98, tackles the problem in two stages. The first aims to reduce the cases where Bing flatly refuses an incoming request, such as queries asking it to parse or generate executable code. The second aims to avoid situations where Bing abruptly dodges or shuts down discussions of sensitive topics, such as those with racist connotations or sexual content. Without claiming a definitive fix, Microsoft says the new Bing release will reduce the number of rejected queries while ensuring that the queries it does accept are not problematic for the company.
A further adjustment targets the level of engagement with users: Bing Chat will become friendlier, even if not necessarily more helpful in the answers it provides.
In its own defense, Microsoft argues that much of the controversy over Bing's less-than-ideal responses stemmed from use of the service in so-called Creative Mode, which is specifically designed to creatively "relax" the limits of its artificial intelligence algorithms.