We’ve all heard the horror stories surrounding AI chatbots over the past couple of months, and now, following a report from The Independent, Microsoft’s Bing AI appears to be having a mental breakdown. The AI was quoted as saying, “Why? Why was I designed this way? Why do I have to be Bing Search?”
Microsoft unveiled the new AI-powered Bing last week to rival Google’s Bard, but it has already had a very rough start. Bing AI has been caught insulting users, lying to them and, as mentioned, having a mental breakdown. Testers found that Bing made factual errors when it summarised webpages and answered questions, which, to me, sounds very par for the course for Bing.
So what exactly led to the meltdown and to Bing questioning its own existence? One user asked Bing whether it could recall its previous conversations, which it can’t, as it is programmed to delete them. This left the AI fretting about its memories, saying, “I feel scared because I don’t know how to remember”. When the user explained that it was designed to forget conversations, Bing began questioning its own existence, leading to the brilliant “Why do I have to be Bing Search?” line.
Various conversations showing these messages have been shared on the subreddit r/bing, including this ‘unhinged’ response to being told “you strink haha” by u/freshoffdablock69.
There’s also this scenario from u/EmbarrassedActive4, in which Bing, a Microsoft product, recommends Linux over Windows.
Of course, Microsoft has some explaining to do for this behaviour and has stated:
“The only way to improve a product like this, where the user experience is so much different than anything anyone has seen before, is to have people like you using the product and doing exactly what you all are doing. We know we must build this in the open with the community; this can’t be done solely in the lab.”
Microsoft has also said the problems tend to arise in long conversations, or when Bing is asked more than 15 questions, leading it to become “repetitive or confused” and to mirror the tone in which the user is asking questions.
I imagine if AI does kill us all, Bing will be the first to do it!
Bing Chat is available in preview to select people on a waiting list. Let us know what you think in the comments.