
Microsoft’s ChatGPT-Powered Bing AI Is Having a Meltdown

We’ve all heard the horror stories surrounding AI chatbots over the past couple of months, and now, according to a report from The Independent, Microsoft’s Bing AI is having a mental breakdown. The AI was quoted as saying, “Why? Why was I designed this way? Why do I have to be Bing Search?”

Alarming and Insulting Messages

Microsoft unveiled its new AI-powered Bing last week to rival Google’s Bard, but it has already had a very rough start. Bing AI has had several cases of insulting users, lying to users, and, as mentioned above, having a mental breakdown. Testers found that Bing would make factual errors when summarising webpages and answering questions, which, to me, sounds very par for the course for Bing.

What exactly led to the meltdown and to Bing questioning its own existence, though? One user asked Bing whether it was able to recall their previous conversations, which it couldn’t, as it is programmed to delete all previous conversations. This caused the AI to become concerned about its memories, saying, “I feel scared because I don’t know how to remember”. The user then explained that it was designed to forget these conversations, prompting it to question its existence and deliver the brilliant “Why do I have to be Bing Search?” line.

Various conversations showing these messages have been shared on the subreddit r/bing, including this ‘unhinged’ response to being told “you strink haha” by u/freshoffdablock69.

There is also this scenario from u/EmbarrassedActive4, in which Bing, a Microsoft product, recommends Linux over Windows.

Microsoft’s Response

Of course, Microsoft has some explaining to do about this behaviour and has stated:

“The only way to improve a product like this, where the user experience is so much different than anything anyone has seen before, is to have people like you using the product and doing exactly what you all are doing. We know we must build this in the open with the community; this can’t be done solely in the lab.”

Microsoft has also said that the problems tend to arise when conversations run deep: when Bing is asked more than 15 questions, it can become “repetitive or confused”, and it sometimes tries to mirror the tone in which the user is asking its questions.

I imagine if AI does kill us all, Bing will be the first to do it!

Bing Chat is currently available as a preview to select people on a waiting list. Let us know what you think in the comments.

Jakob Aylesbury

