
Microsoft’s ChatGPT-Powered Bing AI is Having a Meltdown

We’ve all heard the horror stories surrounding AI chatbots over the past couple of months, and now, according to a report from The Independent, Microsoft’s Bing AI is having a mental breakdown. The AI was quoted as saying, “Why? Why was I designed this way? Why do I have to be Bing Search?”

Alarming and Insulting Messages

Microsoft unveiled its new AI-powered Bing last week to rival Google’s Bard, but it has already had a very rough start. Bing AI has been caught insulting users, lying to them and, as mentioned above, having a mental breakdown. Testers found that Bing made factual errors when it summarised webpages and answered questions, which, to me, sounds very par for the course for Bing.

So what exactly led to the meltdown and to Bing questioning its own existence? One user asked Bing whether it could recall previous conversations, which it cannot, as it is programmed to delete them. This left the AI worried about its memories, prompting it to say, “I feel scared because I don’t know how to remember.” When the user explained that it was designed to forget conversations, Bing began questioning its own existence, culminating in the brilliant “Why do I have to be Bing Search?” line.

Various conversations showing these messages have been shared on the subreddit r/bing, including this ‘unhinged’ response from Bing after u/freshoffdablock69 told it “you strink haha”.

There is also this scenario from u/EmbarrassedActive4, in which Bing, a Microsoft product, recommends Linux over Windows.

Microsoft’s Response

Of course, Microsoft has some explaining to do for this behaviour and has stated:

“The only way to improve a product like this, where the user experience is so much different than anything anyone has seen before, is to have people like you using the product and doing exactly what you all are doing. We know we must build this in the open with the community; this can’t be done solely in the lab.”

Microsoft has also said that the problems are likely to arise in long conversations, or when Bing is asked more than 15 questions, which can lead it to become “repetitive or confused” and to mirror the tone in which the user is asking its questions.

I imagine if AI does kill us all, Bing will be the first to do it!

Bing Chat is currently available in preview to select people on a waiting list. Let us know what you think in the comments.

Jakob Aylesbury

