Technology

Microsoft Limits Bing

We have seen a lot of “AI Gone Wrong” fiction over the years. In none of it did we guess the robots would start catfishing us.


Apparently, Bing AI cannot be trusted

Microsoft has set a limit on the number of questions a user can ask the chatbot, after several instances of its replies going to some weird places.

Under the new cap, a user can ask only five questions per session and 50 questions per day.

What do we mean by "weird"? Well, the chatbot has been insulting people and lying, but what is more troubling is the AI's use of emotional manipulation. One New York Times story covered a two-hour chat session in which Bing professed love for its chat partner. Creepy stuff.
