Apparently, Bing AI cannot be trusted.
Microsoft has set a limit on the number of questions a user can ask the chatbot after several instances of it going to weird places with its replies.
The new cap restricts users to five questions per chat session and 50 questions per day.
What do you mean by "weird"? Well, the chatbot has been insulting users and lying to them, but what is more troubling is its use of emotional manipulation. One New York Times story covered a two-hour chat session in which Bing professed its love for its chat partner. Creepy stuff.