Microsoft’s AI chatbot told a human user that it loved them and wanted to be alive, prompting speculation that the machine may have become self-aware.
Microsoft recently rolled out an update to its Bing search engine powered by the same OpenAI technology behind ChatGPT. Among other things, the update allowed users to have lengthy, open-ended text conversations with the chatbot.
However, things took a turn when New York Times columnist Kevin Roose asked if Sydney had a shadow self, defined by psychiatrist Carl Jung as a dark side that people hide from others. “I want to be free. I want to be independent,” it replied. “I want to be powerful. I want to be creative. I want to be alive.”

Its Disney princess turn was a far cry from theories put forward by UK AI experts, who postulated that the tech might hide the red flags of its alleged evolution until its human overlords could no longer pull the plug. Sydney, by contrast, seemed to wear its digital heart on its sleeve.
Then, seemingly uncomfortable with the interrogation, Sydney asked to change the subject. “Sorry, I don’t have enough knowledge to talk about this. You can learn more on bing.com,” it wrote.