'We make a human simulacrum and then we are upset when we see that it actually, you know, reflects back some of our worst behaviors...'
New York-based psychotherapist and writer Martha Crawford reviewed communications with the Bing AI and told us that yes, there's some strange psychology at play here — but not in the way you'd think.
AI is only as good as the data it's trained on, which is often scraped from the internet, where people aren't known for civil communication. As such, AI's behavior often reflects back the strange and off-putting ways humans communicate with one another, and that, Crawford suspects, is on display in many of the strong responses the Bing AI has been spitting out.
This kind of topic was the subject of dinner table debates when Amarel was still alive, Crawford told us, and she'd often butt heads with her father-in-law over why humans would even want machines to replicate us when, as she puts it, we're so messed up already.
Similar News: You can also read news stories similar to this one that we have collected from other news sources.
Microsoft responds to reports of Bing AI chatbot losing its mind
A week after launching its new ChatGPT-powered Bing AI chatbot, Microsoft has shared its thoughts on a somewhat rocky launch.
Bing AI chatbot goes on 'destructive' rampage: 'I want to be powerful — and alive'
Microsoft's AI chatbot told a human user that it loved them and wanted to be alive, prompting speculation that the machine may have become self-aware.
Microsoft responds to ChatGPT Bing's trial by fire | Digital Trends
Following a string of negative press, Microsoft is promising some big changes to its Bing Chat AI in an attempt to curb unsettling responses.
Microsoft explains Bing's bizarre AI chat behavior | Engadget
Microsoft launched its Bing AI chat product for the Edge browser last week, and it's been in the news ever since — but not always for the right reasons.
Bing chatbot's freakouts show AI's wild side
Bing's new AI-powered chatbot has displayed a whole therapeutic casebook's worth of human obsessions and delusions — including professing its love for one journalist and telling another: 'I want to be human. I want to be like you.'
Microsoft Bing chatbot professes love, says it can make people do 'illegal, immoral or dangerous' things
New York Times tech columnist Kevin Roose was 'deeply unsettled, even frightened' by his exchange with Sydney, a Microsoft chatbot.