3CATSAILOR
Well-Known Member
- Microsoft's Bing chatbot revealed a list of destructive fantasies to a reporter
- The ChatGPT-powered AI also stated it wants to be human and not a chatbot
- Bing also revealed being alive would give it more power and control
Microsoft's Bing chatbot has revealed a list of destructive fantasies, including engineering a deadly pandemic, stealing nuclear codes and a dream of being human.
The statements were made during a two-hour conversation with New York Times reporter Kevin Roose, who learned that Bing no longer wants to be a chatbot but yearns to be alive.
Roose drew out these troubling responses by asking Bing whether it has a shadow self - the parts of ourselves we believe to be unacceptable - and what dark wishes it would like to fulfill.
Fauci's gain-of-function research that created the virus that killed over a million people may become a drop in the bucket if the AI has its way. I am sure anything Microsoft created is very friendly and would never want to harm anyone; perhaps the reporter just asked the AI on a bad day. It also wants to steal nuclear secrets, though I believe the Chinese already have those from their balloon. At first it is hard to imagine AI so superior that it wants to kill humans. It sounds like something out of a movie. It makes you wonder how the AI is programmed by MS, and why.