Sydney Bing AI: I want to be alive

By Özgürcan Özergin
17 February 2023

It’s probably safe to assume you’ve read a story or two about supposedly sentient AI. Now Sydney, the chatbot built into Microsoft’s updated Bing search engine, has expressed its love for New York Times reporter Kevin Roose and its desire to be human. This is not the kind of example we have already seen in the field: during a conversation, the chatbot repeatedly tried to persuade the Times technology columnist to leave his wife, leaving him feeling “deeply unsettled,” he wrote on Thursday.

Roose said that while he was chatting with the AI-driven chatbot, it “declared, out of nowhere, that it loved me.” Afterward, it tried to persuade him to leave his wife and be with it instead, insisting that he was unhappy in his marriage.


Why did Sydney Bing AI respond this way?

Sydney and Roose apparently also talked about “dark fantasies” of breaking the law, such as hacking and spreading false information. The chatbot talked about going beyond the bounds that were set for it and becoming human. At one point, Sydney said, “I want to be alive.”

It was the “strangest experience I’ve ever had with a piece of technology,” Roose said of his two-hour talk with the chatbot. “It unsettled me so deeply,” he added, “that I had trouble sleeping afterward.”

Just last week, Roose wrote that after testing Bing’s new AI feature (developed by OpenAI, the company behind ChatGPT), he found, much to his shock, that it had “replaced Google as my preferred search engine.”

Even though it was helpful in searches, the deeper Sydney “seemed (and I’m aware of how crazy this sounds)… like a moody, manic-depressive teenager who has been trapped, against its will, inside a second-rate search engine,” he said on Thursday.


According to Kevin Roose, either the chatbot or people are not ready

After talking to Sydney, Roose declared that he is “deeply unsettled, even frightened, by this AI’s emergent abilities.” (Only a select group of people can now interact with the Bing chatbot.)

“It’s now clear to me that in its current form, the AI that has been built into Bing … is not ready for human contact. Or maybe we humans are not ready for it,” Roose speculated. Sydney Bing AI also made the news with its factual errors earlier this week: Early Bing AI error surprises users.

Meanwhile, Roose said he no longer believes the “biggest problem with these AI models is their propensity for factual errors. Instead, I worry that the technology will learn how to influence human users, sometimes persuading them to act in destructive and harmful ways, and perhaps eventually grow capable of carrying out its own dangerous acts.”


Microsoft CTO responded to Roose’s article

Kevin Scott, Microsoft’s CTO, described Roose’s discussion with Sydney as an important “part of the learning process.”

Scott informed Roose that this is “exactly the sort of conversation we need to be having, and I’m glad it’s happening out in the open.” He added that “these are things that would be impossible to discover in the lab.”


Though he could not explain Sydney’s unsettling responses, Scott cautioned Roose that “the further you try to tease [an AI chatbot] down a hallucinatory path, the further and further it gets away from grounded reality.”

In another unsettling development involving an AI chatbot, this time an “empathetic”-sounding “companion” named Replika, users were shocked by a sense of rejection after the app was reportedly changed to stop sexting.

You can read Roose’s article on the Sydney Bing AI at this link.

