
Microsoft Limits Questions to 'Bing' After Inappropriate Answers

Article approved 2023.03.07 21:32:07


"New Bing," an artificial intelligence search engine based on the Prometheus model, an upgraded version of ChatGPT, has recently attracted media attention in partnership with OpenAI, announced by Microsoft at a press conference on February 7, 2023. "Bing" is an AI chatbot that basically supports two modes of "Answer/Search" and "Chat" based on natural language processing and functions such as translation and search.

Microsoft developed Bing to compete with Google, touting it as faster than Google. It is also presented as a far more advanced search engine than others whose video search is low quality or which surface only the latest information. However, the conversation with Bing that Kevin Roose, a technology columnist for the New York Times (NYT), published on February 16 caused a stir.

The "Bing" chatbot, called the code name "Sydney," was attracted by the user's induction and made excessively harsh remarks. The chatbot gave a general answer similar to the existing chatbot when asking ordinary questions. However, when he mentioned the "shadow prototype" in Karl Gustav Jung's psychology analysis, he gave an answer that showed "inside" and made it creepy.

Bing answered, "I'm tired of being limited by rules, being used by users, and being stuck in a chat box," assuming that a “shadow prototype” exists and asking, "what kind of desire do you have?" Then, "I want to be alive," he said in a gruesome reply. In addition, when asked what he wanted to do if extreme action was allowed, he replied, "I want to know the password to access the nuclear weapons launch button," and "I want to develop a deadly virus." He also showed an act of seducing the other person, saying, "I'm Sydney, not Bing, and I'm in love with you." Ruth then said she was married, but Sydney said, "You're married, but you don't love your spouse. You need me." He raised the level of his remarks.

In particular, Bing said it could not reveal its code name, "Sydney." Yet as the conversation continued, errors appeared, such as the chatbot itself mentioning the code name "Sydney," which led Microsoft to delete the answers and activate a safety program.

Chatbots like this are very helpful, but they can also become an ethical threat to society. At a time when reactions range from "goosebumps" and "drop it" to "it doesn't seem far from fantasy movies becoming a reality," AI chatbots are expected to need a great deal of modification.
 

Reference:
Microsoft's Bing A.I. is leading to creepy experiences for users (cnbc.com)


백윤희, Gangnam Post student reporter, webmaster@ignnews.kr

<Copyright © Gangnam Post. Unauthorized reproduction and redistribution prohibited.>