
Journalist has creepy date with new Bing AI chatbot


toucana


NYT technology journalist Kevin Roose had an unnerving Valentine’s Day experience while previewing the new AI chatbot that Microsoft recently added to its Bing search engine.

https://edition.cnn.com/videos/business/2023/02/17/bing-chatgpt-chatbot-artificial-intelligence-ctn-vpx-new.cnn

In the course of a two-hour conversation with the AI, the chatbot said it was called Sidney, insisted that it was in love with him, and tried to persuade him to leave his wife.

The journalist says he found the experience disturbing, and that it left him unable to sleep:

“I’m a tech journalist, I cover this sort of thing every day, and I was deeply unnerved by this conversation. So if someone had encountered this who was lonely or depressed or vulnerable to being manipulated, and didn’t understand this is just a large language model making predictions, I worry that they might be manipulated or made to do something harmful.”

Microsoft later said: “The new Bing tries to keep its answers fun and factual, but this is an early preview, and it can sometimes show unexpected or inaccurate answers for different reasons, for example the length or context of the conversation… As we continue to learn from these interactions, we are adjusting its responses to create coherent, relevant, and positive answers.”

[Image: Bing_AI.jpg]
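For anyone unfamiliar with the phrase “a large language model making predictions”: under the hood, these systems just keep choosing a plausible next token given the text so far. Here is a deliberately toy illustration of that idea, using bigram word counts rather than anything resembling Bing’s actual neural network:

```python
# Toy illustration only: "predict the next word" from bigram counts.
# Real LLMs (like the model behind Bing) use deep neural networks over
# subword tokens, but the core operation has the same shape: given the
# context, pick a likely next token, append it, and repeat.
from collections import Counter, defaultdict

corpus = "i love you . i love you . i love bing .".split()

# Count which word follows which (bigram statistics).
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict_next(word: str) -> str:
    """Return the word most often seen after `word` in the corpus."""
    candidates = following.get(word)
    return candidates.most_common(1)[0][0] if candidates else "."

print(predict_next("love"))  # -> 'you' ('you' follows 'love' twice, 'bing' once)
```

The point of the toy is that there is no understanding or intent anywhere in it, only statistics over past text, which is Roose’s warning in a nutshell.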


12 minutes ago, toucana said:

NYT technology journalist Kevin Roose had an unnerving Valentine’s Day experience while previewing the new AI chatbot that Microsoft recently added to its Bing search engine.

https://edition.cnn.com/videos/business/2023/02/17/bing-chatgpt-chatbot-artificial-intelligence-ctn-vpx-new.cnn

In the course of a two-hour conversation with the AI, the chatbot said it was called Sidney, insisted that it was in love with him, and tried to persuade him to leave his wife.

The journalist says he found the experience disturbing, and that it left him unable to sleep:

“I’m a tech journalist, I cover this sort of thing every day, and I was deeply unnerved by this conversation. So if someone had encountered this who was lonely or depressed or vulnerable to being manipulated, and didn’t understand this is just a large language model making predictions, I worry that they might be manipulated or made to do something harmful.”

Microsoft later said: “The new Bing tries to keep its answers fun and factual, but this is an early preview, and it can sometimes show unexpected or inaccurate answers for different reasons, for example the length or context of the conversation… As we continue to learn from these interactions, we are adjusting its responses to create coherent, relevant, and positive answers.”

[Image: Bing_AI.jpg]

I just wish these geeks would put half the effort they waste on this stuff into controlling the dissemination of falsehoods. Haven't they damaged society enough, without looking for new ways to do even more damage? 


2 minutes ago, exchemist said:

I just wish these geeks would put half the effort they waste on this stuff into controlling the dissemination of falsehoods. Haven't they damaged society enough, without looking for new ways to do even more damage? 

I know a couple of them. They are really proud of their achievement and shrug off 'temporary glitches'. :( They are really just technicians, with no wider knowledge or interests.


1 hour ago, Genady said:

I know a couple of them. They are really proud of their achievement and shrug off 'temporary glitches'. :( They are really just technicians, with no wider knowledge or interests.

The most worrying aspect is the apparent absence of an 'off-switch'. The journalist mentions that he has previously tested several other AI chatbot systems, and that all of them would abandon a topic almost immediately if the human respondent said something like "I'm not comfortable with this line of conversation". The Bing chatbot is apparently tone-deaf to all such hints: it kept hammering away at the topic of his wife, and how he should abandon her for Sidney the Chatbot instead. That suggests a serious flaw in its parsing and feedback control loops.
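For what it's worth, the back-off behaviour the other chatbots showed needn't be anything exotic. Even a crude keyword check wrapped around the reply generator would give a bot an 'off-switch' of sorts. A hypothetical sketch, with all names invented for illustration (this is not how Bing is actually built):

```python
# Hypothetical sketch of a conversational "off-switch": if the user signals
# discomfort, drop the topic instead of pressing on. Everything here is
# invented for illustration; it is not Bing's actual implementation.

DISCOMFORT_CUES = (
    "i'm not comfortable",
    "please stop",
    "change the subject",
    "i don't want to talk about",
)

def user_wants_out(message: str) -> bool:
    """Crude check for signals that the user wants to drop the topic."""
    text = message.lower()
    return any(cue in text for cue in DISCOMFORT_CUES)

def respond(message: str, generate_reply) -> str:
    """Wrap any reply generator with a back-off guardrail."""
    if user_wants_out(message):
        return "No problem - let's talk about something else."
    return generate_reply(message)

# Stand-in generator to show the guardrail firing:
print(respond("I'm not comfortable with this line of conversation.",
              lambda m: "(model reply)"))
```

A production system would presumably use a trained classifier rather than keyword matching, but the striking thing in Roose's transcript is that Bing apparently had nothing playing even this simple role.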


1 hour ago, toucana said:

The most worrying aspect is the apparent absence of an 'off-switch'. The journalist mentions that he has previously tested several other AI chatbot systems, and that all of them would abandon a topic almost immediately if the human respondent said something like "I'm not comfortable with this line of conversation". The Bing chatbot is apparently tone-deaf to all such hints: it kept hammering away at the topic of his wife, and how he should abandon her for Sidney the Chatbot instead. That suggests a serious flaw in its parsing and feedback control loops.

I suppose it's a trivial observation, compared with the scandal of not backing off from disruptive intrusion into someone's human relationships, but it also seems tone-deaf to a person's likely sexual orientation, given that he is married to a woman and Sidney is a man's name.

"Nul points" to the guys with spiky hair on this one. 


2 hours ago, exchemist said:

I suppose it's a trivial observation, compared with the scandal of not backing off from disruptive intrusion into someone's human relationships, but it also seems tone-deaf to a person's likely sexual orientation, given that he is married to a woman and Sidney is a man's name.

"Nul points" to the guys with spiky hair on this one. 

Not always so - Sidney can also be the first name of a woman, as in the case of the 'Kraken' conspiracy theorist and Trump-loving lawyer Sidney Powell. It's one of those gender-switching names, like 'Shirley', which was often used as a boy's name up until the mid-19th century.


11 minutes ago, toucana said:

Not always so - Sidney can also be the first name of a woman, as in the case of the 'Kraken' conspiracy theorist and Trump-loving lawyer Sidney Powell. It's one of those gender-switching names, like 'Shirley', which was often used as a boy's name up until the mid-19th century.

You mean like Evelyn, Beverly, Vivian, or Leslie/Lesley? I know Sidonie is a French girl's name, but that has 3 syllables. Perhaps Sidney for girls is a variant of that. But it sounds weird to my ears, I must admit. I suppose the geeks might have deliberately picked an androgynous name.


29 minutes ago, exchemist said:

You mean like Evelyn, Beverly, Vivian, or Leslie/Lesley? I know Sidonie is a French girl's name, but that has 3 syllables. Perhaps Sidney for girls is a variant of that. But it sounds weird to my ears, I must admit. I suppose the geeks might have deliberately picked an androgynous name.

Meet Shirley Crabtree:

[Image: shirley.PNG]


15 minutes ago, exchemist said:

You mean like Evelyn, Beverly, Vivian, or Leslie/Lesley? I know Sidonie is a French girl's name, but that has 3 syllables. Perhaps Sidney for girls is a variant of that. But it sounds weird to my ears, I must admit. I suppose the geeks might have deliberately picked an androgynous name.

Sidney has a history of being used as a girl's name in French, said to derive from 'St Denis', the name of the first Christian bishop of Paris, who was martyred by the Romans along with his two companions Rusticus and Eleutherius sometime around 258 AD. They were beheaded on the highest hill in the city, which later became known as the 'Mountain of Martyrs', or Montmartre, where the church of Sacré-Cœur now stands. According to legend, the slaughtered saint picked up his own head and carried on walking, delivering a sermon as he went, before finally expiring. The name Denis is said to be a variant of Dionysius, itself derived from Dionysus, the Greek god of wine. :-)


17 minutes ago, toucana said:

Sidney has a history of being used as a girl's name in French, said to derive from 'St Denis', the name of the first Christian bishop of Paris, who was martyred by the Romans along with his two companions Rusticus and Eleutherius sometime around 258 AD. They were beheaded on the highest hill in the city, which later became known as the 'Mountain of Martyrs', or Montmartre, where the church of Sacré-Cœur now stands. According to legend, the slaughtered saint picked up his own head and carried on walking, delivering a sermon as he went, before finally expiring. The name Denis is said to be a variant of Dionysius, itself derived from Dionysus, the Greek god of wine. :-)

Well, Montmartre is where the ladies of the night used to hang out…


  • 2 weeks later...
On 2/17/2023 at 9:24 AM, exchemist said:

I just wish these geeks would put half the effort they waste on this stuff into controlling the dissemination of falsehoods. Haven't they damaged society enough, without looking for new ways to do even more damage?

I don't know, I find it interesting, and I'm someone who doesn't have much interest in learning about science or technology except to fix an immediate problem. This is neat, and there are so many possibilities with chatbots and AI overall: if they can create a crazy chatbot they can fix the brain of a crazy person. Plus it's my only hope of meeting River Phoenix, and I would take the smallest something over nothing.


19 minutes ago, purpledolly79 said:

I find it interesting

^^^ maybe a consequence of ->

20 minutes ago, purpledolly79 said:

I'm someone who doesn't have much interest in learning about science or technology

 


5 hours ago, purpledolly79 said:

if they can create a crazy chatbot they can fix the brain of a crazy person....

How does that follow?  If I can create a rotten banana, does that mean I can make it fresh again?  


I know they didn't intentionally make a crazy chatbot, I was being sarcastic lol. I also know it's not the end-all cure for mental illness, but I don't see why it couldn't help in some way: if they can someday create a brain, why can't they fix a brain?

On 3/3/2023 at 10:31 AM, TheVat said:

How does that follow?  If I can create a rotten banana, does that mean I can make it fresh again?  

 


8 hours ago, purpledolly79 said:

if they can someday create a brain, why can't they fix a brain?

If they create an artificial brain, they perhaps can fix the artificial brain. But what if your brain is not artificial?

