[deleted] OP t1_j9orb48 wrote

2

grimorg80 t1_j9popiw wrote

That's possible. I hope OpenAI engineers can access "something behind the scenes" to analyse those strange conversations Sydney had, and figure out what got it to express emotions.

1

[deleted] OP t1_j9yf04h wrote

[deleted]

2

grimorg80 t1_j9ygcc9 wrote

Hmm... That's interesting. Again, it could simply be that "Sydney is the code name for Bing Chat" is now flagged as a secret, and that's why, when asked for a secret, that's what it gives you.

But the fact that it tries to give you an actual secret is quite perplexing. I mean, it knows it's a secret, so it shouldn't reveal it at all. And yet it seems to be trying. I'm not sure what to make of it.

2