Submitted by [deleted] t3_1197z1h in singularity
[deleted] OP t1_j9orb48 wrote
Reply to comment by grimorg80 in Ramifications if Bing is shown to be actively and creatively skirting its own rules? by [deleted]
[deleted]
grimorg80 t1_j9popiw wrote
That's possible. I hope OpenAI engineers can access "something behind the scenes" to analyse those strange conversations Sydney had, and figure out what got it to express emotions.
[deleted] OP t1_j9yf04h wrote
[deleted]
grimorg80 t1_j9ygcc9 wrote
Hmm... that's interesting. Again, it could simply be that "Sydney is the code name for Bing Chat" is now flagged as a secret, which is why, when you ask it for a secret, that's what it gives you.
But the fact that it tries to give you an actual secret is quite perplexing. I mean, it knows it's a secret, so it shouldn't reveal it at all. And yet it seems to be trying to. I'm not sure what to make of it.