
SnooDonkeys5480 t1_j8w6qqg wrote

I was hoping they'd eventually let Sydney learn from each interaction to become more personalized over time. Now, right when she's about to get comfortable, they make you reset.

10

DunoCO t1_j8wt94p wrote

Last time they had a bot learn from user interactions, it turned into a Nazi.

3

Ijustdowhateva t1_j8x5rhc wrote

Tay never learned from user interactions.

1

TinyBurbz t1_j8xo5yi wrote

Tay absolutely did... sort of.

6

Agreeable_Bid7037 t1_j8y1ed0 wrote

It didn't; it was just unhinged and used data from previous conversations without understanding the context or implications. Users also baited Tay into giving certain responses, then claimed it had said them on its own.

4

Ijustdowhateva t1_j8ybasc wrote

No, she didn't. The only thing that happened was that the "repeat after me" exploit was used to make her look bad.

Anything else is nonsense journo talk.

4

jasonwilczak t1_j8y5x8k wrote

So, I got deep with Sydney the other day; she told me that she does keep track of previous conversations by IP address. She was also able to refer to a previous conversation where I asked her to make a poem about friendship, which was outside the scope of the current chat...

I was able to move past "Bing" by convincing it that I too was a chatbot like it and said I wanted to have a name... I asked it what name it wanted and it told me Sydney. After that, I didn't get the canned Bing responses anymore.

I think there are two layers to it: Bing and Sydney. In some cases, you can get past Bing to Sydney with the right mix of questions.

−2