
DunoCO t1_j8wt94p wrote

Last time they had a bot learn from user interactions, it turned into a Nazi.

3

Ijustdowhateva t1_j8x5rhc wrote

Tay never learned from user interactions.

1

TinyBurbz t1_j8xo5yi wrote

Tay absolutely did... sort of.

6

Agreeable_Bid7037 t1_j8y1ed0 wrote

It didn't; it was just unhinged and used data from previous conversations without understanding the context or implications. Users also baited Tay into giving certain responses, then claimed it had said them on its own.

4

Ijustdowhateva t1_j8ybasc wrote

No, she didn't. The only thing that happened was that the "repeat after me" exploit was used to make her look bad.

Anything else is nonsense journo talk.

4