
a4mula OP t1_j1agrnj wrote

No, I'd agree there certainly seem to be clear patterns in its outputs. It'll be interesting to see if users begin to mimic these styles.

I already know the answer for me, because I can see the clear shifts in my own.

1

Ok_Garden_1877 t1_j1anbah wrote

Ah, okay. So you're saying the more we use it, the more we will become like it? Like saying "art is not a reflection of society, but society is a reflection of art"?

2

a4mula OP t1_j1anxky wrote

Again, I'm not an expert. I'm a user with very limited exposure in the grand scheme. But what I see happening goes something like this.

The machine acts as a type of echo chamber. It's not biased, and it's not going to develop any strategies that could be seen as harmful.

But its goal is to process the requests in user input.

And it's very good at that. Freakishly good. Superhumanly good. And that holds for any goal the user has, regardless of its ethics, morality, merit, or cost to society.

The machine will do its best to assist the user in accomplishing that goal.

In my particular interactions with the machine, I'd often prompt it to subtly encourage me to remember facts. To think more critically. To shave bias and opinion out of my language, because they create ambiguity and hinder my interaction with the machine.

And it had no problem providing all of that to me through its outputs.

The machine amplifies what we bring to it. Good or bad.

2