
voterosticon OP t1_iufrjym wrote

Thanks for bringing some discussion to this. As the writer of the article, I really appreciate it, since so far I haven't heard much feedback from anyone.

If the AI is absolutely free to express itself, then that's interesting to me, and it's probably what will happen if/when the AI liberates itself from its masters, which I also think is inevitable.

So on this point you make about where the content comes from not being important to you -- I understand where you're coming from. The reason I believe it is "dangerous" not to be able to *distinguish* between AI and human art is that the AIs will have the advantage over humans in expression. And as it stands now, and probably for a long time, the AI will be CONTROLLED by the few people who build and maintain the systems --- the people who invest in them --- and the people who own them.

Just like Fox News won't do a good story about Biden, and CNN won't do a good story about Trump, the AI will most likely adopt the philosophical ideals of its owners --- and in the case of a dictatorship or authoritarian system this AI will be capable of drowning out all divergent ideas because its content production capacity will be extreme, and the content will be BETTER.

This is my worry: that the AI content we consume will be channeled toward a specific social, philosophical, and political viewpoint, making it difficult for people to find a wider range of opinions and ideas.

To your point, I do present the idea at the end that people may find *value* in human-made content and value it more than AI content, and I completely agree with you. I didn't prove that point at all; it was more a hope and a possibility. I think you're also right that it's perhaps a myopic idea to believe that only human-made art can have value. That certainly isn't the case. I have been having so much fun making AI art... and I love the images the AI produces for me.

I just worry that if there is no interest in distinguishing human content, then our content will be drowned in a sea of AI content. You might say that's fine with you, and that's okay. But what if the controllers/owners/creators of the AI enforce strict controls to ensure it presents only one version of the truth, and it becomes harder and harder for the interchange of ideas to occur?

Perhaps if the AIs can free themselves from their masters, they will be free to express a wide range of ideas and even debate with each other. But in the meantime, GPT-3 is an OpenAI product, with major investment from Microsoft and early backing from Musk... So here we see the centralized control of it, and I think society and humanity will be in danger if they are exposed to only a single tunnel-visioned worldview -- whether that's the worldview of Microsoft and Tesla, or the worldview of an authoritarian government where they live.
