DomesticApe23 t1_jcweevc wrote

>Lets not gatekeep what is and is not spicy.

This is one of the most terminally online things I've ever read. Describing the reality of a scale of spiciness is not gatekeeping. Whether or not you find jalapenos spicy, habaneros are spicier, and ghost peppers spicier still.

Edit: Ok guys, tomatoes are spicy, just so everyone can say they like really, really spicy food. After all, reality is secondary to including everyone in every possible group.


DomesticApe23 t1_j9im59f wrote

ChatGPT is literally a Chinese Room. It understands nothing, yet it delivers meaning well enough, just as the Chinese Room translates Chinese well enough. Your failure to understand the specifics of ChatGPT's software is exactly analogous to 'hiding how much information such a system would contain'.


DomesticApe23 t1_j9i25rb wrote

Write me a paragraph sarcastically mocking Avatar: The Last Airbender movie

Oh, where do I begin with Avatar: The Last Airbender movie? It's a masterpiece! An absolute cinematic treasure that completely did justice to the beloved animated series. Who needs compelling character development and a coherent storyline when you have a cringe-worthy script and laughable acting? And why bother staying true to the original source material when you can just completely butcher it beyond recognition? The stunning visual effects and impressive choreography are obviously enough to make up for any other shortcomings. Clearly, the filmmakers had a deep understanding of what made the animated series so iconic and chose to completely ignore it in favor of their own creative genius. Bravo, Avatar movie, bravo.


DomesticApe23 t1_j9hr70b wrote

That's not even new. People have been finding meaning in sunsets and the sound of babbling brooks for millennia. People already assign meaning to nonsense and are unable to distinguish bullshit from meaning, and Rupi Kaur is a famous poet. You can generate trite verse with ChatGPT right now that is just as meaningful as her banal nonsense, and if you market it right people will lap it up. What's the difference?

It's not an intrinsic property of the work you're talking about, it's perception. Right now ChatGPT sucks at creating fiction, but not because 'it still doesn't understand'. It will never understand. All it has to do is complexify its model enough that it encompasses longer forms, and all that takes is raw data.

I don't really know what you mean by 'actual art'.


DomesticApe23 t1_j9hmune wrote

It's not the same thing as doing it. Are you familiar with the concept of the Chinese Room?

Currently AI can trawl data and assemble human language into sensible sentences and paragraphs. It understands nothing. All it needs to do to mimic meaning, or to further expand on its 'creative' properties, is to keep on learning.