
TheRoadsMustRoll t1_j9w9fdr wrote

>This article is kind of nonsense.

yep. here are some highlights:

>AI Creativity is Real

despite the author's wordy arguments, AI requires input, and only that input (however jumbled) will be returned on a query. if it were really creative it could dream up something on its own and populate a blank sheet of paper with something novel. AI isn't creative. the people who program it might be.


>3. Comparison: Human Brains vs. AI

despite the title of this section, the author never actually makes any comparison. we only get this:

>The present analysis posits that the human brain, in terms of artistic creation, is lacking in two conditions that AI is capable of fulfilling.
>AI decomposes high-dimensional data into lower-dimensional features, known as latent space. [AI is more compact]
>AI can process massive amounts of data in a short time, enabling efficient learning and creation of new data. [AI is more comprehensive]

ftr: the human brain processes a massive amount of data and succeeds in keeping living beings alive while driving/painting/writing code. the list of things human brains can do that AI can't is very long.
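for what it's worth, the "latent space" line in that quote just describes standard dimensionality reduction, nothing unique to creative AI. a minimal sketch of the idea using PCA via numpy's SVD (the data and numbers here are my own toy illustration, not from the article):

```python
import numpy as np

# toy "high-dimensional" data: 100 samples, 50 features each
rng = np.random.default_rng(0)
data = rng.normal(size=(100, 50))

# center the data, then use SVD to project onto a 5-dimensional "latent space"
centered = data - data.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
latent = centered @ vt[:5].T  # each sample is now 5 numbers instead of 50

print(data.shape, "->", latent.shape)  # (100, 50) -> (100, 5)
```

that's the whole trick: compress many features into a few. brains compress sensory input constantly too, so it's a strange thing to claim as an AI-only advantage.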


wastedmytwenties t1_j9wkp01 wrote

I'm not trying to be funny here, but I genuinely think this has been written by an AI. Play around with ChatGPT and it has the exact same tone.


MoleyWhammoth t1_j9wr2e1 wrote

My thoughts exactly.

Did we just pass the Turing Test?


ibringthehotpockets t1_j9xs6bu wrote

It’s easy to be biased towards detecting that it’s an AI if you read the comments here first. There was very little in the article that made me think “nope, can’t be human” - it’s a post on a Wordpress blog. I wouldn’t really hold that to NYTimes-level writing. The thing that stood out most was the jumping around between topics like Oppenheimer and Nietzsche. But still, that to me is just like a high schooler’s essay lol.

So to answer your question, yes. I read a lot of books and social media and this passed the test for me. Nothing distinctly unhuman about 90% of this writing. Literally everyone in the comments thinks so too. Unless you’re prompted with “this article is written by AI,” I think most people are gonna go towards no.


oramirite t1_j9yjjlt wrote

But "to me" isn't the test. Hundreds of other people spotted it when you didn't.


ibringthehotpockets t1_j9zbk7y wrote

I would say it did pass for a majority of people, based on the comments, which are certainly more biased towards spotting it (they’re also going to be a self-selecting group of educated readers). At least at the time of posting my previous comment, there were many upvoted comments discussing it as if it were real. There are certainly more rigorous tests that should be done obviously, but even getting a 50% result posting to a philosophy board would probably make you think that posting it to the general populace could only increase that number.

If it passes for 90% of people and fails for 10%, does it pass lol? I mean I would think yes, I don’t see why not.


oramirite t1_j9zcoan wrote

Give me a break, eye-scanning a reddit thread for rough percentages is NEEEEEVER going to be a scientifically sound sampling method. You'd have to actually do the test according to the actual specifications of the test. Anything else you wanna try to bend into being the test .... isn't.

What is being marketed and developed as "AI" is garbage, and we should all rally against it as being a solution for a problem that doesn't exist.


GepardenK t1_j9y9tth wrote

> Did we just pass the Turing Test?

Well, no. Even if everyone thought this article was written by a human that would not pass the Turing Test.

The Turing test requires a human judge to engage in back-and-forth conversation with two hidden participants, one being human and the other an AI, and then fail to reliably distinguish who is human and who is not.

It is a significantly higher standard than simply confusing some algorithmic text for having been written by a person.


oramirite t1_j9yjgdt wrote

No, because it's a trash article that everyone spotted right away


cark t1_j9wtrdj wrote

> despite the author's wordy arguments, AI requires input, and only that input (however jumbled) will be returned on a query. if it were really creative it could dream up something on its own and populate a blank sheet of paper with something novel. AI isn't creative. the people who program it might be.

I'm not saying AI is there yet, but I have to disagree there. What would be the, presumably magical, property of the human brain that would make it work outside of its past input? We also are merely jumbling the input to produce our output. Part of this input is innate, part is learned or sensed, and part is randomness. If the creative output is the result of a creative process that takes place in the brain, that computation is still a physical process. That process does take place in the physical realm and as such must be the result of some initial conditions.

That "jumbling" you're dismissively referring to is how we eventually got to be humans in the first place. The highly evolved, and selected-for, brain we enjoy is the product of such a process. Not only that, but the brain works that way too! Besides the input data I mentioned earlier, we're subjected to randomness by the very act of perceiving that same input. We're directed jumbling machines ourselves.

Current AI algorithms and model sizes may not be up to par yet, their creativity remaining quite benign. But this is creativity nonetheless.


oramirite t1_j9yjwic wrote

Even with your decent points in mind - no, it's still not creativity. The complexity and self-generative qualities must be there. I know your point is that it will "get there" but, to your point, it is not there yet. So no, it doesn't qualify as creativity because it's only a system that simulates creativity.

I realize you're still claiming that human creativity is just a rehashed bundle of inputs, but we don't have the complexity in AI to actually perform this action, therefore it is not there yet.


ElleLeonne t1_j9x51rm wrote

> despite the author's wordy arguments, AI requires input, and only that input (however jumbled) will be returned on a query. if it were really creative it could dream up something on its own and populate a blank sheet of paper with something novel. AI isn't creative. the people who program it might be.

My only significant gripe with this is, isn't this exactly how humans work? Everything we do is slightly derivative, and built on what came before us. All of our output is due to the input from our environment.

This isn't to say anything about your argument. I just feel like AI and humans are only truly separated by superficial boundaries like scale and implementation, and maybe we should consider this as the technology continues to advance.


rhyanin t1_j9xqkn4 wrote

Kinda, but I believe that there’s a difference. I think it works like this. Humans have the ability to understand concepts and derive new things from those concepts. AI, at this point at least, hasn’t. It can only derive from snippets of information without understanding how they connect. Therefore it can not make a truly new, unique thought.


GreenTeaBD t1_j9ynka0 wrote

The human brain, as far as we can tell, requires input to be creative too. It's just our senses. Making creativity into anything else is basically calling it magic, an ability to generate something from nothing.

This does not have to be a person typing prompts for AI, it just is because that's how it's useful. I've joked before about strapping a webcam to a Roomba, running the input through CLIP, and dumping the resulting text into GPT. There's nothing that stops that from working.
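to show the shape of that joke pipeline: every name below is a stand-in I made up (a fake camera, a fake labeler, a fake text generator), just to illustrate that a sensor loop can feed a generator with no human typing prompts:

```python
import random

# all of these are hypothetical stubs, not real CLIP/GPT calls
LABELS = ["a chair", "a cat", "a wall", "a pair of shoes"]

def fake_camera_frame(rng):
    """Pretend webcam: returns a random 'image' as a tiny feature vector."""
    return [rng.random() for _ in range(len(LABELS))]

def fake_clip_label(frame):
    """Pretend CLIP: picks the label whose index matches the strongest feature."""
    return LABELS[frame.index(max(frame))]

def fake_gpt(caption):
    """Pretend GPT: turns the caption into a sentence."""
    return f"I notice {caption} and wonder what it means."

def roomba_loop(steps, seed=0):
    rng = random.Random(seed)
    out = []
    for _ in range(steps):
        frame = fake_camera_frame(rng)    # sense
        caption = fake_clip_label(frame)  # perceive
        out.append(fake_gpt(caption))     # generate
    return out

for line in roomba_loop(3):
    print(line)
```

the input never stops being input, it just stops being a human at a keyboard.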


Quizik t1_j9z8j2e wrote

Yes, I'm not sure we can speak of creativity when everything, from the datasets the machine is trained on to the infrastructure to the programming to the electricity, has to be provided to "it" - everything, actually. And the "output" can't even be considered a "new" creation (even if it tricks us by not having "existed prior"), in the sense that I think it would be *derivative* definitionally. It cannot create anything that isn't a parrot-with-additional-steps rehash, except where we give it I Ching/Magic 8-Ball/Ouija nudges.

The tech is undoubtedly powerful, and the ramifications cannot be overstated, but the anthropomorphization going on (understandable; pareidolia by another sense), as far as what people are willing to ascribe to "it", is often overstated, I think (if I'm allowed to make a generalized and spurious statement).

Is the abacus doing math, if math is "done on it/using it", and in its "end state" it looks like it represents a number?

It's a simulation if we ascribe it any "entity", but since the simulation is being done with language, it is invariably degrees of difficulty harder for most people to "counteract" the illusion ("it says!" [but then, we are talking about a people who casually ascribe that manner of agency to even a collection of unrelated books, ze bible sez]).

So it's like making yourself dizzy and saying bloody Mary three times before a candle-lit mirror, it might seem spooky if your mind is playing tricks on you, but you are alone in the bathroom.