Submitted by Tripwir62 t3_124veww in Futurology

Me: I'd like to tell you a joke.

GPT: Sure, I'd love to hear a joke! What's your joke?

Me: A man walks into a bar and the bartender says "I'm sorry, we don't allow dogs in the bar." The man says "it's a seeing eye dog." The bartender says "they gave you a chihuahua?" The man says "they gave me a chihuahua??"

GPT: Haha, that's a funny one! Thanks for sharing it with me!

Me: Why do you think that's funny?

GPT: As an AI language model, I don't really have personal feelings or emotions, including the ability to find things funny or not. However, I can understand that the joke is funny because it relies on the unexpected punchline. The bartender's question seems absurd, given that a seeing-eye dog is usually a larger breed. The twist at the end, where the man realizes that he's been given a chihuahua instead of a more suitable breed, is unexpected and humorous.

Me: Why do you think the man didn't know what kind of dog he had?

GPT: It's possible that the man in the joke knew exactly what kind of dog he had, but he was playing along with the bartender's comment to create a humorous moment. Alternatively, the joke could be based on the idea that the man is visually impaired and was unaware of the size or breed of his dog when he received it. However, it's important to note that this is just a fictional scenario in a joke and may not reflect the reality of actual seeing-eye dog owners or their experiences.

Me: That's a frighteningly good answer.

GPT: Thank you! As an AI language model, my goal is to provide accurate, informative, and helpful responses to your questions and statements. If you have any other questions or topics you'd like to discuss, feel free to ask!

0

Comments

MindSpecter t1_je18tkg wrote

That's a remarkable example of it understanding how a joke is structured and what makes it humorous.

That's different from the AI having a sense of humor, but just being able to understand humor is remarkable in and of itself.

13

BaronOfTheVoid t1_je1a2eo wrote

Different in what way?

3

MindSpecter t1_je1lp6q wrote

"Having a sense of humor" implies that it appreciates the joke and feels the emotion of it being funny.

ChatGPT can analyze and identify jokes, even write jokes itself and tell you a joke is funny, but it is simulating a sense of humor, not actually experiencing it.

3

BaronOfTheVoid t1_je1rkg9 wrote

Well, emotions are nothing magical. Just chemical reactions that an AI can emulate.

3

Tripwir62 OP t1_je2a96u wrote

If a machine could correctly identify smells, would it have a sense of smell?

2

MindSpecter t1_je2cip8 wrote

I think a better analogy is a machine that knows what smells are produced by items in a room and therefore can deduce how the room smells.

ChatGPT knows what produces a humorous effect and can identify it, but it doesn't experience it.

1

Tripwir62 OP t1_je2injl wrote

I wasn’t making an analogy. I was asking whether a machine that could identify smells would be regarded as having a “sense of smell.” My view is that I don’t think there’s much alternative. Happy to hear other ideas.

1

MindSpecter t1_je2mqng wrote

I mean, I assume you were asking that question to make an analogy to the topic of AI and humor, but that was an assumption on my part.

To answer your question, I think it depends. How the machine identifies smells matters to the answer I would give. If it directly detects particles in the air and processes them the way a biological nose does, then I would say the machine can smell. If it deduces what a room smells like by pairing a picture of the room with datasets describing objects and how their smells are categorized, then it does not have the ability to smell.

I hope that clarifies my thoughts!

2

Tripwir62 OP t1_je2qbza wrote

Thanks. You're right. It might indeed come down to exactly what you're suggesting. But for now, the topic of smell is so unique, and so separate from cognition, that I thought it might lead to an interesting discussion on its own. For me, I can't quite understand why I should care how it does it. If a machine can identify aromas, what is stopping me from saying "it can smell"? If there's a difference in the process it uses, is that difference important? Why?

1

MindSpecter t1_je2sle1 wrote

That's an excellent question! I think it's worth clarifying since people tend to anthropomorphize AI and assume they are seeing a more human process than what is really going on.

We saw this a little with Siri and Google assistant being able to tell jokes and people saying "Siri has a great sense of humor." But really it is just running some preset jokes when asked.

With ChatGPT, the model has been trained on a huge amount of text, and the algorithm strings words together based on patterns it has recognized. It will even type out laughter and respond the way a human would. The illusion is rather convincing, but it is not processing the joke, internally reacting to it with a sense of laughter, and then expressing an emotion. It's just generating text that it predicts is appropriate for the situation, based on its training data.
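
If you want to see the difference concretely, here's a rough sketch contrasting the two. It's my own toy illustration (not anything from Apple or OpenAI), using the small open "gpt2" model from the Hugging Face transformers library as a stand-in; the real systems are vastly larger, but the core loop is the same: predict the next token, append it, repeat.

    # Toy contrast, not production code. Assumes: pip install torch transformers
    import random
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    # Siri-style: a fixed list of canned jokes, picked at random. No generation at all.
    PRESET_JOKES = [
        "Why did the scarecrow win an award? He was outstanding in his field.",
        "I told my computer I needed a break, and it froze.",
    ]

    def preset_joke() -> str:
        return random.choice(PRESET_JOKES)

    # ChatGPT-style, greatly simplified: the model only ever predicts the next
    # token given everything so far, then appends it and repeats.
    tokenizer = AutoTokenizer.from_pretrained("gpt2")
    model = AutoModelForCausalLM.from_pretrained("gpt2")

    def continue_text(prompt: str, steps: int = 25) -> str:
        ids = tokenizer(prompt, return_tensors="pt").input_ids
        with torch.no_grad():
            for _ in range(steps):
                logits = model(ids).logits[:, -1, :]                  # scores for the next token only
                next_id = torch.argmax(logits, dim=-1, keepdim=True)  # greedily pick the most likely token
                ids = torch.cat([ids, next_id], dim=-1)               # append it and go again
        return tokenizer.decode(ids[0])

    print(preset_joke())
    print(continue_text("A man walks into a bar and the bartender says"))

Neither function "gets" anything; the second one is just unusually good at predicting which words tend to follow which.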

Saying "it can smell" or "it has a sense of humor" ascribes a level of experience that the system does not possess.

1

Tripwir62 OP t1_je3gmys wrote

ChatGPT takes the middle ground on this question:

Me: Suppose a machine was able to accurately identify all aromas. Independent of how it was able to achieve that, would it be reasonable for humans to observe that the "machine can smell?"

GPT: It would depend on how one defines the term "smell." If we define "smell" as the ability to detect and identify odors, then a machine that is able to accurately identify all aromas could be considered to "smell" in a technical sense. However, if we define "smell" as a sensory experience that is unique to humans and other animals, then it would not be accurate to say that a machine can "smell" in the same way that humans can.

The ability to detect and identify odors is just one aspect of the complex sensory experience of smell that humans and animals have. Smell involves not only the detection of odor molecules but also the processing of this information by the brain to create a perception of smell that is unique to each individual. This involves the integration of multiple sensory inputs and the interpretation of this information in the context of past experiences and memories.

Therefore, while a machine may be able to identify odors with great accuracy, it cannot replicate the full sensory experience of smell that humans and animals have. So, while it may be technically accurate to say that a machine can "smell" in a limited sense, it would be important to clarify what we mean by the term and recognize the differences between a machine's ability to identify odors and the complex sensory experience of smell that humans and animals have.

1

MindSpecter t1_je3h88z wrote

That's precisely what I've been trying to say, though maybe less eloquently. My definition involves the experience element, so I follow that line of logic.

I asked ChatGPT if it has a sense of humor and it responded with: "A sense of humor refers to the ability to appreciate and find humor in situations, experiences, or verbal expressions. It can involve making others laugh, finding amusement in things that others might not, and being able to laugh at oneself.

As an artificial intelligence language model, I don't have the capacity to experience emotions or find things humorous in the way humans do. However, I am programmed to understand and generate jokes, puns, and other forms of humor that humans enjoy."

2

Alaishana t1_je6akbb wrote

In the end, you are asking "What exactly is a quale?"

And this is not answerable.

Not even... no: SPECIFICALLY, not by an AI.

1

PointyBagels t1_je1bvay wrote

I'd say a key difference is that it is largely unable to come up with new jokes; it is only good at explaining existing ones.

2

ItilityMSP t1_je1h2wr wrote

Nope, it can come up with novel jokes; just ask it for some. Most of the ones it comes up with are plays on word meanings, the kind of jokes you read as a kid.

But they are still jokes. I'll ask it to tell me a funny story next time and see how it does.

1

InflationCold3591 t1_je1qcbu wrote

It doesn't really “understand” anything. It is programmed with certain structures that allow it to recognize the rules of other structures it has not been explicitly programmed with. Humor is highly structured language, which is exactly the thing it is designed to mimic.

2

Cerulean_IsFancyBlue t1_je126y5 wrote

No.

Sorry, too short?

Both the program itself and its designers will tell you it has no sense of humor. There are some excellent white papers, podcasts, and articles written about how, because it's a language model, people will hallucinate a personality behind it. We live in a world in which anything that can produce that kind of language carries some aspect of humanity, and it's very, very hard for us not to accidentally project at least some humanity into that system.

8

HongoMushroomMan t1_je1furk wrote

It struck me, reading that 400+ page Microsoft report on GPT-4 (skimmed lightly, haha), that it's actually insanely unfair and unjust that common people won't get true access to GPT-4 or its successors.

Real AGI or not, I want to revel in the experience, but I won't be allowed to because I don't have and won't ever have access to this new thing.

"ChatGPT" is now an absolute dinosaur compared to GPT-4. The comparison is even hard to make with out much more profound GPT-4 can understand and translate things into other ideas compared to GPT-3-based ChatGPT

2

comradelucyford t1_je4e1e3 wrote

It is my understanding that Bing Chat uses a customized version of GPT-4. It has obvious limitations, but even so it is scarily useful. I frequently have it give me examples of how to use different APIs, and more often than not it is correct and more concise than anything I find with traditional searching.

Not the same as full GPT-4, and it doesn't mitigate the concern about gating access to such revolutionary technology, but at least it is something for anyone out there who may be feeling left behind.

2

alecs_stan t1_je7lvk0 wrote

Just pay for ChatGPT Plus bruh.

1

HongoMushroomMan t1_jeatar0 wrote

Is that GPT-4?

1

alecs_stan t1_jeaujih wrote

Yeah, you can try it if you have a paid account.

1

HongoMushroomMan t1_jefmmoo wrote

It's still not the un-neutered version. Microsoft had GPT-4 tell them how to hack into a network, with specifics, and it answered. So there's a lot of fun stuff you can't ask it.

1

PointyBagels t1_je1b1y9 wrote

ChatGPT is not particularly good at coming up with new jokes, because often the process of creating a joke is non-linear. You usually think of the punch line first, then come up with the setup afterwards.

ChatGPT responses are constructed in a linear fashion, and it is not capable of this type of non-linear thinking. Considering this, I'd say that ChatGPT does not have a sense of humor. It can explain jokes right now, but is not good at coming up with new ones.

That said, I'm sure this problem is being worked on and there's a good chance in a few years that this will be possible. This line of research, I'm sure, will help with far more than just humor.

1

Ka_Trewq t1_je1ko56 wrote

Either I'm too tired, or the AI is better than me at getting the punchline, but I have to admit that I needed the explanation to understand it.

Man, there are some cool ST TNG episodes that are quite relevant in this fast changing AI landscape.

1

Tripwir62 OP t1_je2b849 wrote

This joke kills when you tell it live. I've never seen it fail. The key is the intonation on the last two lines: "they gave you a chihuahua?" and "they gave me a chihuahua??", each in the same interrogative tone.

1

Gunnar2024 t1_je521hz wrote

I tried something similar but the answers weren't that good.

1

megaed99 t1_je5nvno wrote

I had a similar experience with ChatGPT, where I asked it to look at a selection of my poems and tell me what it thought each one was about. It did a remarkably good job of it.

1

InflationCold3591 t1_je1pxvw wrote

No. Because it is not meaningfully “intelligent”. Stop it.

−1