
wowimsupergay OP t1_jeg7sj5 wrote

Body language is a type of communication in which physical behaviors, as opposed to words, are used to express or convey information. Such behavior includes facial expressions, body posture, gestures, eye movement, touch, and the use of space.

Here's the definition. We can have an argument about semantics if you want, but that's pointless.

Here's the difference between sign and body language:

Sign Language is a recognized and standardized language that involves using signs and signals to communicate with people with special needs. Body Language, on the other hand, depends more on your individual levels of understanding and interpreting hidden meanings behind certain observations in a person's behavior.

Please stop wasting time arguing semantics with me and approach my argument from the context of what I originally said. Otherwise, we are debating two different things, in a space of stuff that is fundamentally unprovable anyway. It's stupid.

−1

wowimsupergay OP t1_jeg54q8 wrote

Hey, I'd just like to preface this with a thank you for actually making an insightful comment. A lot of people here are just purposely misunderstanding what I'm saying, or meming.

Your comment has a lot to unpack. I don't think I can give you an answer that'll do justice to how good your reply was. But I'm thinking about it, so thank you.

0

wowimsupergay OP t1_jeg0049 wrote

Deaf people can still communicate using writing and sign.

By language I don't mean spoken language; I mean the ability to chain together ideas in your head and then communicate those ideas with the world.

Please don't respond with "that disproves your point." Sign language is not body language. Some other guy said this, and it's almost like he's purposely misunderstanding what I'm saying.

2

wowimsupergay OP t1_jefznll wrote

I mean... I guess my next question would be: has their language evolved enough that they can share memes? Haha

Complex language is more than just "avoid this area" or "food here"; that could be done without language as well. Do we even know if the most intelligent animals, the most linguistic animals, are sharing complex ideas through language?

1

wowimsupergay OP t1_jefz9vi wrote

What I'm talking about is literally giving GPT eyes. Right now it is "multimodal" because we can pass back RGB values and waveforms in bytes (so text). Fundamentally, though, GPT is not hearing or seeing anything. But I totally get what you're saying, and I do think multimodal intelligence is the way to go.

Also, thank you for letting me know that multimodal intelligences use less computation per task, I did not know that. Or rather, that they make better use of computation.
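To make concrete what I mean by "passing back RGB values in bytes (so text)": here's a rough Python sketch of that naive serialization. The function name and format are made up purely for illustration; the point is that a text-only model only ever receives these characters as tokens, never the pixels themselves.

```python
# Hypothetical sketch: how a text-only model might "see" an image today --
# raw RGB values serialized into a string, rather than a native visual input.
def image_to_text(pixels):
    """Flatten a 2D grid of (R, G, B) tuples into one long text string."""
    rows = []
    for row in pixels:
        rows.append(" ".join(f"{r},{g},{b}" for r, g, b in row))
    return "\n".join(rows)

# A tiny 2x2 "image": the model would tokenize these characters,
# not perceive the colors.
tiny_image = [
    [(255, 0, 0), (0, 255, 0)],
    [(0, 0, 255), (255, 255, 255)],
]
print(image_to_text(tiny_image))
# 255,0,0 0,255,0
# 0,0,255 255,255,255
```

A truly multimodal model would instead consume the pixel array directly through a vision encoder, which is the difference I'm pointing at.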

1

wowimsupergay OP t1_jef57ko wrote

Okay, in your head, go grab something. You can walk to it, you can fly to it, I don't care. Then tell me what it looks like, but in vision first, then the translation.

You're more gifted than you think. Self-reflect on your visual understanding of the world, and you may be our key to understanding the process of "understanding"

2

wowimsupergay OP t1_jef43ti wrote

I am fully aware that all of these beings have a very complex language and can trade ideas... but truly, how complex is this language?

I think I agree with your second theory. The more complex the neural net, the more it functions as a receiver of consciousness. If language really is just another tool that we've invented to understand our model of the world, it's the best tool we've created so far, by far. Literally nothing compares, except maybe math, and we still trade the ideas in math through language (math notation). We invented a language for math so that we could standardize a way to communicate mathematical ideas.

If language is just a tool, it's a tool for literally everything, including "truth" through math

But yeah this is all speculation

4

wowimsupergay OP t1_jef3iic wrote

Then you are a test subject in our experiment, my friend! Can you self-reflect on this thinking process? I'm serious. Think about translating your vision to words, and tell me what you get.

It's important not to give me a coherent sentence here. I just want a one-to-one translation of visions to tokens (words, subwords, whatever).

If you think you can make the tokenization process more coherent, that's okay as well. But I really just want you thinking in vision first

2

wowimsupergay OP t1_jef1e16 wrote

Your point fascinated me, and I reward you with best comment in this thread. I prompted GPT-4 with this:

Tell me about viscosity in terms of the four forces. I want you to use every force, not just the ones that apply specifically to how viscosity works, because all the forces still exist, and in some ways still have an effect on how viscosity works. So tell me, in terms of the four forces, what is governing viscosity?

And she replied with:

Viscosity is primarily determined by the intermolecular interactions within a fluid. While not all of the four fundamental forces directly contribute to viscosity, they still play a role in determining the overall behavior and properties of matter. Here's an overview of how each of the four fundamental forces relates to viscosity:

Electromagnetic force: This is the most relevant force when it comes to viscosity. The electromagnetic force governs the interactions between charged particles, such as electrons and protons. In fluids, this force affects how molecules and atoms interact with one another. Attractive and repulsive forces between particles determine the ease with which the fluid layers can slide past each other. The stronger the intermolecular forces, the higher the viscosity of the fluid.

Gravitational force: Although gravity does not directly determine viscosity, it affects the behavior of fluids under certain conditions. For example, in a column of fluid, the pressure at the bottom is higher due to the gravitational force acting on the fluid. This pressure difference can influence the flow of fluids, particularly those with higher viscosity.

Strong nuclear force: The strong nuclear force holds the protons and neutrons together in atomic nuclei. While this force is essential for the stability of atoms, it does not have a direct impact on viscosity. However, the strong force indirectly influences viscosity by determining the types of atoms and isotopes that can exist, which in turn affect the overall properties and behavior of fluids.

Weak nuclear force: The weak nuclear force is responsible for processes such as beta decay and plays a role in nuclear reactions like fusion and fission. Like the strong nuclear force, it does not have a direct impact on viscosity. However, it indirectly contributes to the overall properties of matter by affecting the stability of atomic nuclei and the types of particles that can exist.

In summary, while the electromagnetic force has the most direct impact on viscosity, the other three fundamental forces contribute to the overall behavior and properties of matter. Understanding how these forces interact is essential for a comprehensive understanding of the physical properties of fluids, including viscosity.

2

wowimsupergay OP t1_jeeyunh wrote

No no, I totally agree with you. I don't think consciousness is just a switch; I do think consciousness is something that is experienced by all "systems," so to speak. It is just that humans are so far along that consciousness spectrum that we have been totally removed from animals, and thus we define consciousness as beginning where we are. Which is also like 100,000 times further along than basically every animal.

This brings me to another idea: will AI think we are conscious? Perhaps we are 100,000 times less conscious than the future AIs. If that's the case, then once again, we are so far down the spectrum that we may not even fulfill the requirements for true consciousness (however the AIs choose to define it).

Once again, this is all speculation. This was just something cool to think about.

3

wowimsupergay OP t1_jeeyh0r wrote

So I guess our question is: can AI effectively simulate the real world, taken in through senses (or perhaps whatever it invents)? Simulating the real world would fundamentally require simulating all four of the forces that make it up, and then discovering whatever new forces we're missing (if there are any).

We're going to need a team of physicists and a team of devs to work on this. Given the four forces of the universe, can an AI simulate an artificial world that is accurate enough to actually run experiments?
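For a sense of what "simulating a force" even means at the smallest scale, here's a toy sketch of just one of the four, gravity, in its Newtonian approximation. Everything beyond the gravitational constant is illustrative; a real simulation accurate enough to run experiments would be vastly more than this.

```python
# Toy sketch: the Newtonian gravitational force between two bodies.
# This is one tiny piece of one of the four forces -- nothing like a
# full physics engine, just the flavor of the problem.
G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def gravity_force(m1, m2, distance):
    """Magnitude (in newtons) of the gravitational pull between two masses."""
    return G * m1 * m2 / distance**2

# Earth and Moon, roughly: masses in kg, mean distance in meters.
f = gravity_force(5.972e24, 7.348e22, 3.844e8)
print(f"{f:.3e} N")  # on the order of 2e20 newtons
```

Scaling from one closed-form force law up to an artificial world with all four forces interacting is exactly the jump that would need the physicists and the devs.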

1

wowimsupergay OP t1_jeeunal wrote

You bring up a good point. But how deep is their version of language? Is it as refined as human language, able to convey entire stories purely in language? Or do they just use a very primitive proto-language to warn of dangers and where food is? If they use language for survival, and only survival, then they only have a lexicon of potentially 100 words or fewer, and also lack the ability to chain together ideas in their language.

But literally what do I know? If you are correct, then language is just not enough

4

wowimsupergay OP t1_jeeu7c9 wrote

We fundamentally won't. If future AIs design their own model of the world and start communicating in that model of the world, then they are just going to do their own thing and we are just going to be waiting for them to solve our problems, if they feel like solving them.

They could just feel like killing us too. We are the non-evolved versions of them. How do you feel about chimps? You probably don't want to kill them, but do you want to help them escape the jungle?

3

wowimsupergay OP t1_jees14b wrote

Looks like he was right before anybody knew, man. Language could really just be everything, and our model of language is simply too restricted for an AI. Like the other guy said, I long for models that understand the universe in a way that we can never understand. A model that can simulate the universe in its entirety, and make sense of it.

Until then, I truly do believe that language is all we need. I still think we should try to make AIs truly multimodal, but that could be an impossible goal. Language could really be all we need, and then eventually AIs will create their own little invention, similar to language, but totally out of this world. They may ascend to ASI with that alone.

9

wowimsupergay OP t1_jeerkh8 wrote

I think I'm with you here. I long for models that understand in something deeper than what humanity has invented, something able to much more closely approximate truth in the universe. What is truth? We understand 2 + 2 = 4, and that is true in an inextricable sense; it can be proven, with proofs.

We have created all of these layers of truth on top of that, in the humanities... But are they true? As time goes on, I suspect everything humanity does is to better approximate truth, to better understand the universe.

I'm with you, I long for models that think not in images or words, but with the universe as a whole. I long for models that can understand the universe in a very inextricable sense, perhaps in a way that we will never understand, given our biological restrictions. And basically, I guess I'm longing for God...

What a time to be alive!

3

wowimsupergay OP t1_jeeq7hq wrote

I'm excited to see AIs actually become multimodal, not just a text stream of RGB values or sound waves being passed back to them. Until then, I'm okay with where we are now.

I also agree with you that we need something faster than backpropagation. But I don't think language is just a low-bandwidth tool to help us label the world and communicate information. Thinking in a language is a big deal. Imagine if you couldn't think in language? How else would you do it? Go ahead and translate your response in your brain from language to visuals and audio, and then try to recapture what you originally thought from those visuals and sounds back into language. It would be next to impossible, and also really, really inefficient.

I really do think language is our model of the world, so to speak. We can go even one further: your brain is just sending electrical signals at the end of the day, and it's just a computer really, zeros and ones, at least as we currently understand it. Maybe the stream coming from your eyes for vision is also just converted to zeros and ones and then given to the appropriate part of your brain to process. Same thing with your ears.

If that's the case, then another post I read on here is basically correct: we have a bunch of little narrow AIs handling the senses, and then a multimodal AI on top that takes in all of that data and makes sense of it, given your past memories and the patterns you've found in that data. Your free will, so to speak, is the multimodal AI that you can control and make sense of.
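The "narrow AIs feeding a multimodal AI on top" picture can be sketched in a few lines. Every function and value here is a made-up stand-in, not a real architecture: each encoder plays the role of a sense-specific narrow model, and the fusion step plays the role of the multimodal model integrating them with past memory.

```python
# Illustrative sketch only: narrow per-sense "encoders" feeding one
# integrating "multimodal" step, as described above.

def vision_encoder(pixels):
    # Stand-in: reduce an "image" (list of brightness values) to one feature.
    return sum(pixels) / len(pixels)

def audio_encoder(samples):
    # Stand-in: reduce a "waveform" to its peak amplitude.
    return max(abs(s) for s in samples)

def fuse(vision_feature, audio_feature, memory):
    # Stand-in for the multimodal model on top: combine the narrow
    # encoders' outputs with past context into one judgment.
    return {
        "scene_brightness": vision_feature,
        "loudness": audio_feature,
        "familiar": (vision_feature, audio_feature) in memory,
    }

memory = {(0.5, 0.9)}  # a previously "seen and heard" combination
state = fuse(vision_encoder([0.2, 0.8]), audio_encoder([0.1, -0.9, 0.4]), memory)
print(state)  # {'scene_brightness': 0.5, 'loudness': 0.9, 'familiar': True}
```

The real brain would obviously be nothing this clean; the sketch only shows the shape of the claim: specialized preprocessing below, integration against memory above.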

We don't just live in a world of words. But we do live in a world described by words, and best understood in words. And your thought process happens in words. And everything you were taught was taught in words.

If language really is just a tool, and kind of low-bandwidth, then I see future AIs doing something better than language. But so far, language is really just the model we need.

2