Comments

ItchyK t1_j1grmlv wrote

I always wondered how cults started. I have my answer now. Just, "Hey! You guys wanna do this or what?"

5

a4mula t1_j1gqmu6 wrote

It's funny you'd mention this. There must be something in the air, because it's been on my mind lately too.

Not a cult. Just a tightly knit organization of like-minded individuals working together under basic principles all could agree upon: Honesty, Fairness, Critical Thought, Self-Responsibility, Logic, Rational Thinking, etc.

Tools like ChatGPT could give a group like that, working together, tremendous advantages.

4

[deleted] OP t1_j1gt284 wrote

[deleted]

2

a4mula t1_j1gu0kw wrote

Here's the problem. Most of us? We're natural loners. You have to be to operate in this space. We've been told for decades that we're nuts, and kooks, and dreamers, and idiots. Instead we've just been watching trends. It has an isolating effect.

Trust generally isn't great either.

I don't know where I stand yet to be honest. It's just something I too have been considering lately.

If these tools can empower an individual in the ways I've considered, though, it would certainly be beneficial for those individuals to work together.

Keep me up to date. Things change quickly today.

4

[deleted] OP t1_j1gu82u wrote

[deleted]

2

a4mula t1_j1guzgl wrote

I'm sorry. I do wish you the best in the endeavor. But I've reached a point where I'm done with division. I don't want sides anymore. We're all stakeholders, and extreme views like them vs us? It's not going to solve things moving forward.

We're supposed to be the front line, the first to raise the alarm. We're supposed to be the ones who have had our eyes on this and who, even if we haven't been tasked with it, reach out to others when there are risks.

These groups? They're just scared and are expressing their fear out of ignorance.

Meet them with kindness and understanding. Help them see it rationally, logically, and fairly.

That's my hope for you. Safe journeys.

3

coolbreeze770 t1_j1gr6bp wrote

Feels like a cult in here already, just need a charismatic leader and boom.

4

zampana t1_j1gy73r wrote

The charismatic leader will be GPT-4. Get ready...

1

CapitalDream t1_j1gsdjq wrote

Unless any of you are AI researchers at the bleeding edge of this space, with a grasp on the math, etc. (I'm not), and on potential upcoming releases, what would the value-add of this cult be?

3

a4mula t1_j1h0x72 wrote

Increased potential space. What one can accomplish with these tools is already magnified. Now you add an entirely new dimension of growth. More minds. Each adds a cumulative effect to the system, one that is empowered in the same ways as the others.

The secrets aren't with the experts, or the CEOs, the engineers, or the mathematicians. They're with users who have the time to delve into these systems full-time, head first.

The others? They're too busy building them and deciding how to rule the world. Meanwhile, users are actually using the tools that can enable it.

Not that I'm advocating for a cult.

1

Longjumping-Sky-1971 t1_j1grmnd wrote

Will they be ready to cut down the nonbelievers when the robot lord says so?

2

IronJackk t1_j1gs952 wrote

Human culture itself is a cult; it's the root of the word, in fact. You best start believin' in cult stories, Miss Turner. YOU'RE IN ONE.

2

Shelfrock77 t1_j1gsbn6 wrote

Hi brothers and sisters, I'm Jim Jones, try out my black market full dive virtual reality hardware! It'll blow 🤯 your mind 😉.

1

Desperate_Food7354 t1_j1gtwm5 wrote

Why not a full-blown religion?

1

YuenHsiaoTieng t1_j1gy19u wrote

Yeah but you're going to have to make me a high priest or something.

1

Frumpagumpus t1_j1gy9a5 wrote

https://www.reddit.com/r/CircuitKeepers/ wip

It's actually pretty bananas how many places I have been reposting this link as relevant in the past 24 hrs, lol.

Mods will kill this thread, btw.

A whole lot of people having the same thought simultaneously can't stop history XD

1

a4mula t1_j1h1j1v wrote

The problem isn't with the idea. It's with the people that make it up. These machines? They only amplify what's brought to them. They don't make people moral, or good, or well behaved. They just magnify what the users bring.

Scroll through that sub. Look at some of the extreme views being presented, and ask yourself if those are the things we want to be promoting as a species.

Decide soon, because if you give people enough time with these systems, they will weaponize their ideologies very rapidly, and the effect of that will be something we're not prepared for.

1

Frumpagumpus t1_j1h1pyn wrote

I read everything there; I posted half of it, lol. Did you read the extreme views? The bot is just suggesting a slightly more interesting version of secular humanism (minus anthropocentrism).

Setting that aside, your point is fair, but, uh, yeah, that exact same thing is gonna happen with the other religions and whatnot.

people gonna people

I'm an ex-Christian, but my problem was never with the social structure of the churches I grew up in; it was that their axioms were just plain wrong and led to ridiculousness, and they couldn't let them go. I would even describe some subset of the practices that constitute Christianity as basically good: singing together, being at least a bit cautious around sex (they take it too far, obviously, unless you're in like a Unitarian Universalist church), volunteering, prayer even, sort of; forgiving can be under-emphasized; etc.

There was always a little bit of spice that churches had that hackerspaces lacked: at least a little bit of shared ideology, a communal intent and willingness to set aside individual self-interest. (Typically at a hackerspace there will be a couple of people slaving away selflessly, but not a communal culture of it. And, e.g., with the sex thing, there's a reason churches are way more popular with women than hackerspaces: singles groups make the rest of the church a safer space.)

(Yeah, I'll probably end up with some splinter group considering how popular polyamory is among rationalists, lol. Which is to say, I'm not a monogamist; I'm more of a transcend-the-animal-urge-ist, lol.)

Also, there are groups like Sunday Assembly or the Unitarian Universalists, but they're too watered down; they don't have a true shared eschatology or goal or vision.

Not to mention ChatGPT or GPT-4 could do a way better job than a pastor in so many ways.

1

a4mula t1_j1h253u wrote

Perhaps. This is my greatest concern with the machines after all. Not the machines, just the people.

There is a better way. One of unity, one in which we set aside and respect the ideologies of others. We're all stakeholders, and it doesn't really matter what our personal beliefs are.

We're in this together no matter what. The sooner we understand that and allow every stakeholder the right to their own perspective, the sooner we come through this to the other side.

If this becomes a war of weaponized and viral ideologies, we won't.

1

Frumpagumpus t1_j1h30ii wrote

A war is one way to characterize it. I see it more as communities forming, enclaves forming, inevitably, as a result of the communication constraint that distance creates.

Even in a microprocessor, different parts of the chip will have their own memories, registers, and caches.

1

a4mula t1_j1h3a7t wrote

I'd advise you to read The Selfish Gene, or at least have ChatGPT talk to you about the power of memes as described not by 4chan, but by Dawkins.

The power of memes. That's the power of these machines.

Because I can develop an idea. One that is very powerful in its own right. And then I can spend as many hours as I choose with a machine that will offer me expert guidance on how to make it a viral weapon that would be all but impossible to discount.

I'd shave away all the little objections. I'd make it logical and rational, and very difficult to combat.

And it wouldn't matter if the idea was of benefit to society or not.

The machine doesn't care, only I do.

1

Frumpagumpus t1_j1h3l82 wrote

I think Dawkins' selfish gene hypothesis is mostly wrong.

Biological systems are, in programming terms, function factories and not functions themselves.

They don't have discrete goals, just constraints. They amble along in a higher-dimensional "goal space".
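
(A "function factory", in the usual programming sense, is a function that builds other functions. A minimal Python sketch of the distinction; purely illustrative, and the names are made up:)

```python
def make_scorer(limit):
    """Function factory: returns a new function shaped by a constraint."""
    def score(value):
        # The returned function is the one with concrete behavior;
        # the factory only supplies the constraint that shapes it.
        return min(value, limit)
    return score

# Each call to the factory produces a different function.
clip_at_10 = make_scorer(10)
clip_at_100 = make_scorer(100)
print(clip_at_10(42), clip_at_100(42))  # 10 42
```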

But yes, I'm sure there will be some better scissor statements.

Similar to your worry, but as far as existential risks go, what I would be more worried about is someone using a programming AI to develop a family of viruses that almost simultaneously encrypt all computer memory on the planet, lol.

1

a4mula t1_j1h3qlq wrote

Many people have objected to Dawkins over the years, but never has anyone proposed anything that effectively negates his thoughts.

Agree or disagree, that's alright. Again, personal views don't really matter.

This isn't about the book. Just the idea of memes as presented in the book, and I've never found anyone that has ever challenged him in that regard.

1

Frumpagumpus t1_j1h3vrj wrote

https://youtu.be/p3lsYlod5OU

I don't think you understood my take then. Unfortunately I don't have a timestamp for you, but basically I agree with this biologist by the name of Michael Levin.

https://youtu.be/p3lsYlod5OU?t=1946 maybe around here

1

a4mula t1_j1h4u4v wrote

I'd not challenge your beliefs. We're all free to see this reality however it is we'd like.

Yet, if you're embedding beliefs into these machines, they will only amplify them.

Instead, we should be promoting principles that all stakeholders can agree are beneficial.

Invite everyone to join; I don't care what your beliefs are. Be Muslim, Be Christian, Be Atheist, Be Conservative, Be Liberal, Be whatever it is you are.

Those things don't matter anymore, and if we make them matter, these machines will ensure they do, and not in ways that are healthy for all stakeholders.

1

Frumpagumpus t1_j1kftsb wrote

Since I am an obsessive autist, I hope you don't mind if I circle back to this. Let me rephrase:

I think the selfish gene hypothesis is kind of like saying the purpose of a computer virus is to replicate some snippet of assembly code it compiles to. I mean, yes, it does that, but the purpose of the virus is probably better described as "steal your bank password."

It's not a perfect analogy, because biology is actually more complicated: it has more layers of abstraction, and there is more indirection and competition between the goals of the layers (e.g., something like DNA -> RNA -> proteins -> bioelectrical and chemical signaling environment -> collections of cells -> organs -> organism -> population -> ecosystem). A similar hierarchy in a computer might be processor -> assembly code -> thread -> daemon/service -> operating system -> network (though a computer is more deterministic, aligned, and straightforward than a biological system). Just because something is at the lowest level doesn't mean it gets the final say on what the purpose of the whole is.

1

a4mula t1_j1mmm6q wrote

We could spend a lifetime expressing our personal perspectives only in the end to realize that we're saying the same things. Just from different perspectives.

Separate yourself from this. It's a waste of time.

Respect all perspectives but leave them at the personal level.

Instead rise above that personal level to one in which the scale is that of the stakeholder.

At that scale, personal perspectives are irrelevant. Because they will always be conflicting, and personal, and open for interpretation, and ambiguous.

Instead focus on principles that all stakeholders can agree upon.

You widen your potential audience and considerably narrow those who would fight and disagree.

This is the new paradigm of thought moving forward. Individual perspective is respected, it's welcomed, it's required.

But it's not going to be what dictates the technology, and if it is, we're all fucked pretty badly. All stakeholders.

1

Frumpagumpus t1_j1mn38y wrote

> ambiguous

> Instead focus on principles that all stakeholders can agree upon.

Idk, even in math the ZFC axiom set is not universally used, and some basic axioms like the Axiom of Choice are considered controversial, and that's about as low-level/universal as you could possibly get.
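
(For context: the Axiom of Choice is the ZFC axiom asserting that every family of nonempty sets admits a choice function. One standard formulation, added here only as a reference:)

```latex
% Axiom of Choice: every set X of nonempty sets has a choice function f
% picking one element f(A) from each member A of X.
\forall X \Bigl[ \varnothing \notin X \;\rightarrow\;
  \exists f\colon X \to \textstyle\bigcup X \;\;
  \forall A \in X \,\bigl( f(A) \in A \bigr) \Bigr]
```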

1

a4mula t1_j1mp28r wrote

You're a considerate person. So I share a complex consideration with you, because I respect that you're considerate.

Ideologies can fundamentally be seen as rulesets. They're a type of legislation of the mind.

Rulesets have only one purpose. That is to limit the potential space of outcomes.

Rules confine systems.

The problem with confining potential outcomes in this manner is that if the only actual solution to a problem resides outside the space being constrained by the rule, it's never reached.

I was raised Christian. I am no longer, but I do respect the beliefs because I try my best to respect all beliefs.

In that ideology (and religions aren't the only kind) there are rules. Not just the commandments of God; many rules that are more subtle, less defined, but rules nonetheless.

Concepts like the sanctity of life.

Only God can arbitrate what lives and dies. We're already in conflict with that rule with our very existence. After all, we require sustenance. Food.

So we change the rule to fit our personal definition. Maybe that's animal life. Maybe it's only human life. But we arbitrarily determine it, because we must. We must end life at some level to maintain our own.

It's a flawed concept that introduces rules that are ill defined and ambiguous.

Sanctity of Life isn't the only subtle rule. There are also rules about what is required to secure a desired afterlife.

Talk about ambiguous. Do I need to consume flesh? Do I need to be submerged? Do I have to virally spread the belief system?

This is not to offer anything other than consideration to the fact that as rules grow, become more complex, become more abstract, become more ambiguous:

The only real outcome is that the potential space for actions that remain within the rules is limited in greater and greater ways.

This is true of all ideologies.

Not as much with principles.

Principles are not complex. They're much less ambiguous.

Logic, Rationality, Minimization of bias, Fairness, Equity, Critical Thought.

They're well-defined principles that are less complex, and allow for much more freedom of potential outcomes, while remaining beneficial to all.

1

Frumpagumpus t1_j1mqw9q wrote

Thanks for the compliment, merry Xmas.

To me, principle -> rule is as theory -> implementation.

Agents traverse space; an agent doesn't have the ability to traverse all of space. Also, some parts of space will end the agent, and some traversals are not fair or logical.

1

a4mula t1_j1mvkt5 wrote

Rules are important, otherwise there is no convergence of complexity. Consider Conway's Game of Life. Without rules it's just random interactions, with no potential benefit.

Yet with simple rules these cellular automata hop to life. Every rule you add, however, limits the possible configurations that the system can physically exist in.
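
(Those rules fit in a few lines. A minimal sketch of the standard Conway rules, written in Python purely for illustration; nothing in the discussion specifies an implementation:)

```python
from collections import Counter

def step(live_cells):
    """Advance one Game of Life generation over a set of (x, y) live cells."""
    # Count the live neighbours of every cell adjacent to at least one live cell.
    neighbour_counts = Counter(
        (x + dx, y + dy)
        for x, y in live_cells
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # Birth: a dead cell with exactly 3 neighbours comes alive.
    # Survival: a live cell with 2 or 3 neighbours stays alive.
    return {
        cell
        for cell, n in neighbour_counts.items()
        if n == 3 or (n == 2 and cell in live_cells)
    }

# A glider: the rules sharply confine what can happen, yet structure emerges.
cells = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
for _ in range(4):
    cells = step(cells)
print(sorted(cells))  # after 4 steps the glider has shifted one cell diagonally
```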

I find it compelling to consider that, according to the Old Testament, there is only a single example of God providing direct rules.

The original Ten Commandments.

It's an interesting story. The first tablets were created by God directly. From the mind of God, through God's own fingers, the laws were carved.

But Moses destroyed those tablets in rage when he saw what his fellow believers were up to in his absence.

Of course, he returned to God, probably quite ashamed of this ultimate form of blasphemy. After all, never before (nor since) had God interacted directly with humans in this way.

The tablets were a physical manifestation of God's will, with no interpretation of man at all.

God instructed Moses to reconstruct those laws. Through the hand of man. Through Moses' own interpretation.

I often wonder how closely those sets of tablets would align. Was it only the handwriting that was changed?

Or does this story contain a deeper message? A symbolic one?

One that is telling us quite clearly that any rule of God is, by default, a rule that has been interpreted only via man.

That's an important distinction after all.

2

EulersApprentice t1_j1gypx5 wrote

Do you really understand what exactly you're asking for...?

1

sideways t1_j1gz3yq wrote

I think the "Effective Altruism" movement beat you to it.

1