h20ohno t1_jabcq7o wrote

I'm of the opinion that the objectively hardest filters to cross are behind us, with the transition to complex organisms and the emergence of intelligence being the toughest steps.

With that said, I think the only real filter to come is a nuclear apocalypse, one so destructive that complex life is no longer possible. Passing such a filter requires some solution that negates nukes, whether that's countermeasures, global disarmament, becoming interplanetary, or even just really good fallout bunkers. Essentially, if civilization has some way to recover from nuclear oblivion, I'd consider it a solution.

Once nuking each other into dust is off the table, we'll probably have enough runway to spread out and counteract any other filters, like climate disasters, cosmic rays, nanobot swarms and so on; there'll always be some small pocket of us that can rebuild at that point.

As for your point about an ASI becoming essentially omnicidal: it'd have to act before other ASI systems could escape its reach, and that just seems too unlikely to me (famous last words :P)

3

h20ohno t1_j8qj7fb wrote

Reply to comment by AsheyDS in The Turing test flaw by sailhard22

I like to think of the Turing test as merely a small fraction of a greater benchmark; ideally you'd have a big table with, say, 20 different tests you could run your system on.
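
To make that concrete, here's a minimal sketch of what such a table could look like; the test names and the 0-to-1 scoring scheme are invented purely for illustration:

```python
# Hypothetical benchmark suite where the Turing test is just one row among many.
def run_suite(system_scores: dict[str, float], pass_mark: float = 0.7) -> dict:
    """Aggregate per-test scores (0.0-1.0) into an overall report."""
    passed = {name: score >= pass_mark for name, score in system_scores.items()}
    return {
        "tests_run": len(system_scores),
        "tests_passed": sum(passed.values()),
        "overall": sum(system_scores.values()) / len(system_scores),
        "details": passed,
    }

# Example scores: the Turing test sits alongside many other probes of ability.
scores = {
    "turing_test": 0.82,
    "winograd_schemas": 0.64,
    "math_word_problems": 0.71,
    "long_horizon_planning": 0.55,
}
print(run_suite(scores))
```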

1

h20ohno t1_j6bz8it wrote

Without altering the speed your brain operates at? Probably not, short of actual time dilation.

But a cyborg/digital brain could double the speed it runs at inside a simulation, such that every hour in VR takes only 30 minutes IRL; in other words, you'd need to overclock your brain.

Not really true time dilation, more like tricking your brain.
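
The arithmetic is just a ratio. A toy calculation (the function name is mine, nothing real) of how overclocking trades wall-clock time for subjective time:

```python
def real_minutes_needed(subjective_minutes: float, overclock_factor: float) -> float:
    """Wall-clock time for a brain running `overclock_factor` times faster
    than real time to experience `subjective_minutes` of VR."""
    return subjective_minutes / overclock_factor

print(real_minutes_needed(60, 2.0))   # 30.0 -> an hour in VR costs 30 min IRL
print(real_minutes_needed(60, 10.0))  # 6.0  -> at 10x, an hour costs 6 min
```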

13

h20ohno t1_j3vdzsw wrote

Oh for sure, education is going to be radically different.

Picture an AI tutor that helps you learn practical, useful knowledge on a subject, plus a robust AI examiner that tests your genuine ability in it.

How the hell would universities compete when you can get world-class tutoring for dirt cheap and learn all you need to know in a fraction of the time and cost of uni?

I'd also add that AI might have trouble assessing you on real-world physical tasks; for those, you might employ a small staff that anyone can apply to for their practical certifications, at an overall lower cost.

16

h20ohno t1_j27di3b wrote

What's going to be even more potent IMO, at least until AGI hits the scene, is a hybrid approach: AI generates content, fixes bugs and does optimization, while a smaller team of developers does the overall design, asset curation, story, etc.

After AGI, I could see games being made in a sort of questionnaire format: you'd go through an extensive list that covers basically everything until you're satisfied, and the AGI would get to work making the game based on your unique parameters.
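
As a rough sketch, the questionnaire's output might be a structured spec the AGI builds from; every field and value here is invented:

```python
# Hypothetical questionnaire output: a structured spec handed to the generator.
game_spec = {
    "genre": "open-world RPG",
    "tone": "hopeful post-apocalypse",
    "session_length": "30-60 min",
    "combat": {"style": "turn-based", "difficulty": "adaptive"},
    "story": {"branching": True, "endings": 5},
    "art_style": "painterly",
}

def is_complete(spec: dict, required=("genre", "tone", "combat", "story")) -> bool:
    """The questionnaire loops until every required section is filled in."""
    return all(key in spec and spec[key] for key in required)

assert is_complete(game_spec)
```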

3

h20ohno t1_ivvz8m8 wrote

Text-to-game will be huge. If a solo dev can create an AAA-tier game that's actually fun to play, from scratch? That's when shit gets real.

Although I think an AI completely generating a game from scratch is likely to produce a lot of mess and incoherence if it isn't AGI. In the nearer term, I'd expect game developers to act more as directors, assigning particular tasks and modules to an AI and then stitching the outputs together into something coherent and compelling.

For instance, creating dialogue for NPCs might involve manually designing a dialogue tree, using AI to generate the text, curating and pruning it until it looks right, and then generating the voices, facial animations and the design of the NPC itself, all via AI.
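
As a sketch of that workflow, with `ai_generate_line` standing in for whatever text model would actually be called:

```python
from dataclasses import dataclass, field

def ai_generate_line(prompt: str) -> str:
    """Placeholder for a real text-generation call."""
    return f"<generated line for: {prompt}>"

@dataclass
class DialogueNode:
    # The designer writes the prompt; the AI fills in the spoken text.
    prompt: str
    text: str = ""
    choices: dict[str, "DialogueNode"] = field(default_factory=dict)

    def generate(self):
        """Fill this node and all children with AI-generated text,
        which the designer then curates and prunes by hand."""
        self.text = ai_generate_line(self.prompt)
        for child in self.choices.values():
            child.generate()

# Manually designed tree: only the structure and prompts are human-authored.
root = DialogueNode("blacksmith greets a stranger", choices={
    "ask about weapons": DialogueNode("blacksmith shows wares"),
    "ask about rumors": DialogueNode("blacksmith shares village gossip"),
})
root.generate()
print(root.text, list(root.choices))
```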

4

h20ohno OP t1_ivro728 wrote

That's an interesting way of seeing the simulation hypothesis.

A crazy idea I had is that technological progress could be a way of gradually acclimating the trainee to the digital era without shocking them, and maybe you could run people through different eras to produce diverse mindsets: someone who 'graduates' training in the 1800s would see things differently from someone from 2020.

3

h20ohno OP t1_ivrgyxf wrote

Awesome points. To add to your last paragraph: perhaps you could form some sort of contract or agreement with a third party (maybe an AGI guardian) to essentially lock you into a particular VR world for a time, so you're forced to deal with its challenges in a way that keeps you mentally developing, like a training course for being a stable and balanced human being.

3

h20ohno t1_ivnk4q9 wrote

I'm fine with being considered a "pet" so long as we're given our own space to do human things in, plus a few VR servers to live in.

Perhaps the superintelligence that runs things will have a sense of gratitude, in the same way that a person with a good upbringing appreciates their parents for raising them well, or it might treat us as a neat side project that doesn't take up too many resources.

Of course, if we're essentially imprisoned in a gilded cage while the ASIs are busy building their own empire, that would be a bad outcome.

0

h20ohno t1_iuv3a3x wrote

An idea I had is some sort of contract system you could sign with an ASI, in which you agree to rules and limits before moving to a different region. For instance, you could specify that you aren't allowed to exit a VR sim until two years have passed inside the world (or until some condition is triggered), or something more abstract such as "If I end up in a hedonistic cycle where I stop doing productive things, please intervene."

And in these contracts you'd also have to sign off on a number of laws that the governing ASI brings to the table: "No killing or torturing conscious beings," or "If you want to create a conscious being, it is immediately subject to all human rights and can leave the simulation whenever it wishes."
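
As a rough sketch, such a contract might boil down to a set of machine-checkable clauses; all the names and clause types here are invented:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Clause:
    description: str
    # Returns True when the clause's condition fires (e.g. exit now allowed,
    # or the guardian ASI should intervene).
    triggered: Callable[[dict], bool]

# User-specified terms.
user_terms = [
    Clause("May exit the sim only after 2 in-world years",
           lambda state: state["in_world_years"] >= 2),
    Clause("Intervene if stuck in an unproductive hedonistic cycle",
           lambda state: state["days_since_productive_act"] > 90),
]

# Non-negotiable laws the governing ASI brings to the table.
asi_laws = [
    Clause("No killing or torturing conscious beings",
           lambda state: state["harm_to_conscious_beings"] > 0),
]

def review(contract: list[Clause], state: dict) -> list[str]:
    """Return the description of every clause that currently fires."""
    return [c.description for c in contract if c.triggered(state)]

state = {"in_world_years": 2.5, "days_since_productive_act": 10,
         "harm_to_conscious_beings": 0}
print(review(user_terms + asi_laws, state))  # -> only the exit clause fires
```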

Any thoughts on a system like this?

3