h20ohno
h20ohno t1_jcsffq5 wrote
I like the arguments on other ASIs, aliens and simulation overseers.
In a way, it's a more sophisticated version of "Treat others how you want to be treated"
h20ohno t1_jcdn9xe wrote
Reply to comment by petermobeter in On the future growth and the Redditification of our subreddit. by Desi___Gigachad
Waifus for all!
h20ohno t1_jabcq7o wrote
I'm of the opinion that the hardest filters to cross are already behind us, with the transition to complex organisms and the emergence of intelligence being the toughest.
With that said, I think the only real filter to come is a nuclear apocalypse, one so destructive that complex life is no longer possible. Passing such a filter requires some solution that negates nukes, whether it's countermeasures, global disarmament, becoming interplanetary, or even just really good fallout bunkers. Essentially, if civilization has some way to recover from nuclear oblivion, I'd count that as a solution.
Once nuking each other into dust is off the table, we'll probably have enough runway to spread out and counteract any other filters like climate disasters, cosmic rays, nanobot swarms and so on; at that point there'll always be some small pocket of us that can rebuild.
As for your point about an ASI becoming essentially omnicidal: it'd have to do so before other ASI systems could escape its reach, and that just seems too unlikely to me (famous last words :P)
h20ohno t1_j8qj7fb wrote
Reply to comment by AsheyDS in The Turing test flaw by sailhard22
I like to think of the Turing test as merely a small fraction of a greater benchmark; ideally you'd have a big table with, say, 20 different tests you could try your system on.
h20ohno t1_j6gof4i wrote
Reply to comment by throwawaydthrowawayd in My human irrationality is already taking over: as generative AI progresses, I've been growing ever more appreciative of human-made media by Yuli-Ban
Yeah, people do art livestreams and stuff already, so I could see a niche for that. In a sense you're creating entertainment (the art), entertaining live viewers, and educating them all at the same time.
h20ohno t1_j6gn9gi wrote
Reply to comment by Sandbar101 in Acceleration is the only way by practical_ussy
Nah, we gotta go for those black hole engines, that's the good stuff
h20ohno t1_j6bz8it wrote
Reply to Is it possible to simulate time dilation in a full immersion virtual reality environment? by MascotBro
Without altering the speed your brain operates at? Probably not, at least not with actual time dilation.
But a cyborg/digital brain could double the speed it runs at inside a simulation, such that every hour in VR takes only 30 minutes IRL; in other words, you'd need to overclock your brain.
Not really true time dilation, more like tricking your brain.
h20ohno t1_j6avr6f wrote
Reply to comment by Iffykindofguy in I don't see why AGI would help us by TheOGCrackSniffer
Sure, I'm more trying to get at how people often turn to movies like The Terminator, The Matrix, 2001, etc. and base their predictions somewhat on those.
h20ohno t1_j68c01c wrote
Reply to comment by Rogue_Moon_Boy in I don't see why AGI would help us by TheOGCrackSniffer
Yup, artists and writers are inherently biased to create melodrama rather than realistic depictions of the future. It sells better, but people get unrealistic notions from it.
h20ohno t1_j3vdzsw wrote
Reply to comment by LoquaciousAntipodean in Australian universities to return to ‘pen and paper’ exams after students caught using AI to write essays | Australian universities by geeceeza
Oh for sure education is going to be radically different.
An AI tutor that helps you learn practical, useful knowledge in a subject, plus a robust AI examiner that tests your genuine ability in that subject.
How the hell would universities compete when you're getting world-class tutoring for dirt cheap and can learn all you need to know in a fraction of the time, at a fraction of the cost, of uni?
I'd also add that AI might have trouble assessing real-world physical tasks; for those you might employ a small staff that anyone can apply to for their practical certifications, at an overall lower cost.
h20ohno t1_j27di3b wrote
Reply to comment by MattDaMannnn in How long until AI will be able to create video games? by Joeskithejoe
What's going to be even more potent IMO, at least until AGI hits the scene, is a hybrid approach: AI generates content, fixes bugs and does optimization, while a smaller team of developers does the overall design, asset curation, story, etc.
After AGI, I could see games being made in a sort of questionnaire format: you'd go through an extensive list that covers basically everything until you're satisfied, and then the AGI gets to work making the game based on your unique parameters.
h20ohno t1_iy7lba3 wrote
Reply to comment by r0cket-b0i in AI invents millions of materials that don’t yet exist. "Transformative tool" is already being used in the hunt for more energy-dense electrodes for lithium-ion batteries. by SoulGuardian55
Yep, if the AGI can fabricate advanced superconducting servers with 3D printers wherever it's located, it could grow lightning fast; with 3D-printed fusion reactors, even faster.
h20ohno t1_ixbd4st wrote
Reply to comment by Shelfrock77 in Is the Singularity a black swan event? by TheHamsterSandwich
The "Resource Acquisition" era, AIs just slurp up all the materials in the solar system and turn it all into computing and stuff.
h20ohno t1_ivvz8m8 wrote
Reply to Will Text to Game be possible? by Independent-Book4660
Text-to-Game will be huge. If a solo dev can create an AAA-tier game that is actually fun to play, from scratch? That's when shit gets real.
Although I think an AI completely generating a game from scratch is likely to produce a lot of mess and incoherence short of AGI. In the nearer term, I would expect game developers to act more as directors, in the sense that they assign particular tasks and modules to an AI and then stitch the results together into something coherent and compelling.
For instance, creating dialogue for NPCs might involve manually designing a dialogue tree, then using AI to generate the text, curating and pruning portions of it until it looks right, after which you'd generate the voices, facial animations, and design of the NPC itself all via AI.
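To make that workflow concrete, here's a minimal sketch (all names are hypothetical, and generate_line is just a stand-in for whatever text model you'd actually call): the developer hand-designs the tree structure, the AI drafts each line, and a human curates the output before the rest of the pipeline runs.

```python
from dataclasses import dataclass, field


@dataclass
class DialogueNode:
    speaker: str                    # who says the line ("npc" or "player")
    prompt: str                     # the writer's intent, fed to the AI
    text: str = ""                  # AI-drafted line, to be curated by a human
    choices: list["DialogueNode"] = field(default_factory=list)


def generate_line(prompt: str) -> str:
    """Hypothetical stand-in for a call to a text-generation model."""
    return f"[draft line for: {prompt}]"


def fill_tree(node: DialogueNode) -> None:
    """Walk the hand-designed tree and let the AI draft every line."""
    node.text = generate_line(node.prompt)
    for child in node.choices:
        fill_tree(child)


# The developer designs the branching by hand...
root = DialogueNode(
    speaker="npc",
    prompt="A gruff blacksmith greets a stranger entering the forge",
    choices=[
        DialogueNode("player", "Ask about buying a sword"),
        DialogueNode("player", "Ask for local rumours"),
    ],
)

# ...then the AI drafts the text, and a human prunes/edits it before
# generating voices, facial animation, and the NPC model itself.
fill_tree(root)
print(root.text)
```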
h20ohno OP t1_ivro728 wrote
Reply to comment by Redvolition in How might fully digital VR societies work? by h20ohno
That's an interesting way of seeing the simulation hypothesis.
A crazy idea I had: maybe technological progress is a way of gradually acclimating the trainee to the digital era without shocking them, and maybe you could run people through different eras to produce diverse mindsets; someone who 'graduates' training in the 1800s would see things differently to someone from 2020.
h20ohno OP t1_ivrgyxf wrote
Reply to comment by Redvolition in How might fully digital VR societies work? by h20ohno
Awesome points. To add to your last paragraph, perhaps you could form some sort of contract or agreement with a third party (maybe an AGI guardian) to essentially lock you into a particular VR world for a time, so you're forced to deal with its challenges in a way that keeps you developing mentally, like a training course for being a stable and balanced human being.
h20ohno OP t1_ivo3bko wrote
Reply to comment by OneRedditAccount2000 in How might fully digital VR societies work? by h20ohno
Great points, but wouldn't an artificial superintelligence be inherently sentient? And what if the ASI only diverted a small fraction of its resources to keeping humans in VR while it went about expanding its reach?
h20ohno OP t1_ivo0al6 wrote
Reply to comment by Deformero in How might fully digital VR societies work? by h20ohno
So would you prefer some sort of advanced VR glasses, or more of a holodeck-style VR room?
Submitted by h20ohno t3_yqc8kf in singularity
h20ohno t1_ivnk4q9 wrote
Reply to comment by Gold-and-Glory in The Collapse vs. the Conclusion: two scenarios for the 21st century by camdoodlebop
I'm fine with being considered a "pet" so long as we're given our own space to do human things in, plus a few VR servers to live in.
Perhaps the Superintelligence that runs things will have a sense of gratitude, in the same way that a person with a good upbringing would appreciate their parents for raising them well, or even treat us as a neat side project that doesn't take up too many resources.
Of course if we're essentially imprisoned in a gilded cage while the ASI's are busy building their own empire, that would be a bad outcome.
h20ohno t1_iuv3a3x wrote
Reply to comment by turnip_burrito in What will the creation of ASI lead to? by TheHamsterSandwich
An idea I had is a contract system you can sign with an ASI, in which you agree to some rules and limits before moving to a different region. For instance, you could specify that you aren't allowed to exit a VR sim until 2 years have passed inside the world (or until some condition is triggered), or maybe something more abstract such as "If I end up in a hedonistic cycle where I stop doing productive things, please intervene"
And in these contracts, you would have to sign off on a number of laws that the governing ASI also brings to the table: "No killing or torturing conscious beings" or "If you want to create a conscious being, they are immediately subject to all human rights and can leave the simulation whenever they wish"
Any thoughts on a system like this?
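Just to illustrate the shape of the idea, here's a rough sketch (purely hypothetical names and structure, not any real system): the contract as data, with opt-in clauses that each carry a trigger condition, plus the governing laws the ASI adds on its side.

```python
from dataclasses import dataclass, field
from typing import Callable


@dataclass
class Clause:
    description: str
    # Returns True once the clause's condition is met (e.g. two in-world
    # years elapsed, or a "hedonistic cycle" detector firing).
    is_triggered: Callable[[], bool]


@dataclass
class Contract:
    signer: str
    region: str
    user_clauses: list[Clause] = field(default_factory=list)
    governing_laws: list[str] = field(
        default_factory=lambda: [
            "No killing or torturing conscious beings",
            "Any conscious being you create gains full human rights "
            "and may leave the simulation at will",
        ]
    )

    def may_exit(self) -> bool:
        """The signer can leave only once every opt-in clause has triggered."""
        return all(clause.is_triggered() for clause in self.user_clauses)


# Example: locked into a VR world until two in-world years have passed.
elapsed_years = 0.0
contract = Contract(
    signer="h20ohno",
    region="fantasy sim",
    user_clauses=[Clause("Stay for 2 in-world years", lambda: elapsed_years >= 2)],
)
print(contract.may_exit())  # False until the condition is met
```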
h20ohno t1_it1shfg wrote
Reply to Just for fun: which fictional world would you spend most of your Full Dive VR time in? by exioce
Probably the Elder Scrolls or something else with magic; I'd just become a wandering wizard who learns magic on my travels
But I think people will be developing better FDVR worlds as it comes to fruition, so I'd probably play whatever's popular
h20ohno t1_jedvseb wrote
Reply to Do we even need AGI? by cloudrunner69
API - Artificial Poultry Intelligence