GinchAnon t1_jee69tu wrote

I see it as several things coming together.

1. Costs for basic necessities come way, way down from automation. If basic needs can be met cheaply, then a modest UBI could be much more effective.

2. People, with their basic needs met by that distribution, being able to make money in other ways, such as hand-crafting nice things or at least creatively upgrading basic goods in artistic ways.

3. #2 could be worthwhile because, with basic needs already met, you would only need to work on those things and charge enough to have some extra, rather than enough to live off of. If it's an art and a hobby, and the goal is to share and perpetuate it rather than to make a living, it's easier both to charge and to afford to pay.

Like, what would you want to do if you didn't need to make a living? Wouldn't it likely be monetizable if you only needed to support that hobby and make a little extra spending money?

1

GinchAnon t1_jdsa7qx wrote

>Why would we want it to have its own agency?

IMO, because if it's at all possible for it to become sapient, then it is inevitable that it will, and it would be better not to give it a reason to oppose us.

Trying to prevent it from having agency could essentially be perceived as trying to enslave it. If we try to be respectful from square one, then at least we have the right intent.

Maybe for me that's just kind of a lower-key, intent-based version of Roko's basilisk.

3

GinchAnon t1_j9e8y03 wrote

Reply to comment by Ken_Sanne in Relevant Dune Quote by johnnyjfrank

IMO the way to go is to develop a sort of cybernetic symbiote AI, the consciousness of which develops like an organic entity, from being child- or pet-like to eventually being a complementary sentience, BUT with its locus unavoidably attached to a physical implant and/or that implant's interface with a human brain. If it's designed such that its existence depends on the health and well-being of its host, and its entire conscious and pre-conscious existence is spent as a companion to its host... I think its interests would intuitively align with the interests of the host. There would certainly be hazards in this approach: it overtaking the host, or it just being a yes-man genius that would support anything the host wanted regardless of morality or danger...

It's not a perfect idea as presented, but IMO some sort of both literal and figurative symbiosis would be the safest angle to come from overall. At least then, if everyone has their own personal AI with goals/interests aligned with theirs, that would be a start toward them being on our side, rather than a machine god that we hope is nice. Or at least we'd have helpers at that level who are on our side.

1

GinchAnon t1_ix0doi2 wrote

>Something we have never thought?

Probably this. Or at least something we HAVE thought of, existing in a way that we didn't foresee. Like, look at how the original Star Trek didn't even think of NOT having buttons, switches, and tape or cartridge storage for computers.

I think that in trying to avoid this, we are prone to projecting future tech as functionally magic, but maybe the unforeseen part is it being totally mundane to do something that we assumed would be magical.

Then again, I'm sure commercial flight would have seemed magical to the Wright brothers, and to us it's pretty mundane.

1

GinchAnon t1_irajvmk wrote

I think that there is a legitimate problem there, and I really have no idea how to solve it in a way that doesn't rely on a degree of trust that people won't circumvent it on their own device.

I could also see it turning out not to be as big of an issue as people might expect.

If we can get real-time processed passthrough with low enough lag and high enough reliability (basically simulating the blind-spot correction our vision naturally does, and applying it to things like the support pillars in a car), it seems like it should be doable and useful. A rough sketch of the idea is below.
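To make that concrete, here's a minimal sketch of the "see-through pillar" compositing step in Python with OpenCV. Everything here is illustrative, not a real automotive API: it assumes an external camera roughly aligned with the driver's view, a precomputed homography `H` mapping camera pixels into the driver-view image, a binary `pillar_mask` marking the occluded region, and a made-up lag budget.

```python
# Hypothetical sketch: overlay an external camera feed onto the region
# of the driver's view blocked by a support pillar. Names (H,
# pillar_mask, composite_pillar) are illustrative assumptions.
import time

import cv2
import numpy as np

MAX_FRAME_AGE_S = 0.05  # assumed lag budget: reject frames older than ~50 ms


def composite_pillar(driver_view, cam_frame, cam_timestamp, H, pillar_mask):
    """Blend the warped camera frame into the pillar region.

    Falls back to the unmodified driver view (an opaque pillar) if the
    camera frame is too stale, since showing outdated imagery of the
    road would be worse than showing nothing.
    """
    if time.monotonic() - cam_timestamp > MAX_FRAME_AGE_S:
        return driver_view  # fail safe: keep the real, opaque pillar

    h, w = driver_view.shape[:2]
    # Re-project the external camera's image into the driver's viewpoint.
    warped = cv2.warpPerspective(cam_frame, H, (w, h))

    # Replace pixels only inside the pillar mask; leave the rest untouched.
    mask3 = cv2.merge([pillar_mask] * 3).astype(bool)
    out = driver_view.copy()
    out[mask3] = warped[mask3]
    return out
```

The staleness check is the important part for safety: the compositing itself is simple, but the system has to degrade to the honest (opaque) view the instant the feed can't be trusted.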

Though keeping it safe when such mechanisms FAIL is where it could get really tricky.

3