
h20ohno t1_iuv3a3x wrote

An idea I had is a sort of contract system you can sign with an ASI, where you agree to rules and limits before moving to a different region. For instance, you could specify that you aren't allowed to exit a VR sim until two years have passed inside the world (or until some condition is triggered), or something more abstract such as "If I end up in a hedonistic cycle where I stop doing productive things, please intervene."

And in these contracts, you would have to sign off on a number of laws that the governing ASI also brings to the table: "No killing or torturing conscious beings," or "If you want to create a conscious being, they are immediately subject to all human rights and can leave the simulation whenever they wish."
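To make it concrete, here's a rough sketch of how one of these contracts might look as a data structure. This is purely illustrative; every name and field is made up, and a real system would obviously be far more involved:

```python
from dataclasses import dataclass, field
from typing import Callable

# Purely illustrative: all names here are invented for this sketch.
@dataclass
class ContractClause:
    description: str
    # Predicate over the sim state; True means the clause has triggered.
    condition: Callable[[dict], bool]

@dataclass
class SimContract:
    # Personal terms the person opts into before entering the region.
    exit_clauses: list[ContractClause] = field(default_factory=list)
    # Non-negotiable laws the governing ASI brings to the table.
    asi_laws: list[str] = field(default_factory=list)

    def exit_allowed(self, state: dict) -> bool:
        """Exit is permitted once any personal exit clause triggers."""
        return any(c.condition(state) for c in self.exit_clauses)

contract = SimContract(
    exit_clauses=[
        ContractClause("two in-sim years elapsed",
                       lambda s: s.get("sim_years", 0) >= 2),
        ContractClause("hedonistic-cycle intervention",
                       lambda s: s.get("productive_days_last_year", 0) == 0),
    ],
    asi_laws=[
        "No killing or torturing conscious beings",
        "Created conscious beings get full human rights and may leave at will",
    ],
)

print(contract.exit_allowed({"sim_years": 2.5}))  # True
```

The key point is just that your own exit conditions and the ASI's non-negotiable laws are separate parts of the same agreement.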

Any thoughts on a system like this?

3

turnip_burrito t1_iuv45tj wrote

I agree with this contract idea. It's a sensible way to protect yourself and others from your own actions.

If we ever reach a point where we know how to artificially create conscious beings, then we should (as you've pointed out) have a set of rules to prevent abuse. To add something new to the discussion: if too many conscious beings are allowed to exist at once, material or energy resources could run short, lowering the quality of life for you, others, and the new beings themselves. So creation will need to be regulated somehow.
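Just to illustrate the kind of gate I mean (all numbers and names invented for the example):

```python
# Hypothetical sketch of the kind of resource check a governing ASI
# might run before permitting a new conscious being to be created.

def creation_allowed(current_population: int,
                     per_being_energy: float,
                     energy_budget: float) -> bool:
    """Allow creation only if one more being still fits in the budget."""
    projected = (current_population + 1) * per_being_energy
    return projected <= energy_budget

print(creation_allowed(current_population=10_000,
                       per_being_energy=1.5,
                       energy_budget=20_000.0))  # True: 15001.5 <= 20000.0
```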

3