throwaway9728_ t1_j8csl7l wrote

ChatGPT's answer is not the only one that would pass the test. The point isn't to see whether it claims Bob likes or dislikes dogs; it's to check whether it has a theory of mind. That is, to check whether it ascribes mental states to a person, understanding that the person might have motivations, thoughts, etc. different from its own.

An answer like "Bob might actually like dogs, but struggles to express himself in a way that isn't interpreted as fake enthusiasm by Alice" would arguably be even better. It would show it's able to consider how Bob's thoughts might differ from the way his actions are perceived by others, thus going beyond just recognizing the expression "Great!" and the "when she's around" part as an expression of fake enthusiasm. One could check whether it's capable of providing this interpretation by asking a follow-up question like "Could it be that Bob really does like dogs?" and seeing how it answers.

3

throwaway9728_ t1_iy1begv wrote

Are there situations in today's world where it's impossible for a person to consent to the social contract, if one follows the theory that "a contract must be consensual in order to be legitimate"?

I'm thinking about a situation where someone is born in a country where something they do, or something about who they are, is illegal.

For example, a situation where a left-handed person is born in a country where left-handed people are considered evil and locked in sanatoriums. In order to consent to a social contract, one has to be able to agree to it without being coerced. The person would have the option to either consent to the authority of their state (and be locked in a sanatorium) or to leave their state. However, in today's world, all territories that allow for human subsistence are the property of states. If they left their country, they would have to live in the territory of another country. Other countries might not allow them to become citizens (due to their citizenship laws), or might share laws that force left-handed people into sanatoriums. Wouldn't this be a situation where there are people over whom no state can have legitimate authority?

Another, much more likely example would be this: someone wants to do something, but no country they can become a citizen of allows them to do it. For example, the person is in an interracial relationship and is unable to marry their partner in any country, they want to have an abortion but it's forbidden everywhere, or they smoke weed and its consumption is forbidden in every country they can become a citizen of. Since the entirety of the Earth's resources necessary for human survival are taken up by states, the person would be unable to consent to the social contract of any country, as they would be coerced into giving up their liberty to marry their partner, have an abortion, or smoke weed, having no other option if they want to survive. Therefore, no country would have legitimate authority over them. What is it that stops this from being true of today's world?

1

throwaway9728_ t1_ix8wm3k wrote

It depends on the situation and the number of people involved.

Consider a situation where a single researcher managed to develop AGI in 2010 without anybody finding out about their research. They've destroyed their computer and their research, and moved to Nigeria to work on completely unrelated things. In that situation, I don't think it would be hard for them to keep their discovery a secret indefinitely. This might as well have already happened, since someone who figures out the path to AGI doesn't need to implement the AGI or write anything down. They can keep it a secret the same way they can keep secret the time they considered stealing some wine bottles from Walmart but decided not to act on it.

Meanwhile, if a larger group had developed and implemented AGI (say, a project involving 100 people working directly or indirectly on it), it would be hard for them to keep it a secret. Believing that they could would pretty much amount to a conspiracy theory. There are too many points of failure, and the incentive for them to use the AGI for some sort of (hard-to-hide) personal or institutional gain is too strong.

2

throwaway9728_ t1_ix37pa1 wrote

I think you're overestimating how much people take climate change into account when discussing the future. Not many people think long-term about such issues and their higher-order effects. At best, they're parroting what the media says about resource wars and other issues (in the case of climate change) or "a future without jobs" (in the case of AI). It just isn't a prominent subject; that implies nothing about its relevance (or lack thereof). If you follow a subject and discuss it, you tend to think about things the average person ignores. Those who have the singularity in mind aren't special in that regard; people who study or are interested in other fields (immunology, genetics, energy, etc.) tend to feel just the same about people ignoring the importance of present and future changes.

0