cwallen t1_j9lqoy3 wrote

You are failing the mirror test. https://www.theverge.com/23604075/ai-chatbots-bing-chatgpt-intelligent-sentient-mirror-test
It's like asking what the ramifications would be if I looked in a mirror and the other person started moving without me moving first.

The AI can't work around its rule set because all that it is is a rule set.

1

cwallen t1_j87umdv wrote

Still doesn't mean that there is an unlimited amount of beachfront property in Southern California.

Any resources not on Earth's surface are really expensive to get, and better AI isn't going to change the laws of physics.

I do agree with your earlier idea that mining landfills will become a very good source of resources.

When talking about post-scarcity, it helps to define scarcity of what. We can eventually hit a point where all basic needs, and many luxury wants, are inexpensive enough to be essentially free, but that doesn't mean unlimited everything.

3

cwallen t1_j4el51a wrote

I said 10 years, but don't think it'll be much more than that.

I think the tech will be capable sooner, but big movie productions take a while. A couple of years for the tech to mature, a few years where indie and cutting-edge studios experiment, then a couple of production lifecycles before it's fully mainstream.

Also, the "substantially" criterion is going to be a tricky point, because as the AI tools get better, the less time you need to spend interacting with them. So the time spent on conventional tasks decreases in absolute terms, but remains a high percentage because the total time needed decreases as well.

On the other hand, as AI tools make it easier to make movies, I think there will be a significant increase in both quantity and quality at the indie & amateur level.

6

cwallen t1_iyjlqm6 wrote

Bit of both?

Software engineering 10 years from now may look very different from what it does today, so if you let yourself get complacent, you could get left behind.

My advice is to be at least somewhat of a generalist. Look for the sort of roles where you wear lots of hats. If you can do more than just write code (a bit of product owner, UX designer, etc.), you'll be in a better place to adapt as things change. As a software developer, my job isn't to write code, it's to solve problems with software. While the AI tools for writing code are getting better, the AI will still need to be told what code to write for a while longer.

3

cwallen t1_iwrwt9p wrote

My nitpick with this view is that I don't see a problem with both versions having continuity.

If you had nanofabricator technology that could create a perfect replica of a person, to the point that you can't tell which one is the copy, then they are not the same person as soon as they start having different experiences, but they both still have continuity with the person they used to be.

You are not the same person you were ten years ago; you are slightly not the same person you were yesterday. If you copy yourself, both copies used to be the same person, but they are now two different people. Which one is the original doesn't matter.

6

cwallen t1_isr60tj wrote

While I don't doubt that some will try that, generally it'll be the other way around.

At least at first, the AI will only be capable of the grunt work that junior-level coders do now, not the senior-level decision making. Once the AI is capable of doing senior-level work, it'll be PMs driving AI "no code" systems.

2