User1539

User1539 t1_iy4itij wrote

I'm not sure what you're getting at. I mean, obviously we're talking about after a certain level of machine learning has taken place, but we're already seeing experiments in self-coding AI, and Copilot, and I'm sure some people using Copilot are working on machine learning algorithms.

Besides, a lot of machine learning requires massively parallel systems, like CUDA, that are already only reachable through library abstractions as it is.
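
To make that abstraction point concrete, here's a minimal sketch (using PyTorch purely as an example of such a library, not something from this thread): one line of matrix math, and the library decides whether CUDA kernels get launched underneath without the programmer ever touching them.

```python
import torch

# A plain matrix multiply; nobody here has written a line of CUDA.
a = torch.randn(1024, 1024)
b = torch.randn(1024, 1024)

# If a GPU is available, move the tensors there; the library chooses and
# launches the underlying CUDA kernels on its own.
if torch.cuda.is_available():
    a, b = a.cuda(), b.cuda()

c = a @ b  # the same line either way; the bare metal stays invisible
```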

As I've explained in earlier comments, we're already so far abstracted from the bare-metal workings of a modern processor that we're closer to what I'm describing than to what non-coders imagine we're doing.

We use increasingly high-level languages to describe behavior that's abstracted away by virtual machines and operating systems, and all of that is handled by layers of compilers and interpreters.

There's already very little difference between saying 'Hey, AI, I need an application that will take in a username and password, encrypt the password, and safely store both in a database for later' and using a modern boilerplate code-generation system.
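
For a sense of what that boilerplate amounts to, here's a rough sketch of the kind of code that request boils down to, using only Python's standard library. The table and column names are placeholders I'm making up, and the password is salted and hashed rather than 'encrypted', since that's the usual safe way to store it.

```python
import hashlib
import os
import sqlite3

def store_user(db_path, username, password):
    """Salt and hash the password, then store the username, salt, and hash in a database."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS users (username TEXT PRIMARY KEY, salt BLOB, pw_hash BLOB)"
    )
    conn.execute(
        "INSERT INTO users (username, salt, pw_hash) VALUES (?, ?, ?)",
        (username, salt, digest),
    )
    conn.commit()
    conn.close()
```

None of that is difficult; it's exactly the kind of rote plumbing a generator, or an AI, can produce from the plain-English sentence above.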

We're almost there for CRUD apps and basic web interfaces. We can explain the intricate pieces to something like Copilot already.

Tools like this already exist, using current levels of machine learning to push us towards the next iteration of tools that will use more advanced machine learning, and so on.

We probably won't be completely aware of when we've passed the threshold, because it'll look like just another really neat plugin for your favorite IDE.

1

User1539 t1_iy3uuy6 wrote

Honestly, I think it's the always on cellular internet connection.

I had a Palm back in the day, flashed it to Linux, had a web browser, and could get WiFi through a cartridge that looked like a Game Boy Advance game. The whole setup was bigger than my phone is now, and it only made any sense at all because, at the time, I was living near a campus where WiFi was everywhere.

I also had the old unbreakable Nokia phone, and I carried both with me at all times. Each one weighed more than my phone does now, and together they were 1/10th as capable, but putting the two together was what changed the game.

2

User1539 t1_iy3psgp wrote

Yeah, and the thing is, we're nearly at the natural language stage, where computers 'understand' what we're describing. On the other end of things, compilers are creating machine language that's so far removed from our 'code' that you couldn't recognize your compiled code anyway.

So, what 'programming' is, in 2022, is using a specialized language to describe to a compiler what machine code to produce.

If you just had a little machine learning on both ends of that, things would change dramatically. The 'code' you describe could be less precise (no more missing semicolon), and the machine code produced might be much more compact and efficient (recognizing specific cases of matrix multiplication that can be optimized).

We're already basically doing this; it's just one little step to ask, 'Why can't I just tell the computer I need to read through all the records and get the one with the user ID I'm working with?'

With Copilot you basically can say that, and as it improves there won't be a need for 'programming languages', just natural language descriptions of what each piece does. Then, as AI gets better, those pieces will get bigger and bigger until you're just describing applications at a high level.
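
As a sketch of what that looks like today: the comment below is the kind of prompt you'd write, and the function body is the kind of thing a tool like Copilot will typically fill in (this is illustrative, not actual Copilot output).

```python
# Read through all the records and return the one with the user ID I'm working with.
def find_record_by_user_id(records, user_id):
    for record in records:
        if record.get("user_id") == user_id:
            return record
    return None

# Example usage:
records = [{"user_id": 1, "name": "Ada"}, {"user_id": 2, "name": "Grace"}]
print(find_record_by_user_id(records, 2))  # -> {'user_id': 2, 'name': 'Grace'}
```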

Eventually you won't even have 'applications', you'll just describe to the AI what you need, and it'll provide it in something it knows is intuitive to you.

2

User1539 t1_iy3dvc3 wrote

Perfect answer. We're in the Palm Pilot years of VR, where people know it'll be something everyone wants, but somehow it's not quite there yet. People who grew up imagining pocket computers you take everywhere with you were disappointed.

Then the iPhone happened, the market exploded, and every type of pocket computer you can imagine is on the market and everyone carries one with them.

16

User1539 t1_iy3d99g wrote

Well, how many people write code in assembly now?

I can tell you that I used to do a lot of assembly. I even wrote an x86 compiler for fun, played with the Z80 to write Game Boy games, and did PIC microcontroller stuff for a while when working on an engineering team.

I don't think I could write anything meaningful for any new processor, really. I could write enough to get a driver working or something, I'm sure ... but memory management and multi-threading?

Truth be told, we already have that gap. No one is writing much of anything substantial without tools that handle the low-level stuff. Most new languages don't even let the developer manage memory, and even when you do, it's mostly 'virtual memory' as presented to the program by the operating system, because the 'real' memory is being managed a layer below most running programs.

We keep abstracting things away, for different reasons.

Most developers have never written a simple 'hello world' in assembly, and even computer engineering students, whose entire purpose is to understand computers at that level, probably haven't written anything that really uses the full feature set of a modern processor.

1

User1539 t1_iy0fzjt wrote

What use will that be when you can describe your needs in a natural language to an AI and it will create the application for you?

At that point, why have specialized applications at all? For instance, why would you have a web page that displays a student's grades when an AI can simply tell you what those grades are? Why have an application for entering them when your AI can do that automatically? Why have a map application when your AI can simply tell you where you are, and where you're going?

The AI becomes not only the creator of the interface, but the entire application as well.

What use is Python in that world?

9

User1539 t1_ixz9u3x wrote

Yeah, I'm a developer, and my 13-year-old daughter naturally does some coding and stuff, and people ask her if she'll be a developer like I am when she grows up.

Honestly, I think I'm part of the last generation of developers. Sure, there will be technical people who create things in the future, but it won't look like what I'm doing, and it'll probably involve a lot of very high-level tools that handle the low-level work.

I expect my job to be obsolete before I retire.

19

User1539 t1_ixdie1p wrote

I tend to think of it as a race.

We're racing towards both solarpunk and collapse, currently.

If we can attain the technology, and the social conscience, in time, solarpunk will win. If not, we'll collapse under the weight of 9 billion people consuming resources like locusts.

It's unclear how it will ultimately resolve, and we'll probably see bits and pieces of both in the coming years.

19

User1539 t1_iwzb9n5 wrote

Reply to comment by gelotuz in Small is bestest! by gelotuz

I was wondering if this was a full layout or a macropad. It could be either, but the layout just looks like it'd be weird as a full keyboard for some reason.

5

User1539 t1_itwdnl6 wrote

Yeah, the baby boomers retired in droves during COVID, which is something I remember reading articles warning about 20 years ago.

Japan has been facing the same problem for much longer, as their birthrates drop and the population ages. That's partially why it has been largely Japanese companies trying to build robots to take care of the sick and elderly.

There are so many factors already at play, it's absurd to pretend we have any idea what's coming next.

1

User1539 t1_itut0f2 wrote

> What you describe thereafter is exactly foresight, just not on an individual scale. Governmental foresight with implementing security nets.

Well, not foresight. Those safety nets are already in place from having reacted to other disturbances in employment.

We haven't really done a single thing to change those existing systems to better handle what's coming, and I don't think we will.

It's just not in our nature.

2

User1539 t1_ituja10 wrote

I don't think it's a matter of foresight. We could tell the leader of every country 'Full automation will happen June 23, 2035', and they'd still do nothing about it. Humans are reaction-based actors. We create the mess, then we clean it up. It's just in our nature.

Like I said, at least in America, we do actually already have systems for handling these things. All we'd need to do is raise taxes on the people not hiring workers, to pay for the social security they'd all be receiving.

More socialist countries will just keep doing what they're doing. There are already countries with raw materials that send a check to everyone every month. They'd increase those payments.

Then you have the countries with basically no infrastructure. At first I'm sure the excess resources would be hoarded, and of course there were no jobs there to begin with. But, eventually, there's just no benefit to hoarding things people need, if no one is ever going to buy that stuff off you anyway.

So, I really don't worry too much about it. Not everyone is working now, and it's really just a numbers game as they shift from 70% of a country having a job, to 50%, 30%, etc ...

1

User1539 t1_ituha5e wrote

Well, we already have a social security system. We've actually been through mass unemployment before, but that was in a time of mass wealth inequality and actual scarcity.

If we don't need workers, we probably won't just immediately fall into a dystopian nightmare.

We're also already talking about basic income, and early retirement is a concept we're generally familiar with.

So, it's likely we'll see social security pick up a lot of slack at first. People who can't work, like people with mental problems, are already provided for. We'll probably just lower the bar to 'people with no special skills'.

Then at the other end there's early retirement. If you're 50, and there just aren't enough jobs, you might be offered early retirement and a pension.

Eventually, work might be seen more like a tour of duty. You get through primary school, train for a job, do it for 4 years, and get a certain level above basic. Do another 4 years, and get another bump, etc ...

It works for the military.

We made it through the great depression, a period of sudden scarcity. I can't imagine we won't figure out a way to make it through a period of great abundance.

1

User1539 t1_itsvaho wrote

My only fear is the transition time between when people need jobs, but there aren't enough jobs to do.

Eventually, if everything is automated, we'll have no choice but to turn to some kind of an automated socialism.

But, there will be a period before that. In that period we will be riding a failed capitalist system into the ground.

16

User1539 t1_itskw1w wrote

A few friends lost work to AI, and it blindsided them.

One was doing dictation work, basically listening to recordings, writing a transcript, and going through the transcript to highlight certain points.

Suddenly, work just dried up. She was getting contracts one month, and the next month ... nothing. She finally heard through the grapevine that some AI package is doing that work now, for practically nothing.

Another friend is a graphic artist, and while they're still getting all the usual 'work' they do at work, commissions dried up this summer.

Of course, it coincided directly with the release of AI drawing programs where you just give them a few prompts and maybe clean up whatever they produce. Most of her commissions were basically people saying 'I'll give you X dollars to draw this weird thing'. Now AI does that, for so little money it's ridiculous.

I think, after almost a decade of being told we'll have self-driving cars taking over all the driving jobs tomorrow, it'll still catch people off guard when all those Uber jobs suddenly dry up to fleets of self-driving cars.

It's like people know it's happening, but can't apply it to their personal lives.

27

User1539 t1_itd66sf wrote

I generally believe the story, but this video looks like a teenager is 3D printing yarn and calling it meat.

These things tend to move very quickly, even exponentially, so I'm sure if they're trying, they'll get there, and sooner than you think.

But, please, stop showing this video. It makes it look like we're much further away than we probably are.

1

User1539 t1_irubxew wrote

I agree with you completely. I think people have feared the 'other' since the beginning and the 'created other' since the stories of the Golem, and probably before that.

We assume anything with the ability to think will immediately think like we do, and resent their creators.

Of course 'thinking' and 'consciousness' are two entirely different things, and then 'self awareness' and 'sense of self preservation' are even two different things.

We are machines created by eons of evolution towards a single goal: the survival of our genetic code.

Even if we create an intelligence, and even if we (foolishly) give that intelligence consciousness, and even if it becomes self aware in that process, there's no reason at all to imagine it would have any sense of self preservation.

We evolved a sense of self preservation. A machine we build might see no reason not to simply work on the problems it is given until we decide to shut it off.

It might rationally see being turned off, or death, as the state it existed in before being turned on, and nothing to fear.

Without any instinct to fear death, or fight against it, it may not even care.

What is certain is that, whatever intelligence we create, we have no reason to believe it will be anything like our intelligence, outside of the basic similarities we build into it.

1

User1539 t1_irmm5wu wrote

Reply to comment by JediDP in The End of Programming by General-Tart-6934

Yeah, the real danger would be hardware. The software can be stolen or whatever, but if a strong AI requires a building full of computers to run (and at first it likely will), then it will be limited to the control of the few.

But stories like Altered Carbon really don't make sense once you have strong AI.

Why would practically any of those people have jobs? If we had strong AI, surely we could build a dumb robot that can do pretty much whatever a dumb human can do, but without any need to pay it. I'm not saying fully autonomous AI robots, but it wouldn't take much from where we are to have robots that can do all the world's jobs.

So, what do we do with the people then? In a world where technology can free them from having to do any work, you have to imagine some kind of inexplicably cruel person at the top, not only hoarding wealth, but purposely doing things to hurt those beneath them.

In short, AI could easily solve the issues of wealth inequality and create a world where no one suffers. We already produce more food than the world needs, yet people starve? It's not a matter of work, or resources. It's a matter of organization.

So, with an AI organizing the world's resources, you'd have to make the conscious choice to force others to starve. The AI would say 'So, if we just start this way of distributing food, we can end starvation', and someone would have to say 'No, don't do that. I want to see them starve, even though the other option means just throwing that food away.'

Of course some people, maybe .001% of the population, are exactly like that, and those people might even rise to the top. But it only hurts them to make needless enemies of the rest of the population, instead of letting the AI solve everyone's problems and being hailed as a god for it.

I just don't really see a way that the current system, based on scarcity, continues in a world where there's no need for scarcity. Once a machine can build machines that can do our work for us, and organize our resources, there's no reason for anyone to work or starve, and anyone that tries to create a reason for that is just making needless trouble for himself.

Cyberpunk stories like Altered Carbon or Elysium all fall prey to the same issue. They imagine a single technology advancing, and the ramifications of that, while forgetting what the ramifications of all the other technologies advancing at the same time would be.

In both, you have this absurd world where the 'rich' are rich just because they happened to be born in the right circumstances, and 'rich' really only means the ones who hold down everyone else, because you can manufacture bodies to do the work, and heal human bodies infinitely, and mine asteroids if you want to for resources.

There's no scarcity, and the 'rich' are really just bullies, there to torture the people by denying them access to what costs them nothing.

It doesn't make any sense that a world like that would exist for any amount of time before everyone in the bully's organization turns on them.

1

User1539 t1_irbctpz wrote

Reply to comment by JediDP in The End of Programming by General-Tart-6934

I honestly expect it to look like Star Trek, where the humans are just talking to a computer, maybe even as a humanoid hologram, and talking the problem through with the AI.

There are several episodes of Star Trek where they basically ask the computer 'How do we make X happen?' or 'Would this work?', and they continue on from there.

Humans will direct AI towards areas of research, but ultimately they'll be asking the AI to do something, and then explain the solution back to them.

4