
User1539 t1_ixz9u3x wrote

Yeah, I'm a developer, and my 13-year-old daughter, naturally, does some coding and stuff, and people ask her if she'll be a developer like I am when she grows up.

Honestly, I think I'm the last generation of developer. Sure, there will be technical people who create things in the future, but it won't look like what I'm doing, and it'll probably involve a lot of very high-level tools that handle the low-level work.

I expect my job to be obsolete before I retire.

19

TinyBurbz t1_ixzzw4n wrote

>Honestly, I think I'm the last generation of developer.

Python is a standard course in Chinese elementary schools; development isn't going anywhere.

4

User1539 t1_iy0fzjt wrote

What use will that be when you can describe your needs in a natural language to an AI and it will create the application for you?

At that point, why have specialized applications at all? For instance, why would you have a web page that displays a student's grades when an AI can simply tell you what those grades are? Why have an application for entering them when your AI can do that automatically? Why have a map application when your AI can simply tell you where you are, and where you're going?

The AI becomes not only the creator of the interface, but the entire application as well.

What use is Python in that world?

9

TinyBurbz t1_iy0imex wrote

>What use is Python in that world?

I'll let you know when we get there.

3

User1539 t1_iy0mnes wrote

I suspect it'll be like Latin. Not used, or useful, but still practiced for fun.

1

TinyBurbz t1_iy2hj83 wrote

>Not used, or useful

Latin is used heavily in our modern vernacular.

1

User1539 t1_iy3ce5s wrote

No one is speaking it as a language. I'm sure people will still describe things to AI in terms from computer science.

0

TinyBurbz t1_iy4b5xy wrote

>I'm sure people will still describe things to AI in terms from computer science.

Unless they are building the AI, or the language the AI is built on, or building a dataset, or creating codeblocks for AI libraries.

1

User1539 t1_iy4itij wrote

I'm not sure what you're getting at. Obviously we're talking about after a certain level of machine learning has taken place, but we're already seeing experiments in self-coding AI and Copilot, and I'm sure some people using Copilot are working on machine learning algorithms.

Besides, a lot of machine learning requires massively parallel systems, like CUDA, that are already hidden behind library abstractions as it is.

As I've explained in earlier comments, we're already so far abstracted from the bare-metal workings of a modern processor that we're closer to what I'm describing than to what non-coders imagine we're doing.

We use increasingly high-level languages to describe behavior that's abstracted away by virtual machines and operating systems, and all of that is handled by layers of compilers and interpreters.

There's already very little difference between saying 'Hey, AI, I need an application that will take in a username and password, encrypt the password, and safely store both in a database for later' and using a modern boilerplate code generation system.

We're almost there for CRUD apps and basic web interfaces. We can explain the intricate pieces to something like Copilot already.
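To make that concrete, here's a minimal sketch, in Python, of the kind of boilerplate that prompt might produce (the function and table names are invented, and it salts and hashes the password rather than 'encrypting' it, since that's the standard practice):

```python
# Minimal sketch of the "username/password" boilerplate described above.
# hashlib, secrets, and sqlite3 are all in the Python standard library;
# the table and function names are just illustrative.
import hashlib
import secrets
import sqlite3

def store_user(db_path: str, username: str, password: str) -> None:
    """Hash the password with a per-user salt and store both safely."""
    salt = secrets.token_bytes(16)
    # PBKDF2-HMAC-SHA256 with a deliberately slow iteration count.
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    with sqlite3.connect(db_path) as conn:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS users"
            " (username TEXT PRIMARY KEY, salt BLOB, hash BLOB)"
        )
        conn.execute(
            "INSERT INTO users (username, salt, hash) VALUES (?, ?, ?)",
            (username, salt, digest),
        )

store_user("users.db", "alice", "hunter2")
```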

Tools like this already exist, using current machine learning to push us toward the next iteration of tools, which will use more advanced machine learning, and so on.

We probably won't be completely aware of when we've passed the threshold, because it'll look like just another really neat plugin for your favorite IDE.

1

TinyBurbz t1_iy4kx0v wrote

>There's already very little difference between saying 'Hey, AI, I need an application that will take in a username and password, encrypt the password, and safely store both in a database for later' and using a modern boilerplate code generation system.

That's the thing, though. Why use an AI when all the AI does is spit out code from a generator and then add whatever modifications you specified from a library of human code? 'Self-coding' AI aren't programming from scratch; they all use libraries. Why call it AI?

1

User1539 t1_iy4tr9y wrote

I think you're conflating two different aspects of the argument.

You seem to be suggesting that if the code produced ultimately just adds to, modifies, or uses existing codebases, then it's not 'AI'; or that if it's not written 'from scratch', then it's not 'AI'.

There are a few things to break down here. First, the generated code isn't the AI, and if the AI is just stitching together libraries to achieve a goal, well, that's what humans are doing too.

Most libraries will be rewritten, by humans, over time, because new languages are invented and newer design patterns are accepted, etc ... and those new libraries, right now, are being written with the help of machine learning.

So, the 'produced code' not being wholly original isn't really any different than what people are doing now.

The 'AI' part of the process is where the pattern recognition abilities of machine learning are leveraged to generate working 'code' from human spoken language.

A computer without a trained natural-language processor couldn't be told 'I need a webpage that you log into, that will display the results of a test, where the database of the results is ...'

So, you would tell that to a developer, and count on his years of experience to understand how to pull the results of the test into a database, write a simple application to provide some system of logging in, displaying data, etc ...

If a human were doing that, likely he would use something like Spring Boot to generate boilerplate code, then something like Keycloak to handle the security features, and ultimately a front-end JavaScript framework to handle displaying the data.
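Just to give a flavor of the target (sketched in Python with Flask rather than the Java stack above, and with invented routes, token, and data), the human-built version might start out something like this:

```python
# A toy version of the "log in and view test results" app described
# above, using Flask (a real Python micro-framework). The route, token
# check, and data are invented placeholders; a real app would use proper
# sessions (or something like Keycloak) and a real database.
from flask import Flask, abort, request

app = Flask(__name__)

RESULTS = {"alice": {"math": 91, "physics": 84}}  # stand-in "database"

@app.route("/results/<username>")
def results(username):
    if request.headers.get("X-Auth-Token") != "demo-token":
        abort(401)  # crude stand-in for a real login system
    return RESULTS.get(username, {})

if __name__ == "__main__":
    app.run()
```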

So, where the AI comes in is that it can recognize what the human wants from a natural-language description and build it without the need for any more input than a human would have to give.

We're almost there, too. We can already describe fairly low-level logic, like sorting through a set of data and retrieving a record based on criteria, then using that record to perform a task, with machine learning systems like Copilot.
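For instance, here's a minimal sketch of the kind of comment-to-code completion I mean (the field names and follow-up task are invented for illustration):

```python
# The comment below is the sort of natural-language prompt a tool like
# Copilot can already turn into working code; field names are invented.

# Find the record whose user_id matches the given ID, then mark it active.
def activate_user(records: list[dict], user_id: int) -> dict | None:
    for record in records:
        if record["user_id"] == user_id:
            record["active"] = True
            return record
    return None

records = [{"user_id": 1, "active": False}, {"user_id": 2, "active": False}]
print(activate_user(records, 2))  # {'user_id': 2, 'active': True}
```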

If we see a broadening of something like that, allowing for high-level description of complex algorithms, it'll become the de facto standard for creating future AI, and that AI will be turned right around and used on the problem of understanding natural language and generating code, like a feedback loop.

When the AI is good enough, I'm sure someone will say 'rewrite all these libraries, but find any bugs (and there are plenty), and fix them'.

Then we'll see the tables turn. We'll have AI using code written by AI, to produce applications as described to it from humans speaking natural language.

The compiler is already doing some optimization, too. If you code something in a human-readable but ultimately inefficient way, the compiler will likely just reorganize it to be more efficient when it generates machine code.
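You can see a tiny version of this even in CPython; a quick sketch with the standard dis module shows a 'readable' constant expression folded away at compile time:

```python
import dis

def seconds_per_day():
    # Written the human-readable way; CPython's compiler folds the
    # constant arithmetic, so no multiplication happens at runtime.
    return 24 * 60 * 60

# The disassembly shows the constant 86400 loaded directly rather
# than three multiplications being performed.
dis.dis(seconds_per_day)
```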

A good example of where things may go is that AI is starting to find some interesting algorithms in pure math. An important one to pay attention to is matrix multiplication, because it's something computers have to do all the time, and it's tedious and difficult to optimize. In general, there is one good way to do it, and that's what any human will code when asked.

However, under certain circumstances, for specific sizes of matrices, you can optimize the algorithm and save the computer a ton of resources.

Almost no developer today even knows these algorithms exist; they're basically an AI curiosity. Even knowing they exist, I'll bet practically no one is using them, because the time and effort to study and code them would outweigh the general performance gain from implementing them.
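For a taste of what's down that road, here's Strassen's classic 2x2 trick in Python: 7 multiplications instead of the naive 8. It's the human-discovered ancestor of the machine-found algorithms, shown for flavor rather than as one of the AI-discovered ones:

```python
# Strassen's 2x2 scheme: 7 multiplications instead of the naive 8.
def strassen_2x2(A, B):
    (a, b), (c, d) = A
    (e, f), (g, h) = B
    m1 = (a + d) * (e + h)
    m2 = (c + d) * e
    m3 = a * (f - h)
    m4 = d * (g - e)
    m5 = (a + b) * h
    m6 = (c - a) * (e + f)
    m7 = (b - d) * (g + h)
    return [[m1 + m4 - m5 + m7, m3 + m5],
            [m2 + m4, m1 - m2 + m3 + m6]]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(strassen_2x2(A, B))  # [[19, 22], [43, 50]] -- matches naive multiply
```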

What we'll see, and are frankly already starting to see, is that an AI will recognize those rare, special conditions under which it can optimize something, and will generate the code to do so.

So, it really won't be long before we see a re-implementation of a lot of those libraries and stuff.

Then we'll all be stitching together AI code ... except, probably not, because we probably won't be coding at all. We'll just be describing our needs in natural language, and the AI platform will do the development.

1

visarga t1_iy3l7zy wrote

> What use will that be when you can describe your needs in a natural language to an AI and it will create the application for you?

The same thing happened to learning English: it used to be the smart choice, but now translation software has removed that barrier.

1

User1539 t1_iy3psgp wrote

Yeah, and the thing is, we're nearly at the natural-language stage, where computers can 'understand' what we're describing. On the other end of things, compilers are creating machine code that's so far removed from our 'code' that you couldn't recognize your compiled output anyway.

So, what 'programming' is, in 2022, is using a specialized language to describe to a compiler what machine code to produce.

If you just had a little machine learning on both ends of that, things would change dramatically. The 'code' you describe could be less precise (no more missing semicolon), and the machine code produced might be much more compact and efficient (recognizing specific cases of matrix multiplication that can be optimized).

We're already basically doing this. It's just one little step to say, 'Why can't I just tell the computer I need to read through all the records and get the one with the user ID I'm working with?'

With Copilot you basically can say that, and as it improves there won't be a need for 'programming languages', just natural-language descriptions of what each piece does. Then, as AI gets better, those pieces will get bigger and bigger until you're just describing applications at a high level.

Eventually you won't even have 'applications'; you'll just describe to the AI what you need, and it'll provide it in a form it knows is intuitive to you.

2

Polend2030 t1_ixzx9e1 wrote

I read the same about truck drivers and self-driving vehicles, but those aren't that safe yet and won't replace drivers this decade. How good are AI tools compared to a developer?

2

User1539 t1_iy0g5ni wrote

Eh, we'll see where both self-driving and self-coding are in a decade. I won't retire for two decades or more, so I'm not suggesting it'll happen tomorrow.

5

NefariousNaz t1_iy2r01e wrote

Is there going to be an issue with a gap in knowledge due to a lack of low-end work?

1

User1539 t1_iy3d99g wrote

Well, how many people write code in assembly now?

I can tell you that I used to do a lot of assembly. I even wrote an x86 compiler for fun, played with the Z80 to write Game Boy games, and did PIC microcontroller work for a while on an engineering team.

I don't think I could write anything meaningful for any new processor, really. I could write enough to get a driver working or something, I'm sure ... but memory management and multi-threading?

Truth be told, we already have that gap. No one is writing much of anything substantial without tools that handle the low-end stuff. Most new languages don't even let the developer manage memory, and even when you do, it's mostly 'virtual memory' as presented to the program by the operating system, because the 'real' memory is being managed a layer below most running programs.

We keep abstracting things away, for different reasons.

Most developers have never written a simple 'hello world' in assembly, and even computer engineering students, whose entire purpose is to understand that level of computing, probably haven't written anything that really uses the full feature set of a modern processor.

1