
jugalator t1_iy0ajhd wrote

Yes, I'm not that convinced of an imminent "end" to human software development. Sure, programming may become less manual, but I think software architecture/design will remain manual for the foreseeable future.

I can compare it to me already getting an awesome oil painting out of Midjourney. It feels like anything is possible, with a ton of power at my fingertips and the text prompt I give it.

BUT! That's not helpful at all when it comes to matching a client specification. Say a new tool is supposed to integrate with the output files of a financial package that was made obsolete a few years ago but still has a decade left before being phased out, so they need something to bridge the gap. This is a pretty normal scenario where I work.

An AI won't help you there, just like Midjourney won't help me perfectly create a drawing that matches a client spec to the letter. It'll create something, sure, but it's only going to impress under the assumption that there is no clearly defined spec and it has a ton of leeway in what it creates. If it can handwave something out for you, and that is all you ask of it, then sure, it's a great tool. If not, it's awful. I can tell Midjourney to recreate the Mona Lisa, but only because it's been trained on that popular painting specifically. Try instead to describe her without using her name and you're facing hell, even if Midjourney is fantastic at painting.

So, I think these jobs will involve a ton of guidance, and sure, jobs will disappear. Not the entire field of software development involving humans, though. And a current programmer who keeps reasonably on top of things will probably transition naturally into similar roles, maybe just at a slightly higher level. But you can rest assured that not just any guy will start whipping together custom AI-guided Python apps anytime soon, even with AI guidance. You'll still need to know Python to deal with the quirks the AI leaves behind and to fill in the gaps, to begin with. Packaging, distribution, client contacts and bug reports, updates, dreadful long meetings, etc. The entire lifecycle is still there.

7

Noname_FTW t1_iy0c675 wrote

I agree. I think there will be companies that use AIs to create simple software solutions in a Lego-like way, where anyone can put together their own software, much as anyone can already build their own homepage today.

But once you get into very complex and specific specifications, you will need skilled humans who can guide AIs to the correct result.

Anything else would require AGI, and at that point we'd have basically greater-than-human intelligence competing against humans. At that point we can no longer sustain the current concept of a labor market.

7

gbersac t1_iy2yrsj wrote

On one hand I agree with you; on the other hand, we all thought that creativity would be very hard for AI. Result: AI turned out to be better at automating creativity (DALL-E) than at automating driving.

It seems to be pretty hard to forecast what AI will be good at.

1

jugalator t1_iy339dx wrote

Yeah sure, my argument isn't that it will be poor at creativity. It's already great at that. But whether it can match fluctuating client specs (which shift with the business situation, whichever boss they just hired, and the vision he or she has) and work within their lifecycle policy is still unproven, and that introduces a ton of human, illogical factors.

Or, if you don't work as a consultant like me and instead, say, write iOS games, the tricky bits turn into market analysis and understanding what your gamers want.

The act of programming is sort of the easiest problem in software development, lol

But yeah, if that's all you do, and you're commanded by someone "higher up" in one-way, top-down communication, those jobs are probably the most at risk?

In my experience, though, that's often only part of our jobs. I transitioned out of that role alone within my first three years or so.

1