Comments

tinyogre t1_j3cgh2u wrote

The current version is funny to me. In my initial tests it produced reasonable-looking code. The first time I tried to use it for something real, though still pretty simple, the output again looked completely reasonable. Everything made total sense and it even explained itself well.

I put it in my actual project and it turned out the APIs it was trying to use were completely fabricated. No bearing on reality at all. I went back and told it so and it apologized and gave me a different version using a different set of non-existent APIs. Gave up and did it myself after all.

I think the APIs it wanted me to use would have been better than the ones that actually exist, for my purpose at least. But they just don't exist, and that really underlined the current weaknesses of the platform for me. In code as well as natural language, it's an extremely good producer of bullshit and only marginally good at producing useful answers.
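
This failure mode is easy to demonstrate with a toy sketch (the fabricated `json.parse` call below is my own illustration, not an example from the thread): a hallucinated API can look perfectly plausible and still only fail at runtime:

```python
import json

payload = '{"answer": 42}'

# A model might confidently suggest json.parse(), borrowing the name from
# JavaScript's JSON.parse -- but Python's json module has no such function:
try:
    data = json.parse(payload)  # fabricated API; looks completely plausible
except AttributeError as err:
    # The error only surfaces when the code actually runs.
    print(f"hallucinated API: {err}")

# The real, existing API for the same task:
data = json.loads(payload)
print(data["answer"])  # -> 42
```

The snippet reads like textbook code right up until the `AttributeError`, which is exactly why hallucinated APIs are hard to catch by review alone.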

9

DukkyDrake OP t1_j3cwfmw wrote

No existing AI tool really understands anything; this tool produces probabilistic text. That's why it won't do your work for you: it can't produce precise and dependable results. It will make you more productive as a programmer, but it's incapable of directly replacing what a programmer does.

5

blueSGL t1_j3du306 wrote

You can bet dollars to doughnuts that chatGPT is being run against real environments in training.

You know how it gets things wrong, and you need to keep prompting it until eventually it gets the thing correct?

That's happening at scale.

Everything is being recorded, and every test case where it finally generates working code is a new piece of training data.

With just the current dataset and the ability to feed known-good answers back in, this could bootstrap itself up in capability.

But of course it's not just using the data that's being ground out internally, it's also going to be training on all the conversations people are having with it right now.
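
A toy sketch of the feedback loop described above (the candidates and the "unit test" are illustrative, not anything OpenAI has published): generated snippets are executed, and only the ones that pass a check are kept as new training examples:

```python
def run_candidate(code: str) -> bool:
    """Execute a generated snippet and report whether its check passes."""
    env: dict = {}
    try:
        exec(code, env)                 # run the candidate in a scratch namespace
        return env.get("result") == 4   # the "unit test": does it compute 2 + 2?
    except Exception:
        return False                    # broken code is simply discarded

# Imagine these came from successive model attempts at the same prompt:
candidates = [
    "result = 2 + '2'",   # plausible-looking but raises TypeError
    "result = 2 * 2",     # wrong logic, right answer -- slips past the weak test
    "result = 2 + 2",     # correct
]

# Every candidate that passes becomes a new (prompt, working-code) pair:
new_training_data = [c for c in candidates if run_candidate(c)]
print(len(new_training_data))  # 2
```

Note the middle candidate: a weak test lets wrong-but-lucky code through, which is why this kind of execution filtering improves the data without guaranteeing correctness.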

2

gay_manta_ray t1_j3g2rx2 wrote

You can get good answers if you ask it to refactor the code repeatedly, and often the comments on the code (if you ask it to provide them) are accurate after a certain point.

The idea that this will replace programmers is comical, because you have to be a programmer to understand the code, understand why it does or doesn't work, and understand what to ask ChatGPT to refactor. You have to already be a programmer to use ChatGPT to program, and that's what people who don't program don't seem to understand at all. It will be a useful tool as it improves, and it will make programmers more productive, but it will not replace them.

1

just-a-dreamer- t1_j3c8hfr wrote

Working hours saved means programmers fired, eventually.

8

DukkyDrake OP t1_j3cam5e wrote

The cheaper useful goods and services become, the more of those goods and services are consumed.

3

just-a-dreamer- t1_j3cay8n wrote

It is fantastic...or terrible. Depending where you are.

I am curious what people will expect of government in the future. And how they relate to each other in terms of status and wealth.

If the predictions are true, working hours will decrease dramatically.

1

DukkyDrake OP t1_j3cdyvt wrote

It's almost always white-knuckle terror for the individual.

As someone who does old-school automation, I can tell you that bespoke engineering costs are the reason we don't have a more technological society. The return on investment simply isn't there to justify crafting solutions for the vast majority of economically valuable tasks. Renting the productive time of poorly educated humans is still the gold standard.

These tools will also allow tapping a much larger pool of lower-skilled programmers. That lower cost basis will bring the next, larger tier of low-hanging fruit within reach of partial automation.

8

just-a-dreamer- t1_j3cf8g8 wrote

How will programmers react then? If you spend years in college, take out student loans, and spend a career developing specialized skill sets... all for either being replaced or remaining at a low pay grade.

1

DukkyDrake OP t1_j3cz0vt wrote

How have people reacted to similar threats to their profession since time immemorial? How are artists currently reacting to their impending commercial existential threat? They will react badly, and that's understandable, but it ultimately doesn't matter on the macro scale. Expecting the world to slow down or make a special exception for you isn't reasonable; they will all do what others have done in the past and accept it in the end, because there is no recourse. Society will be better off for it, assuming the cultural ethics of your local society has a fair amount of humanism at its core. If it doesn't, a lot of people could be in for some difficult times.

4

just-a-dreamer- t1_j3d2ekh wrote

I predict some professions will rush to seek government intervention to stop AI automation in their field, lawyers and medical professionals especially. They will demand more board-licensing procedures for anyone to be allowed to engage in their respective professions.

Programmers could also argue that the government must step in and put regulations in place to protect their status, creating mandatory job positions like, say, "AI controller".

An age of big government might be on the horizon where everybody is demanding protection and job security.

2

DukkyDrake OP t1_j3d51ej wrote

>I predict some professions will rush to seek government intervention to stop AI automation...

I expect they will, and some will achieve success, even if only at the local rather than national level. It won't matter. It's a big world; fear of losing out to your economic rivals will fuel the march of progress.

2

just-a-dreamer- t1_j3d63tn wrote

Do you believe semi socialist economies like China will fare better with AI automation?

They have ways to direct the labor market that Western powers are not used to. They are also experienced in creating government jobs from scratch, useful or not.

Telling a highly paid, respected white-collar professional he is not needed anymore is one thing; figuring out what to do with such a displaced person is another problem.

1

DukkyDrake OP t1_j3dqgqx wrote

I see very few signs of economic socialism in China; I think it's mostly lip service at this point. The only thing they have going for them is that they don't have a visceral distaste for the idea, at least at the political level.

For a supposedly socialist country, the average citizen over there doesn't appear to value the idea of the public commons at all. If it's not their property, they just don't care; perhaps that's a symptom of socialism itself. It manifests as extreme self-centeredness and selfishness. Sometimes I think they have more in common with the outlook of the most militant US conservatives.

The US doesn't really do industrial policy ("not picking winners and losers"). Existing businesses don't want to be displaced, and the new winners don't always have the same worldview as the traditional winners. Politics always comes first.

If handled badly, the coming transitions could swell the ranks of the homeless until it threatens to destabilize a country. Authoritarian countries can remain stable with a high degree (>50%) of crushing poverty, not sure if an indirect democracy would fall apart before reaching such a state.

Do nothing politically and Johannesburg could be the future.

>The Economics of Automation: What Does Our Machine Future Look Like?

2

just-a-dreamer- t1_j3dvnqw wrote

Honestly, I wouldn't worry about the lower classes, for they will always get crushed economically; at least that's what history shows. A sad reality of life, and I wish it were different.

The problem, in my opinion, is not the homeless but college graduates from middle-class backgrounds. What will happen when 50% of college graduates don't find a job within 5 years? How will all these ambitious 20-somethings react?

China put down the mass demonstrations of students in 1989 with military force. It was not the lower classes that caused trouble; it was students and college graduates.

Interesting times ahead. I think governments will expand to absorb some of the surplus labor, to keep enough people loyal and calm.

1

DukkyDrake OP t1_j3e8p5k wrote

There are no special people; the future homeless are just college grads without a job, after their parents are no longer able to make their own house payments. It's no different from the reasons there are currently doctors and PhDs driving taxis in Cuba. Degrees are only valuable if someone with capital values those qualifications.

>governments will expand to absorb some of the surplus labor to keep enough people loyal and calm

Why spend money on "make work" when the government can mitigate a future college-grad uprising by ensuring an ample supply of cheap drugs and video games?

Best to be stable before things go south. This is all just idle speculation; everything could easily turn out just fine, and there's no real way to know where things are headed. I usually bet on and prepare for the default outcome: what will happen if no special action is taken to ameliorate foreseen difficulties. That's usually the cheapest option for a government at any given moment: do nothing and hope the problem goes away.

2

just-a-dreamer- t1_j3ef3uq wrote

Well, I have read demographic data that is at cross purposes with the fears about AI automation.

Apparently China has a gigantic problem with an aging population, as do many other countries in Southeast Asia.

The US and Europe also face problems. I don't know; maybe AI automation will come just in time to save the day as more and more people are about to retire.

Or maybe all countries are in trouble. Depends how deep and fast the transformation happens and how governments will react.

1

DukkyDrake OP t1_j3fi7n6 wrote

We are approaching the age where the problem will be "what to do with legions of unwanted workers who serve no useful function". The problem of "not enough young workers and consumers" is only a problem if the timing is wrong. If AI isn't reliable enough to function in the real world within the next 20-30 years, that probably means it's a much harder nut to crack than assumed, and it's not going to be solved before that demographic problem becomes a massive drag on global GDP.

1

onyxengine t1_j3cw0rt wrote

Haven’t seen this expressed better by anyone else

1

ReignOfKaos t1_j3dqq4e wrote

Historically, better abstractions have not led to less developer demand; quite the opposite. Building better abstractions is what programmers have been doing since the invention of programming.

1

Laagar t1_j3e6chf wrote

ChatGPT finally allowed me to save time at work. GitHub Copilot had previously cost me an hour or so with missing parentheses in JavaScript code, when its mistaken suggestion led to otherwise unnecessary debugging.

I've been following AI in programming pretty closely, so I realised that a certain situation at work, which required modifying a Bash script, was well suited to it. ChatGPT solved the issue on the first try, saving me an hour or so and leaving me with a net positive in hours saved overall.

Still, there needs to be much greater context awareness, on the level of a large codebase of hundreds of thousands of lines, before it can genuinely threaten my job, even as just a junior software developer.

2

Lawjarp2 t1_j3cv7mt wrote

Yeah, but I can't put any code I write into ChatGPT, because it could eventually lead to copyright violations, not to mention that it's like leaking private code to OpenAI. I guess they will eventually provide a more private version of it.

Edit: Also, for now, it produces buggy code that sometimes has serious logical flaws. So the author is speaking utter garbage by saying GPT-3.5 already makes him a 10x'er.

A true 10x'er would be reliable in complex scenarios; this tool is the exact opposite to work with.

1

DukkyDrake OP t1_j3d3ff9 wrote

The author isn't expecting it to produce fully formed, reliable code for complex scenarios; that's the main point of his post. No existing AI tool is reliable, and no one currently knows how to make one reliable.

A 10x productivity gain doesn't mean the AI tool is doing his work for him, or that he doesn't have to fix up buggy code.

1

Lawjarp2 t1_j3d736z wrote

Fixing up buggy code takes more time than writing new code, especially when you write less and less on your own. I'm not saying it isn't, or rather won't soon be, useful. What I'm saying is that the article is the bullshit here: had he really used it in a real-life job, he would know it's not good enough for anything but the most basic stuff, and even then it's unreliable.

"10x" is just a term for high-performing individuals who can do things others can't. Adding ChatGPT, as it is now, to a normal person won't make that person a 10x'er.

1

jackflash223 t1_j3fgpz8 wrote

I completely agree. Fixing bugs absolutely can take more time than producing new code, and the time required scales dramatically the deeper the bug is nested inside the code base.

I can see where ChatGPT might be 10x faster than searching Google when someone has a question, but I'd stay far away from implementing most of what it spits out in a production application. That goes double for enterprise-level applications, especially if the person generating the code doesn't understand how to produce it on their own.

1

[deleted] t1_j3ezwxe wrote

[deleted]

1

nutidizen t1_j3j3e9w wrote

I am, and it's very large-scale software I'm developing on a day-to-day basis.

> ChatGPT is so so far away from taking developer jobs.

ChatGPT is, and it never will. But the next AI might be closer than ChatGPT, and it's not far away.

> It's not even really useful at all right now

I can already use it to explain a regex, outline some WinAPI calls, write documentation, and do some repetitive code tasks.
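
The regex-explanation use case maps nicely onto a feature Python already has (the version-string pattern below is my own example, not one from the thread): `re.VERBOSE` lets you keep the kind of line-by-line explanation you'd ask the model for as inline comments:

```python
import re

# Compact pattern: matches a semantic version string like "1.4.2"
compact = re.compile(r"^\d+\.\d+\.\d+$")

# The same pattern, annotated the way you might ask ChatGPT to explain it;
# re.VERBOSE ignores the whitespace and comments inside the pattern:
annotated = re.compile(
    r"""
    ^        # start of string
    \d+      # major version: one or more digits
    \.       # literal dot
    \d+      # minor version
    \.       # literal dot
    \d+      # patch version
    $        # end of string
    """,
    re.VERBOSE,
)

print(bool(compact.match("1.4.2")))    # True
print(bool(annotated.match("1.4.2")))  # True
print(bool(annotated.match("1.4")))    # False, patch component missing
```

Pasting the annotated form back into the codebase keeps the explanation from drifting away from the pattern it describes.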

> Nobody is 10xing their output because of ChatGPT unless they weren't a very good developer to begin with

I agree.

> If you actually pushed the code that it generated, your company would be out of business within the week.

You're completely disregarding the extremely fast progress that's right behind the door...

2

[deleted] t1_j3j54fo wrote

[deleted]

2

nutidizen t1_j3jktgk wrote

> Everyone here also seems to think that if you scale up LLMs enough you suddenly get an intelligent system

Maybe not. But you will definitely get a system able to create software much more efficiently than any human.

1

DukkyDrake OP t1_j3fgz7g wrote

> Now imagine connecting it to enterprise software.

Something no one is doing. Some fear it's just about there, and it's not.

1

nutidizen t1_j3j3kyy wrote

Most software developers I speak to just say it's a gimmick, blah blah blah, completely ignoring exponential technological growth and the law of accelerating returns. This is just a small preview of what's about to come.

1