jedi_tarzan

jedi_tarzan t1_ja9o85n wrote

The fear-mongering is what you see with any new tech. I genuinely, to my core, believe that there is no near future where LLMs or AI replace engineers.

However, engineers who make use of AI will replace engineers who don't.

Look at something like Copilot. The future is AI-assisted programming. We are nowhere close to general AI that can actually code in place of a human. These models hallucinate regularly, and the more obscure the required knowledge, the more BS they make up to cover for it.

I work in DevOps, and the confidently wrong manifests it writes for Istio are... hilarious.
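To give a made-up illustration of the kind of mistake I mean (not an actual transcript, just the flavor of it): the YAML looks plausible and the field names are real Istio concepts, but they land on the wrong resource.

```yaml
# Hypothetical "confidently wrong" suggestion, invented for illustration.
# It looks reasonable, but trafficPolicy belongs on a DestinationRule,
# not inside a VirtualService route -- Istio either rejects it or it
# silently does nothing, depending on validation settings.
apiVersion: networking.istio.io/v1beta1
kind: VirtualService
metadata:
  name: reviews
spec:
  hosts:
    - reviews
  http:
    - route:
        - destination:
            host: reviews
            subset: v2
      trafficPolicy:          # real field name, wrong resource
        loadBalancer:
          simple: ROUND_ROBIN
```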

7

jedi_tarzan t1_j9vg0ri wrote

> I am so sorry this is long as shit & it ends on a downer. It's just a really morose & unpleasant read.

I read it all, though. You're probably tired of hearing it, but I'm still sorry about your dad. That sucks.

Anyway... damn. Yeah. You've made excellent points.

I'm a technophile, I believe in progress, and I watch on the daily as outmoded corporate interests stymie or erase technical progress that could become cultural progress. I watch politicians flagrantly ignore scientific evidence on climate change and disease, while journalists encourage them.

So perhaps much of my opinion about AI comes from those feelings and sympathies. But I think you're right about much, maybe all, of what you've said.

So now, I don't know. I don't know the best way forward. I don't think AI is going away. Companies are working on tools to detect AI-generated text, but technological progress also means the generators will keep getting smarter.

Thank you for writing that all out, though.

2

jedi_tarzan t1_j9q6dhv wrote

I disagree with some of this. "The struggle is the process" wafts of "I suffered, so you should too."

If our tools and technology progress past the point where a certain test is useful, we move on and make new tests.

The math comparison is not useless. No one thinks writing and arithmetic are the same, so pointing that out doesn't move the discussion forward. The core point of the comparison is that when technology can perform part of the process, we change what we care about teaching. I don't know about you, but for me essays were often basically take-home busywork.

As far as writing essays go, LLMs didn't exist when I was in school, but CliffsNotes did. SparkNotes did. There was enough internet to plagiarize from with some clever editorializing. "Academia" has always had this problem. Some students will only learn to the degree they need to. And what industries are harmed by students fudging their essays? What jobs?

Won't those jobs also have access to the same tools? I'm in a very high-level technical field, and I now regularly use LLM tools to get me started on templates for YAML files, Terraform modules, etc. If anything, learning how to use them well will be the skill.
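To be concrete about what I mean by "get me started on templates" (a generic sketch with placeholder names, not anything tied to my actual work): the value is in stubbing out boilerplate like this so I only have to fill in the real details.

```yaml
# Generic Kubernetes Deployment skeleton of the sort an LLM is good at stubbing out.
# Every name, image, and number here is a placeholder to be replaced.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: example-service
  labels:
    app: example-service
spec:
  replicas: 2
  selector:
    matchLabels:
      app: example-service
  template:
    metadata:
      labels:
        app: example-service
    spec:
      containers:
        - name: example-service
          image: registry.example.com/example-service:latest
          ports:
            - containerPort: 8080
          resources:
            requests:
              cpu: 100m
              memory: 128Mi
```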

2

jedi_tarzan t1_j6jbxwx wrote

Well, crypto is sort of the standout there.

But the other two absolutely did happen. Amazon is still killing mom-and-pop shops. They're not dead, but they're severely reduced.

And industrialization absolutely decimated manufacturing jobs, at least in the developed world.

So following that trend, will these AI models replace all humans? No, of course not.

But jobs that used to be covered by ten copywriters or whatever will instead be handled by one guy who knows how to maintain the AI cluster. There will be a reduction in once-"safe" jobs.

2

jedi_tarzan t1_j5q4boi wrote

It's crazy how negative some of the comments about ChatGPT's accuracy have been.

Like they're disappointed it's "weak" AI, rather than correctly recognizing it as an amazing demonstration of a very specific goal: language processing.

Like yeah, it's usually wrong at least a little. Midjourney gives people 8 fingers on each hand; ChatGPT sticks k8s Ingress configuration inside a Service definition.
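For anyone who hasn't hit that particular mix-up, here's a made-up example of what it looks like (illustrative only, not a real ChatGPT answer): Ingress routing rules jammed into a Service spec, where the API server has no idea what to do with them.

```yaml
# Hypothetical mix-up, invented for illustration: "rules" and "paths" are
# Ingress spec fields, so kubectl will complain about unknown fields on a
# Service (or silently drop them, depending on validation settings).
apiVersion: v1
kind: Service
metadata:
  name: web
spec:
  selector:
    app: web
  ports:
    - port: 80
      targetPort: 8080
  rules:                      # belongs under an Ingress spec, not a Service
    - host: web.example.com
      http:
        paths:
          - path: /
            pathType: Prefix
```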

But that's a matter of misuse and misunderstanding. Hopefully it doesn't impact the technology's growth and development.

8