jedi_tarzan
jedi_tarzan t1_j9vg0ri wrote
Reply to comment by LettucePrime in Question for any AI enthusiasts about an obvious (?) solution to a difficult LLM problem in society by LettucePrime
> I am so sorry this is long as shit & it ends on a downer. It's just a really morose & unpleasant read.
I read it all, though. You're probably tired of hearing it but I'm still sorry for your dad. That sucks.
Anyway... damn. Yeah. You've made excellent points.
I'm a technophile, I believe in progress, and I watch on the daily as outmoded corporate interests stymie or erase technical progress that could become cultural progress. I watch politicians flagrantly ignore scientific evidence on climate change and disease, while journalists encourage them.
So, perhaps much of my opinion surrounding AI is derived from those feelings and sympathies. But I think you're right on much, maybe all, of what you've said.
So now, I don't know. I don't know the best way forward. I don't think AI is going away. Companies are working on tools to detect AI-generated text, but the same technological progress keeps pushing the generators to get smarter.
Thank you for writing that all out, though.
jedi_tarzan t1_j9q6dhv wrote
Reply to comment by LettucePrime in Question for any AI enthusiasts about an obvious (?) solution to a difficult LLM problem in society by LettucePrime
I disagree with some of this. "The struggle is the process" smacks of "I suffered, so you should too."
If our tools and technology progress past the point where a certain test is useful, we move on and make new tests.
The math comparison is not useless. No one thinks writing and arithmetic are the same, so pointing it out isn't moving the discussion forward. The core point of the comparison is that when technology can perform part of the process, we change what it is we care about teaching. I don't know about you, but essays were often basically take-home busywork.
As far as writing essays goes, LLMs didn't exist when I was in school, but CliffNotes did. Sparknotes did. There was enough internet to plagiarize from with some clever editorializing. "Academia" has always had this problem. Some students will only learn to the degree that they need to. And what industries are harmed by students fudging their essays? What jobs?
Won't those jobs also have access to the same tools? I'm in a very high-level technical field, and I now regularly use LLM tools to get me started on templates for YAML files, Terraform modules, etc. If anything, learning how to use them will be the skill.
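To give a sense of what I mean (a minimal sketch with made-up names, not anything from a real project), the kind of boilerplate an LLM can scaffold in seconds looks like this:

```yaml
# Hypothetical starting template of the kind an LLM can scaffold:
# a bare-bones Kubernetes Deployment that I would then tweak for the
# real app, image, and resource limits.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: example-app                 # placeholder name
  labels:
    app: example-app
spec:
  replicas: 2
  selector:
    matchLabels:
      app: example-app
  template:
    metadata:
      labels:
        app: example-app
    spec:
      containers:
        - name: example-app
          image: registry.example.com/example-app:1.0.0   # placeholder image
          ports:
            - containerPort: 8080
          resources:
            requests:
              cpu: 100m
              memory: 128Mi
            limits:
              cpu: 500m
              memory: 256Mi
```

None of that is hard to write by hand, but having the skeleton appear instantly and then editing it beats starting from a blank file.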
jedi_tarzan t1_j6jbxwx wrote
Reply to comment by LincHayes in Do you think ANY job is safe from AI within the next 50 years? by Aknav12
Well, crypto is sort of the stand-out there.
But the other two absolutely did happen. Amazon is still killing mom-and-pop shops. They're not dead, but they're severely reduced.
And industrialization absolutely decimated manufacturing jobs, at least in the developed world.
So following that trend, will these AI models replace all humans? No, of course not.
But jobs that used to be covered by 10 copywriters or whatever will instead be handled by one person who knows how to maintain the AI cluster. There will be a reduction in once-"safe" jobs.
jedi_tarzan t1_j6cbo2t wrote
Yeah, all 5 of the scientists on Earth are only working on AI.
That's how humans have always progressed through history. By collectively, unanimously working on a single invention or cause at a time. One single one.
jedi_tarzan t1_j5q4boi wrote
Reply to comment by ---nom--- in "By far the greatest danger of Artificial Intelligence is that people conclude too early that they understand it."- Eliezer Yudkowsky. by KiwiTechCorp
It's crazy how negative some of the comments I've seen about ChatGPT's accuracy are.
Like they're disappointed it's "weak" AI, rather than correctly recognizing it as an amazing demonstration of a very specific goal: language processing.
Like, yeah, it's usually at least a little wrong. Midjourney gives people 8 fingers on each hand; ChatGPT puts k8s Ingress configuration in a Service definition file.
But that's a matter of misuse and misunderstanding. Hopefully it doesn't impact the technology's growth and development.
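For anyone outside the k8s world, that mistake is mashing together two separate resources. A Service and an Ingress each look roughly like this (a minimal sketch with made-up names), and the model will confidently mix fields from one into the other:

```yaml
# A Service exposes pods inside the cluster...
apiVersion: v1
kind: Service
metadata:
  name: example-app
spec:
  selector:
    app: example-app
  ports:
    - port: 80
      targetPort: 8080
---
# ...while an Ingress is a separate resource that routes external traffic
# to that Service. Ingress fields do not belong in the Service definition.
apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
  name: example-app
spec:
  rules:
    - host: example-app.example.com   # made-up host
      http:
        paths:
          - path: /
            pathType: Prefix
            backend:
              service:
                name: example-app
                port:
                  number: 80
```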
jedi_tarzan t1_ja9o85n wrote
Reply to I have a high amount of anxiety surrounding the future of my job and AI by Business_Pin4533
The fear-mongering is what you see with any new tech. I genuinely, to my core, believe there is no near future where LLMs or AI replace engineers.
However, engineers who make use of AI will replace engineers who don't.
Look at something like Copilot. The future is AI-assisted programming. We are nowhere close to general AI that can actually code in place of a human. Current models hallucinate with regularity, and the more obscure the required knowledge, the more BS they make up to cover for it.
I work in devops, and the confidently wrong manifests it writes for Istio are... hilarious.
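For reference, a minimal valid Istio VirtualService looks roughly like this (names and host are made up); the confidently wrong ones look superficially similar, which is exactly what makes them funny:

```yaml
# Minimal Istio VirtualService: route traffic for one host to a Service.
apiVersion: networking.istio.io/v1beta1
kind: VirtualService
metadata:
  name: example-app
spec:
  hosts:
    - example-app.example.com        # made-up host
  http:
    - route:
        - destination:
            host: example-app        # the Kubernetes Service backing the route
            port:
              number: 80
```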