MarginCalled1 t1_jegzoof wrote

In my previous life I was an FRA-certified conductor and engineer for Union Pacific. If FritzyBoy does anything, it will be to remove the conductor and leave just the engineer, then hope PTC doesn't wreck on every grade it comes across, or else to rip the train apart.

They were already trying to do that shit while I was with them, and then they decided to go with this 'precision railroading' stuff: they removed something like 16,000 rules, allowed train crews to shove without announced protection, and a bunch of other shit.

I called all the train wrecks back in 2016 when they started whining about profits and stripping rules. Nothing that has happened is surprising to those of us who have worked in the industry. It's pure greed.

2

MarginCalled1 t1_jaqe6yb wrote

We're also discovering new processes, materials, designs, and other factors that allow us to continue on the trajectory I mentioned. I'm actively involved in some of this work.

The primary issue is battery size and capacity; otherwise we would already have robots such as those from Boston Dynamics out completing limited work for us. As I mentioned, batteries have been a heavy area of investment, with a lot of advancements and options coming commercially in the next 2-3 years.

The secondary limiting factor is the ability of current LLMs and AI programs. Also note that most labs have departments that use AI to help design, test, and measure new products and services, and in some cases the AI can write code from a prompt; therefore, as AI improves, so does the technology that supports further advancement.

I'd be willing to bet my aforementioned numbers are very close estimates of where we will be at that time.

8

MarginCalled1 t1_jaqaokw wrote

Hardware is advancing at an exponential rate. Every 2 years, according to Moore's Law (which has held up historically), the number of transistors on a microchip doubles, roughly doubling the processing power available to your electronics.

At the same time software - more specifically AI - is advancing at a similarly exponential rate, doubling in ability/speed every 6 months on average.

Both of the above compound with each other as they progress, resulting in massive jumps in processing power and software capability over short periods of time.
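As a rough back-of-the-envelope sketch of that compounding (the 2-year and 6-month doubling periods are the figures quoted above, not measured constants):

```python
# Toy model of compounding growth: hardware doubling every 2 years
# (Moore's Law) and AI capability doubling every 6 months. The
# combined capability is the product of the two curves.

def doublings(years, period):
    """Growth factor after `years`, given one doubling per `period` years."""
    return 2 ** (years / period)

for years in (2, 4, 6):
    hw = doublings(years, 2.0)   # hardware: doubles every 2 years
    sw = doublings(years, 0.5)   # software/AI: doubles every 6 months
    print(f"{years} yr: hardware x{hw:.0f}, software x{sw:.0f}, "
          f"combined x{hw * sw:.0f}")
```

After just 6 years the combined factor in this toy model is 8 × 4096 = 32768, which is the "massive jumps in short periods" point in numbers.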

As an example, if you look at some of the first console video games released (Mario, Duck Hunt, Transylvania, Excite Bike, etc.), then search YouTube for "GTA 5 4k Ultra Hd Graphics", click on the top result with a bike snippet, and compare the graphics and depth of each, it's nothing less than absolutely incredible.

Then throw in that in the same period we went from wired phone lines to a phone that can call, text, and surf the internet, and can even speak to a program, tell it to create new art and stories, and have it recite the world's knowledge to you plainly in your language, all in less than 5 seconds.

I would say within the next 7 years we will have fully functioning human-like robots capable of most daily human tasks. I'd also guess that by this point a large amount of the human workforce will start feeling the effects of software eroding 'white collar' work.

The exponential nature of our advancement leads me to believe this is true. I would also note all the progress we are seeing in battery technologies and manufacturing discoveries. All three play a role, with AI being the one that will most critically define the next phase of human life, whether we go extinct, reach utopia, or land somewhere in the middle.

16

MarginCalled1 t1_ja1pce7 wrote

Microsoft is testing software within their 'Teams' program that will translate spoken language in real time between multiple parties.

I'd estimate that by 2025 (2 years from now) human translators will start disappearing at a rapid pace. Call center workers will also start seeing large AI-driven layoffs at that point.

Source: I work in AI and have friends all over the industry.

19

MarginCalled1 t1_j9hp5hw wrote

I use ChatGPT to write all of my emails that require more than a line or two, then I go through and make edits as necessary.

We live in strange times. This is like someone in the late 80s writing an E-MAIL to address fatalities; they couldn't even write a normal letter and send it in the mail like everyone else?!

In a couple years we'll look back and nod at how dumb this is.

35

MarginCalled1 t1_j6jw5db wrote

I'd assume that your time-to-create would gradually get faster and faster, given how quickly this particular technology is moving, along with general hardware advancement as well.

You guys are on the cutting edge; some might say you're a little ahead of your time. Regardless, it's welcome innovation, and all I can do is wish you the absolute best. Fascinating stuff.

2

MarginCalled1 t1_j6iu0ye wrote

I'd think the ideal way to handle this would be to create a large enough buffer, say a couple of hours. It wouldn't technically be 'live', but it would be very close, and it would allow for 3D artwork generation.

Additionally, you may be able to save previously generated content for reuse going forward. For example, if I wanted to create a character named Daffy, instead of drawing him each time you could have the AI generate him in every possible motion once and then refer back to that. That would save a ton of compute and shave a lot of time off the processing requirements.

1

MarginCalled1 t1_j31k47u wrote

I never understood why I couldn't use a calculator after I had learnt the basics of a topic. In our time, I think we should be shifting from 'know how to do everything in your head' to 'teach students how to think critically, and how to use available tools to find solutions efficiently'.

I'd argue that traditionally it was very important for students at every level to be able to articulate a topic in their head. Today, however, I'd argue that students coming out of high school should have a basic understanding of topics, plus the knowledge necessary to use these tools to augment their own ability.

Getting into a Bachelor's degree or higher, I think it would be necessary to have a much more solid understanding of each topic of study.

Detractors would rightly ask, 'What if the technology becomes unavailable via a storm or blackout, or the internet is knocked out, or it somehow otherwise becomes inaccessible?' I think that in our modern world, if that happens, everything is so integrated with technology that it'll grind to a halt regardless.

Another good point is that ChatGPT is still an early product. I don't advocate using it professionally in most fields at this point in time unless it's the topic of study. However, I believe the consensus will change very shortly; the progress over the past two years has been mind-boggling.

Everything in modern society relies on computation, so why not augment ourselves? I can honestly say that the vast majority of the math I learned in high school I've never done again, and I work with a ton of data. It's all done and calculated via software and verified with a calculator, mostly by programmers using tools of their own. (Though I admit this needs to be looked at: according to Statista poll data from 2022, 41.32% of professional programmers had a Bachelor's degree and only 21% had a Master's degree.)

I have several friends who work in corporate finance, and while we were having a drink on my back patio over the past summer, they said that they very rarely use anything beyond basic math; the rest is done for them by machines.

Or perhaps we need to rethink what high school is meant to teach. It seems our public education system and way of teaching are stuck in the early 1900s while we've progressed tremendously. Though I admit, as my grandfather used to say, it's easier said than done.

Edited to add Statista poll, Link

5

MarginCalled1 t1_j1p8ia8 wrote

Exactly the kind of opinion that someone who isn't watching closely, or working in a related field, would express.

This technology is advancing incredibly fast; we are at an exponential stage where the technology doubles in ability every 6 months, which is faster than Moore's law by a factor of 4.

Kasparov and Deep Blue faced each other only about 25 years ago, and someone had to physically input the moves that had been played. (For the younger readers: the original 2D Grand Theft Auto was made the same year; now go watch a 4K YouTube video of graphics-modded GTA V.)

Now multiply that pace by 4. Welcome to the AI decade.

Semiconductor, memory, and GPU companies have all started using AI to help with chip research, design, materials, processing, and factory/laboratory/fabrication layouts, and to suggest novel and promising ideas on all fronts.

More of any of those increases AI capability, which in turn drives faster, more precise AI research, continuing the multiplicative circle.

At a certain unknown point, AI will be able to read, write, benchmark, debug, and improve upon itself. These are all tasks that AI is doing at a lower level today. When AI can do all of this, we hit what many people suspect is the singularity: a point in time beyond which we can't predict what will happen, due to the speed of technological advancement.

We are going to see some impressive technological progress in our lifetimes.

1

MarginCalled1 t1_istin3y wrote

I think we are going to see a lot more responses like your original comment going forward. Many people misunderstand and undervalue the sort of work that's being done. In addition, we are starting to get into the realm of nanofactories producing enzymes within your body; many people simply won't understand, or won't be willing to try to understand.

2