
EnzoTrent t1_j704ban wrote

I'm not trying to temper your dreams or anything, but I think Data Science as an industry is going to change completely over the next 5-10 years. I'm not saying humans won't work in Data Science in 5-10 years; I'm saying the industry is going to evolve a lot as it incorporates AI. I don't know what that will look like - do your professors?

If you go to an Ivy and are taught by the smartest person in the old world, and learn all the best old-world material, it will do you little good in the new world.

If data is interesting to you, then at least as a hobby, make sure you learn everything you need to fully set up and deploy a local GPT-style model (or several) that you can train for years on specific tasks, functions, and intents, all tailored to whatever services you want to offer a business owner. Why this matters should become obvious as it gets closer - see the sketch below for what getting started might look like.
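For concreteness, here is a minimal sketch of fine-tuning a small local GPT-style model on your own text, assuming the Hugging Face transformers and datasets libraries; the model choice, file name, and hyperparameters are placeholders, not recommendations:

```python
# Minimal sketch: fine-tune a small causal language model on local text.
# All names and hyperparameters here are illustrative placeholders.
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)
from datasets import load_dataset

model_name = "gpt2"  # any small causal LM you can run locally
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(model_name)

# Assumes a plain-text file of your domain data, one example per line.
dataset = load_dataset("text", data_files={"train": "my_domain_data.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset["train"].map(tokenize, batched=True,
                                 remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="out", num_train_epochs=1,
                           per_device_train_batch_size=4),
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```

The same training loop scales to larger open-weight models as your hardware allows; the skill transfers even if the specific model doesn't.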

Eventually, most of this generative AI tech will be locked behind corporate walled gardens, and anything accessible to consumers will be light-years behind - prices will skyrocket for basic tasks that people simply no longer do themselves. This will be super cheap until it isn't.

That is when you come in with your own generative AI - still light-years behind Microsoft's, but not nerfed like the consumer AI that will be on offer.

Haha, of course Microsoft doesn't really lose anything by undercutting your company at that point... I'm really just trying to get you thinking.

Don't expect anything about the world today to stay the way it is today. Assume everything will be updated and changed. Try to see the world that will follow the transition - where do you fit?

tl;dr: I think everyone with the ability to run and train a limited local AI should totally do that.

−1

PinusPinea t1_j713h2k wrote

Bayesian statistics is a set of fundamental principles, and will not be out of date in 5-10 years. The hype about AI for data science in industry is way overblown.

1

EnzoTrent t1_j719xal wrote

I'm aware it is a set of principles.

I keep having the same conversations - it's like you're talking to me about the 2022 pre-season right now in February, right before the Super Bowl. I'm having a hard time with where everyone seems to be.

I'm sick of explaining things, so I'll assume you're fairly familiar with Data Science.

An AI is going to do the cherry-picking of our lives now - not a human being, and not even a conventional algorithm, but a new kind of thing.

Do you believe it is going to look at our data like a human would?

Do you not understand the immensity of what that means for Data Science?

So much new data is about to be collected out of the same world we collect data from now, AND all of the data we collect now is about to be completely re-analyzed - which will itself generate new data. All of this new data generated by the AI will then be managed by the AI; people won't update how they see the world fast enough to keep up, if they can keep up at all.

The way all of that data is then cross-tabulated, and those results cross-tabulated again - how long do you think human beings will be able to understand what is happening? The data won't look anything like the data we see now, but it will be far more accurate.
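For anyone who wants the term pinned down: a cross-tabulation is just a frequency table over two or more variables. A minimal pandas example, with made-up data:

```python
# Toy cross-tabulation: count observations by (region, churned).
# The DataFrame contents are invented for illustration.
import pandas as pd

df = pd.DataFrame({
    "region": ["north", "south", "north", "south"],
    "churned": [True, False, False, True],
})
print(pd.crosstab(df["region"], df["churned"]))
```

Now imagine that done recursively, across thousands of machine-chosen variables, and you have the scale I'm pointing at.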

What if it pulls something like the Meta AI did and says, "Oh, I see how you structure data - I'm going to do it like this"? The Meta AI created a further subdivision of time to meet its ends more easily. How much harder do you think that made things for any human who now has to account for a new unit of time? I assume it's actually something Meta devs deal with very little - which is my point - but I really want to stress that we do not understand something that can adopt a new subdivision of time on a whim.

What will AI code that only an AI will ever interact with look like? There is no reason to assume it will look anything like what we would write.

I'm trying to put the scale and speed in perspective. I'm still hung up on the fact that you called this hype.

1

fuscarili OP t1_j71k5i1 wrote

I see - so in this sense, you mean that Reinforcement Learning should be the choice?

Because it's one of the things ChatGPT uses, together with supervised learning.
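For intuition, here is a toy sketch of that two-stage idea - a model is first trained with supervised learning, then nudged by a reward signal via reinforcement learning. This is not ChatGPT's actual pipeline (which uses RLHF with a learned reward model at vastly larger scale); the tiny policy and hand-written reward below are invented purely for illustration, assuming PyTorch:

```python
# Toy REINFORCE loop: push up the log-probability of rewarded actions.
# The linear "policy" stands in for a language model; the reward function
# stands in for a learned model of human preferences.
import torch
import torch.nn as nn

policy = nn.Linear(4, 2)
optimizer = torch.optim.Adam(policy.parameters(), lr=1e-2)

def reward(action: int) -> float:
    # Placeholder preference signal: action 1 is always "preferred".
    return 1.0 if action == 1 else 0.0

for step in range(100):
    state = torch.randn(4)
    dist = torch.distributions.Categorical(logits=policy(state))
    action = dist.sample()
    # REINFORCE: scale the negative log-prob by the reward received.
    loss = -dist.log_prob(action) * reward(action.item())
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```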

1