colugo

colugo t1_j2xp63o wrote

Think of it more like this: the total intelligence/capability of all humans greatly exceeds that of any individual human. When we train AI models, we are imbuing them with capabilities derived from many humans. But once trained, a model is easily copied, so we could quickly have a population of them, where each individual AI is equivalent to many humans (and thus greater than any individual human) but perhaps less capable than all humans combined. With enough such AIs, their population eventually becomes more capable than the population of humans.

And then, if the AIs can train new generations from prior AIs, this pattern could repeat and explode.
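A toy back-of-the-envelope sketch of that aggregation argument; every number here is a made-up assumption (population sizes, growth multipliers), not a claim about real systems:

```python
# Toy model: copies are cheap, and each generation trains a stronger successor.
# All constants below are invented purely for illustration.
human_population = 8_000_000_000
capability_per_human = 1.0
total_human_capability = human_population * capability_per_human

ai_capability = 1_000.0   # assume one AI ~ a thousand humans (invented)
ai_population = 1

generation = 0
while ai_population * ai_capability < total_human_capability:
    ai_population *= 10   # copying is cheap, so the population grows fast
    ai_capability *= 2    # each generation trains a somewhat stronger successor
    generation += 1

print(f"after {generation} generations, the AI population out-capables humanity in aggregate")
```

The point isn't the specific numbers; it's that copying plus even modest per-generation improvement compounds, so the crossover arrives in a handful of generations.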

1

colugo t1_j1y5k71 wrote

I kind of feel like the answer is: if you were doing the kind of work that needs more RAM, you'd know.

In deep learning in particular, RAM caps your maximum batch size, which can limit how you train models. I'm not sure which hard limits you'd hit in other kinds of machine learning. More RAM is helpful, sure, but you can usually get by with less if your code is memory-efficient.
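A minimal sketch of what that looks like in practice, assuming PyTorch and hypothetical `model` / `make_batch` objects (both placeholders, not from the thread): when a batch doesn't fit in memory, the usual move is to halve the batch size until the forward/backward pass succeeds.

```python
import torch

def find_max_batch_size(model, make_batch, start=512):
    """Try batch sizes from `start` downward until one fits in GPU memory."""
    batch_size = start
    while batch_size >= 1:
        try:
            x, y = make_batch(batch_size)  # hypothetical data-loading helper
            loss = torch.nn.functional.cross_entropy(model(x), y)
            loss.backward()                # the backward pass uses the most memory
            model.zero_grad(set_to_none=True)
            return batch_size
        except RuntimeError as e:          # CUDA surfaces OOM as a RuntimeError
            if "out of memory" not in str(e):
                raise
            torch.cuda.empty_cache()       # release the failed allocation
            batch_size //= 2               # retry with a smaller batch
    raise RuntimeError("even batch size 1 does not fit in memory")
```

If the largest batch that fits is smaller than you want for training, gradient accumulation over several small batches is the standard way to trade compute for memory.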

9

colugo t1_iv5nzoq wrote

Whichever courses seem higher quality in your specific case. Like, is the professor well-regarded? Is the coursework rigorous and relevant? What do people say about the class?

If you've never done calculus, it's more or less a prerequisite for deeper probability/stats, so I'd lean slightly that way.

2