new_name_who_dis_ t1_j5zoc0t wrote

They are not gradient-descent based (so they don't need GPU acceleration as much, though sometimes they still do, depending on the problem), but they are definitely ML.
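To make that concrete, here's a minimal sketch of gradient-free optimization, a toy (1+1)-evolution strategy on a made-up fitness function (all names and numbers are illustrative):

```python
import random

# Toy (1+1)-evolution strategy: maximize a black-box function
# using only mutation and selection -- no gradients anywhere.
def fitness(x):
    return -(x - 3.0) ** 2  # peak at x = 3, but the loop never "knows" that

x = random.uniform(-10, 10)
for _ in range(1000):
    child = x + random.gauss(0, 0.5)   # mutate
    if fitness(child) >= fitness(x):   # keep the better of parent/child
        x = child

print(x)  # converges near 3.0 without ever computing a derivative
```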

12

fernandocamargoti t1_j5zs45e wrote

They're not about learning from data, they're about optimization. They belong to the broader AI field of study, but I wouldn't say they are ML. They serve a different purpose. There is some research on using them to optimize models (instead of gradient descent), but that's not their main use case.

−7

ReginaldIII t1_j5zv9gz wrote

That's such a tenuous distinction, and you're wrong anyway, because you can pose any learning-from-data problem as a generic optimization problem.

They're very useful when your loss function is not differentiable but you still want to fit a model to input+output data pairs.
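A minimal sketch of that, assuming a toy threshold model and 0/1 accuracy as the fitness (the dataset and model are made up for illustration):

```python
import random

# Toy dataset: label is 1 when x > 2 (the model has to discover the threshold)
data = [(x, int(x > 2)) for x in [-3, -1, 0, 1, 2.5, 3, 4, 5]]

def accuracy(threshold):
    # 0/1 loss is piecewise constant, so gradient descent gets no signal here
    return sum(int(x > threshold) == y for x, y in data) / len(data)

best = random.uniform(-5, 5)
for _ in range(500):
    child = best + random.gauss(0, 0.3)  # mutate the single parameter
    if accuracy(child) >= accuracy(best):
        best = child

print(best, accuracy(best))  # fits the pairs despite the non-differentiable loss
```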

They're also useful when your model parameters have domain-specific meaning and you can derive rules for how two parameter sets can be meaningfully combined with one another.
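For example (hypothetical domain, just to show what a meaningful recombination rule might look like):

```python
import random

# When each parameter has a meaning of its own, crossover rules can too.
# Hypothetical example: parameters describing an antenna design.
parent_a = {"length_cm": 12.0, "bend_angle_deg": 30.0, "wire_gauge": 18}
parent_b = {"length_cm": 20.0, "bend_angle_deg": 45.0, "wire_gauge": 22}

def crossover(a, b):
    # Per-parameter rule: continuous geometry blends,
    # discrete parts are inherited whole from one parent.
    return {
        "length_cm": (a["length_cm"] + b["length_cm"]) / 2,
        "bend_angle_deg": (a["bend_angle_deg"] + b["bend_angle_deg"]) / 2,
        "wire_gauge": random.choice([a["wire_gauge"], b["wire_gauge"]]),
    }

print(crossover(parent_a, parent_b))
```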

Decision trees and random forests are ML too. What you probably mean is Deep Learning. But even that has a fuzzy boundary with surrounding methods.

Being a prescriptivist with these definitions is a waste of time because the research community as a whole cannot draw clear lines in the sand.

10

fernandocamargoti t1_j60xagg wrote

Well, what you're talking about are ways to use evolutionary algorithms to optimize the parameters of an ML model. But in my eyes, that doesn't mean they are ML. They share a lot, but they aren't the same. For me, evolutionary algorithms are part of metaheuristics, which is part of AI (which ML is also part of). Different areas and subareas of research do interact with each other. I just mean that the "is a" part is a bit too much in this case.

−1

ReginaldIII t1_j61nlno wrote

Trying to force these things into a pure hierarchy sounds nothing short of an exercise in pedantry.

And to what end? You make up your own distinctions that no one else agrees with and you lose your ability to communicate ideas to people because you're talking a different language to them.

If you are so caught up on the "is a" part: have you studied any programming languages that support multiple inheritance?
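To make the point concrete, a sketch in Python (hypothetical class names, nothing from any real library):

```python
# In a language with multiple inheritance, something can be
# both things at once -- "is a" need not be exclusive.
class MetaHeuristic:
    pass

class MLMethod:
    pass

class EvolutionaryAlgorithm(MetaHeuristic, MLMethod):
    pass

ea = EvolutionaryAlgorithm()
print(isinstance(ea, MetaHeuristic))  # True
print(isinstance(ea, MLMethod))       # True
```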

2

new_name_who_dis_ t1_j601m4q wrote

Gradient descent is also about optimization... You can optimize even neural networks with a bunch of methods other than gradient descent (including evolutionary methods). They don't work as well, but you can still do it.
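A minimal sketch of that, evolving the weights of a tiny toy net by mutation and selection instead of backprop (not efficient, just a demonstration):

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR data -- a classic case gradient descent usually handles;
# here we evolve the weights instead, just to show it's possible.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float)

def forward(params, x):
    W1, b1, W2, b2 = params
    h = np.tanh(x @ W1 + b1)      # one hidden layer, 4 tanh units
    return h @ W2 + b2

def loss(params):
    return float(np.mean((forward(params, X).ravel() - y) ** 2))

params = [rng.normal(0, 1, (2, 4)), np.zeros(4),
          rng.normal(0, 1, (4, 1)), np.zeros(1)]
for _ in range(5000):
    candidate = [p + rng.normal(0, 0.1, p.shape) for p in params]  # mutate
    if loss(candidate) <= loss(params):                            # keep if no worse
        params = candidate

print(loss(params))  # usually ends up close to 0 -- no backprop involved
```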

3