Submitted by Valachio t3_10ack6h in MachineLearning

ML is a part of AI but I don't hear about anything coming out of AI that's not done using some ML technique.

Is it fair to say that AI and ML are synonymous now in 2023? Or are there people who are still actively working on non-ML techniques for building AI?

166

Comments


VirtualHat t1_j440ajs wrote

People tend to use AI and ML to mean similar things. But yes, in academia, we still research AI ideas that are not ML. And integrating good-old-fashioned-ai (GOFAI) with more modern ML is becoming an area of increasing research interest.

180

ElectronicCress3132 t1_j45f3yo wrote

> And integrating good-old-fashioned-ai (GOFAI) with more modern ML is becoming an area of increasing research interest.

Any papers you recommend on this topic?

18

gaymuslimsocialist t1_j45gwsl wrote

Any of the papers in the AlphaZero family do this: they combine tree search with learned policy and value functions.

22

FallUpJV t1_j45sb9f wrote

Would you recommend any of those, or is there one in particular that introduces these techniques well?

Edit: I'm asking because I'm not very familiar with what tree search means in the first place.

3

cdsmith t1_j460nf2 wrote

Tree search means precisely that: searching a tree. In the context of AlphaZero, the tree is the game tree. That is:

  • I can move my pawn to e4. Then:
    • You could move your knight to c6
      • ...
    • Or you could move your pawn to e6
      • ...
    • Or ...
  • Or, I could move my pawn to d4. Then:
    • You could take my pawn with your pawn on c5.
      • ...
    • Or you could move your knight to c6.
      • ...
    • Or you could move your pawn to d5.
      • ...
    • Or ...
  • Or, I could ...

That's it. The possible moves at each game state, and the game states they lead to, form a tree. (Strictly it's more like a DAG, since transpositions are possible, but it's often simplified by calling it a tree.) Searching that tree up to a certain depth amounts to thinking forward that many moves in the game.

The way you search the tree is some variation on minimax: you want to choose the best move for yourself now, but at the next level down you pessimistically consider only the best move for your opponent (which is the worst one for you), and so on.

Variations differ in the order in which you visit the nodes of the tree. You could do a straightforward depth-first traversal up to a certain depth, in which case this is traditional minimax search. You can refuse to ever visit some nodes, because you know they can't possibly matter, and that's alpha-beta pruning. You could even visit nodes in a random order, changing the likelihood of visiting each node based on a constantly updated estimate of how likely it is to matter, and that's roughly what happens in Monte Carlo tree search. Either way, you're just traversing that tree in some order.
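To make that concrete, here's a minimal sketch of minimax with alpha-beta pruning over a toy game tree (the tree shape, scores, and names are made up for illustration, not taken from any real engine):

```python
# Minimax with alpha-beta pruning over a toy game tree.
# Leaves are scores from the maximizer's point of view;
# internal nodes are lists of child subtrees.

def alphabeta(node, maximizing, alpha=float("-inf"), beta=float("inf")):
    if isinstance(node, (int, float)):      # leaf: a terminal evaluation
        return node
    if maximizing:
        value = float("-inf")
        for child in node:
            value = max(value, alphabeta(child, False, alpha, beta))
            alpha = max(alpha, value)
            if alpha >= beta:               # opponent would avoid this branch
                break                       # prune the remaining children
        return value
    else:
        value = float("inf")
        for child in node:
            value = min(value, alphabeta(child, True, alpha, beta))
            beta = min(beta, value)
            if alpha >= beta:
                break
        return value

# Each top-level branch is "my move"; the leaf numbers are how good the
# resulting position looks for me after the opponent's reply.
tree = [[3, 5], [2, 9], [0, 1]]
print(alphabeta(tree, True))  # best value achievable against a perfect opponent
```

The pruning step is the whole trick: once a branch is provably worse than one already found, none of its remaining children are ever visited.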

AlphaZero combines this with machine learning by training two models: a policy model that tweaks the traversal order of the tree by identifying moves that seem likely to be good, and a value model that evaluates partially completed games to estimate how good they look for each player. But ultimately, the machine learning models just plug into certain holes in the tree search algorithm.
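The "plug into holes" point can be sketched like this: a depth-limited search where a hand-written, purely illustrative value function stands in for the trained network that a system like AlphaZero would call at the leaves (the toy "game" and all names here are invented for the example):

```python
# Depth-limited search with a pluggable leaf evaluator.

def value_fn(state):
    # In AlphaZero this hole is filled by a trained network;
    # here it's a trivial stand-in (sum of the state's numbers).
    return sum(state)

def search(state, moves_fn, depth, maximizing=True):
    moves = moves_fn(state)
    if depth == 0 or not moves:
        return value_fn(state)          # <- the ML model plugs in here
    results = [search(m, moves_fn, depth - 1, not maximizing) for m in moves]
    return max(results) if maximizing else min(results)

# A toy "game": a state is a tuple of numbers; a move appends +1 or -1.
def moves_fn(state):
    return [state + (1,), state + (-1,)] if len(state) < 4 else []

print(search((0,), moves_fn, depth=2))
```

Swapping `value_fn` for a learned model changes nothing about the search itself, which is the point of the comment above.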

16

dr3aminc0de t1_j46y5o7 wrote

How does this translate to the other Alpha* AIs? I’m thinking of AlphaStar (StarCraft), which has too many move options to model as a tree.

3

Valachio OP t1_j440r6p wrote

That's cool. Apart from integrating GOFAI with ML, what other non-ML techniques are gaining popularity recently?

12

[deleted] t1_j46jso9 wrote

ML is AI, AI is not necessarily ML.

All birds are dinosaurs, not all dinosaurs are birds.

4

etoipi1 t1_j49mq79 wrote

Really? Birds are descendants of dinosaurs?

1

I_will_delete_myself t1_j4557ug wrote

Correct me if I am wrong:

AI: niche part of ML

ML: AI + data science

Edit: An “intelligent” computer uses AI to think like a human and perform tasks on its own. Machine learning is how a computer system develops its intelligence.

https://azure.microsoft.com/en-us/solutions/ai/artificial-intelligence-vs-machine-learning/

−26

VirtualHat t1_j456msu wrote

Definitions shift a bit, and people disagree, but this is what I stick to...

AI: Any system that responds 'intelligently' to its environment. A thermostat is, therefore, AI.

ML: A system that gets better at a task with more data.

Therefore ML is a subset of AI, one specific way of achieving the goal.

16

I_will_delete_myself t1_j46jn9e wrote

Ok thank you. I kind of hate the mob mentality of this site though. It discourages learning and experimenting.

1

[deleted] t1_j45cnht wrote

[deleted]

−1

VirtualHat t1_j45dklv wrote

I think Russell and Norvig is a good place to start if you want to read more. The AI definition is taken from their textbook, which is one of the most cited references I've ever seen. I do agree, however, that the first definition has a problem: namely, what 'intelligently' means.

The second definition is just the textbook definition of ML, taken from Tom Mitchell; hard to argue with that one. Formally: “A computer program is said to learn from experience E with respect to some task T and some performance measure P, if its performance on T, as measured by P, improves with experience E.” (Machine Learning, Tom Mitchell, McGraw Hill, 1997)
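As a toy illustration of Mitchell's (T, P, E) framing (the distribution, numbers, and names below are made up for the example): task T is predicting the next draw from an unknown distribution, performance P is mean squared error, and experience E is the samples seen so far. A learner as simple as the running mean fits the definition, because P improves as E grows:

```python
import random

# Task T: predict a draw from an unknown distribution.
# Performance P: mean squared prediction error on held-out draws.
# Experience E: the number of samples the learner has seen.
random.seed(42)
samples = [random.gauss(5.0, 2.0) for _ in range(1000)]

def mse_after_experience(n):
    # "learn" a predictor (the running mean) from the first n samples,
    # then measure performance P on the remaining held-out samples
    mean = sum(samples[:n]) / n
    held_out = samples[n:]
    return sum((x - mean) ** 2 for x in held_out) / len(held_out)

# More experience E generally yields better (lower) P on task T.
print(mse_after_experience(5), mse_after_experience(500))
```

Nothing about the definition requires neural networks; anything whose measured performance improves with experience qualifies.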

I'd be curious to know your thoughts on a good definition for AI. This is an actively debated topic, and so far no one really has a great definition (that I know of).

7

tell-me-the-truth- t1_j45e4gv wrote

yeah, I can see the point behind the ML definition. I guess I was trying to say you don't always get better with more data: performance might saturate at some point, or the new data you add could be garbage. So I found it a bit odd to tie the definition of ML to the quantity of data. The definition you linked talks about experience; I'm not sure how that's defined.

0

VirtualHat t1_j45em2b wrote

Yes, true! Most models will eventually saturate and perhaps even become worse. I guess it's our job then to just make the algorithms better :). A great example of this is the new Large Language Models (LLMs), which are trained on billions if not trillions of tokens and still keep getting better :)

1

MustachedLobster t1_j45dp6k wrote

A thermostat responds to the environment. It turns on the heating when it gets too cold.

and the ML definition is just repeating the formal definition by Mitchell:

> A computer program is said to learn from experience E with respect to some class of tasks T and performance measure P, if its performance at tasks in T, as measured by P, improves with experience E.

https://towardsdatascience.com/what-is-machine-learning-and-types-of-machine-learning-andrews-machine-learning-part-1-9cd9755bc647#:~:text=Tom%20Mitchell%20provides%20a%20more,simple%20example%20to%20understand%20better%20.

3

tell-me-the-truth- t1_j45eidu wrote

oh, i mixed up thermometer and thermostat.. yeah, then i agree a thermostat can be AI..

1

[deleted] t1_j45dk5m wrote

[deleted]

−13

VirtualHat t1_j45e2rs wrote

Everything is new in its current form :) AI, however, goes back a long way... perhaps Turing would be a reasonable starting point, with him writing "Computing Machinery and Intelligence" back in 1950.

edit: grammar.

8

[deleted] t1_j45ek1n wrote

[deleted]

−9

VirtualHat t1_j45faub wrote

Genetic algorithms are a type of evolutionary algorithm, which are themselves a part of AI. Have a look at the wiki page.

I think I can see your point though. The term AI is used quite differently in research than in the popular meaning. We sometimes joke that the cultural definition of AI is "everything that can't yet be done with a computer" :)

This is a bit of a running joke in the field. Chess was AI until we solved it; then it wasn't. Asking a computer random questions and getting an answer, Star Trek style, was AI until Google; then it was just 'searching the internet'. The list goes on...

9

deustrader t1_j46zi5j wrote

I guess I would be concerned with claiming that evolutionary algorithms are AI, because that’s not how most people understand the current AI. And right now pretty much everything is being advertised as AI for marketing purposes, without being able to distinguish one solution from another. But you’ve made a good point.

1

Tart_Beginning t1_j45p120 wrote

Username checks out hehe. Jk, don’t get why people are downvoting you for being a lil wrong!

1

I_will_delete_myself t1_j46jclx wrote

I was just asking more of a question tbh than pretending to know it. It’s why I said correct me if I am wrong.

IDK, I guess I ran into the Reeeeeditors. Mob mentality is what drives the website and blind dislike. I just don’t worry about it and enjoy my life outside this site.

1

Ucalino t1_j44hgn1 wrote

No. Good old expert systems, which are another kind of AI, are still very common in a lot of industries. While there is very little research on this kind of AI, there is a lot of engineering work applied to it.

36

iidealized t1_j45abq3 wrote

There are still many search-based advances/breakthroughs coming out that use ML but GOFAI as well, e.g. the Cicero AI for Diplomacy.

15

MegavirusOfDoom t1_j45xudz wrote

GOFAI is encompassed within the logic of ML today, so it has actually evolved into NN symbolism, and that's fine with me. ML heavily applies many systems of GOFAI.

Intelligence is a result of learning, so the science of data acquisition is synonymous with AI. The AI is the jug of water when it's filled, the learning is the filling of the jug, perhaps the machine is the jug that can contain networked ideas.

−4

huehue12132 t1_j43ikwm wrote

It doesn't really matter whether or not there are people *working* on other stuff -- the two terms are different by definition.

13

Red-Portal t1_j43wds7 wrote

You'll have a hard time finding non-ML approaches to AI, but there are still plenty of non-AI applications of ML. For example, classical topics like kernel methods, learning theory, and optimization are all ML topics that are not so AI-flavored.

11

sabertoothedhedgehog t1_j44ft8i wrote

Your statement only makes sense if one's mental model is not a Venn diagram of concentric circles: AI > ML > Deep Learning.

17

sabertoothedhedgehog t1_j44kct1 wrote

(Which is OK. Sebastian Raschka, for example, has this view that AI and ML are not concentric circles but merely overlapping. He thinks a cat vs. dog classifier is so narrow, it is not aiming for the larger vision of AI. Fair.)

12

Smallpaul t1_j46hv6v wrote

I think marketers have really influenced the definitions. Calling a linear regression house price estimator "AI" seems like a stretch unless you're trying to get venture capitalists excited. But today, most business people probably will.

Is the Amazon product recommender system "AI"? Is it ML?

5

sabertoothedhedgehog t1_j46i4j6 wrote

I agree.

I still resist though and do not say that "I have developed an AI that does xyz" but say that "I developed a predictive ML model that does xyz".

1

Skirlaxx t1_j45kzue wrote

I think AI is the general term that machine learning falls under. For example, minimax is AI, but it has nothing to do with machine learning.

8

cruddybanana1102 t1_j4asit0 wrote

I mean, Generative Adversarial Networks do engage in minimax optimization, and produce deepfakes. I don't think anybody would agree that GANs have nothing to do with machine learning.

−1

Skirlaxx t1_j4b3du6 wrote

I meant the minimax algorithm. Like the one for tic tac toe.

3

Skirlaxx t1_j4b3jou wrote

That just looks in the game tree and finds the highest score. What does that have to do with machine learning?

1

idontcareaboutthenam t1_j4bwkn3 wrote

It's not machine learning; it's an example of classic AI, one of the first search algorithms for game trees. Even modern chess systems such as Stockfish follow the same idea: they work on advanced versions of the alpha-beta algorithm, which is itself an advanced version of the minimax algorithm.

1

Skirlaxx t1_j4c8dta wrote

I know how it works! That was my whole argument. But thanks though 🤣

1

tamale t1_j45cxqq wrote

There are many, many examples of AI that have nothing to do with machine learning.

In fact, I'd wager that the vast majority of currently running AI code out there is not machine learning code at all, but rather more rudimentary algorithms that could still be classified as a form of AI.

Almost all chess and gaming code, for instance, uses various flavors of AI algorithms, save for the most sophisticated coming from groups like DeepMind.

6

taleofbenji t1_j469lvh wrote

The issue is that you have to call it AI to get any kind of media attention or buzz.

6

Vyper4 t1_j46oe5c wrote

Eh, idk, using the term "machine learning" gets a lot of attention now too, to the point that it's kind of a buzzword even though it is a legitimate field of study. I think it's more that the average non-technical person hears "AI" and "machine learning" and assumes they're the same thing.

1

redditsucks1337 t1_j452lei wrote

I view ML as a subset of AI.

AI has been around since the 50's, it just means appearing intelligent. ML is a subset of AI where algorithm outputs achieve the same initial goal of AI.

5

chief167 t1_j45e8ru wrote

I view ML as a way to create an accurate model, and AI as a way to have a model take decisions in a complex environment without humans in the loop.

There is a large overlap, but you can easily have one without the other.

−6

Ulfgardleo t1_j46aqfw wrote

you get downvoted, but you are right. There is nothing intelligent about an accurate regression model. It is the application of that regression model to a certain task that we anthropomorphize to "intelligence".

1

nmkd t1_j45qfqn wrote

To the general public, absolutely, yes

3

ah-tzib-of-alaska t1_j44k95h wrote

probably, i have definitely been watching people use AI instead of what is definitely AI

2

KingGongzilla t1_j457jo1 wrote

i'm currently taking a Logic/Symbolic AI course. So, yes

2

xepherys t1_j460hdb wrote

All ML is AI. Not all AI is ML.

ML is just the current darling of AI.

It’s like asking "are electric vehicles synonymous with cars?" No. Most discussion about cars today focuses on EVs, but the vast majority are still ICE.

Similarly, most deployed AI solutions today still are not ML. ML is just what people are talking about because it is experiencing significant growth and research.

2

grandzooby t1_j474sza wrote

Naive Bayes classifiers, support vector machines, decision trees, k-means clustering... these are part of machine learning but not AI. They may sometimes be used in AI, but so are processors and memory chips, and those aren't AI either.

"Modern AI" is almost entirely made of neural networks, which is merely one of the topics in machine learning. Older AI was based on things like A*, formal logic systems, fuzzy logic, etc., little of which is "machine learning".

−2

l_dang t1_j4635pc wrote

ermm... you got it in reverse. AI is a sub-field of Machine Learning, which is itself a sub-field of Statistical Learning. For example, linear regression is generally not considered AI, but it's most definitely a cornerstone of ML/SL

−9

sabertoothedhedgehog t1_j4660hr wrote

To me, linear regression is part of Machine Learning and thus part of the broader vision of AI, even though linear regression is an old statistical model that probably existed long before the term ML. The linear regression algorithm learns from data (i.e. it improves the line fit after observing more data; hence, it is ML in my book). It just has a very limited hypothesis space: it will only ever fit a straight line (or a hyperplane, in the general case). It is not a general learner like a deep neural network, which can approximate any function.
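The "improves after observing more data" point can be sketched with a hand-rolled least-squares fit (the synthetic data, line parameters, and function names are all illustrative):

```python
import random

def fit_line(xs, ys):
    # ordinary least squares for y = a*x + b
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    a = num / den
    return a, my - a * mx

# Synthetic data from a known line plus Gaussian noise.
random.seed(0)
true_a, true_b = 2.0, -1.0
xs = [i / 10 for i in range(200)]
ys = [true_a * x + true_b + random.gauss(0, 1) for x in xs]

# The hypothesis space is tiny (a line), but the fit still
# improves as the algorithm observes more data.
for n in (10, 200):
    a, b = fit_line(xs[:n], ys[:n])
    print(n, round(abs(a - true_a), 3), round(abs(b - true_b), 3))
```

The estimated slope and intercept typically land closer to the true values with 200 points than with 10, which is exactly the "learning from experience" sense of the Mitchell definition.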

2

xepherys t1_j463s2u wrote

https://citeseerx.ist.psu.edu/document?repid=rep1&type=pdf&doi=68508ffc9f75462fd31de620d03093b214734011

“Machine learning is one of the most exciting recent technologies in Artificial Intelligence.”

Linear regression is used in ML, but it is neither ML nor AI; LR is a statistical model. That's like saying some equation is used in calculus but is "not math", so calculus isn't a subset of mathematics.

Sorry bud, but you definitely have it reversed.

1

noptamoius t1_j45kz5m wrote

Not sure on what AI is from a technical standpoint, I always thought it was a concept.

1

hoffmanmclaunsky t1_j46ilb6 wrote

Generally speaking, it is more of a concept. "Machine learning" is the more rigorously defined subject: it specifically means iteratively modifying a model using training data, then applying that model to real-world data. AI is a bit more of a nebulous concept, but generally speaking it's just using some search algorithm with heuristics to make the search more "intelligent".

−1

sabertoothedhedgehog t1_j46x3zu wrote

>AI is a bit more of a nebulous concept, but generally speaking it's just using some search algorithm with heuristics to make the search more "intelligent".

No. Just no.

AI is the vision / effort / field of study that deals with replicating (human) intelligence.

2

hoffmanmclaunsky t1_j46zaf3 wrote

Is this something you've studied at a university? I only took a few AI/ML classes at uni so I'm not going to pretend to be an expert.

In any case "field of study that deals with replicating intelligence" isn't exactly a rigorous definition. Really that description speaks to how broad and nebulous it is.

1

sabertoothedhedgehog t1_j4701ft wrote

Yes. My PhD was on applied ML. My current day job is at a center for AI. There are many people dimensions smarter than me -- but AI is all I deal with every day.

The reason for the nebulous concept is that intelligence is hard to define. Thus, in the past it was often defined by relating it back to human intelligence, e.g. "automating tasks that would require human intelligence to solve", and even the Turing Test.
But there are harder definitions of intelligence, such as the one in François Chollet's paper "On the Measure of Intelligence".

It is definitely NOT correct to say that AI "is just using some search algorithm with heuristics to make the search more 'intelligent'".
AI covers way more and goes far beyond search.

2

AImSamy t1_j45nt88 wrote

I know that in France a lot of university labs still have teams working on old techniques (https://en.wikipedia.org/wiki/Artificial_intelligence).

1

WikiSummarizerBot t1_j45nuby wrote

Artificial intelligence

>Artificial intelligence (AI) is intelligence—perceiving, synthesizing, and inferring information—demonstrated by machines, as opposed to intelligence displayed by non-human animals and humans. Example tasks in which this is done include speech recognition, computer vision, translation between (natural) languages, as well as other mappings of inputs. The Oxford English Dictionary of Oxford University Press defines artificial intelligence as: the theory and development of computer systems able to perform tasks that normally require human intelligence, such as visual perception, speech recognition, decision-making, and translation between languages.


1

Qkumbazoo t1_j463qgw wrote

AI is a wider umbrella of methods and applications; not all of it is modeling- or pattern-recognition-based. For example, a very popular form of machine intelligence was fuzzy logic, which is also AI. It has since evolved to using neural networks and unsupervised learning methods to create the membership functions.

1

cuanhunter1308 t1_j469y8k wrote

I somewhat agree with the question title, but there have been quite separate explanations that distinguish ML from AI:

one is about machine perception, the other about more holistic system intelligence.

1

DigThatData t1_j46bnn7 wrote

AI has basically become a buzzword that means "this thing is capable of achieving what it does because it's powered by ML", and in this context especially, ML has become synonymous with deep learning.

1

Mystery-Magic t1_j46cyi4 wrote

This is because making AIs without using ML is much harder and more expensive. And now everyone can build a neural network in less than 50 lines of code, so we tend to ignore AIs that aren't made using ML, because they are heavily overshadowed by the functionality and efficiency of the ML ones.

1

BrightScreenInMyFace t1_j46louf wrote

There are people that use an automata theory approach to natural language processing.

These people’s gripe with ML is that it has a major computational constraint.

1

Relic_Chaser t1_j46pffp wrote

What's the old joke? If you work in R, it's statistical learning. If you work in Python, it's machine learning. And it's AI if you work in PowerPoint.

1

Possibility_Antique t1_j46q8tx wrote

No. I would argue, for instance, that a particle filter is probably considered an AI technique, but it is not an ML technique. Similarly, some optimization algorithms, such as genetic algorithms, kind of fall into that category. I attribute ML more to the whole "using some form of gradient descent to tune parameters that may not have a physical or statistical interpretation" thing that we do with ANNs, because it is really the training process that differentiates it from GOFAI and the others.

1

broadenandbuild t1_j47zitd wrote

I feel like ML are the building blocks for AI

1

Dividingblades t1_j4832fb wrote

I am currently writing my thesis on an AI-related topic, and from what I gathered in my interviews the answer is yes. However, some of my participants also criticised this and said that "AI is not a collection of ML algorithms". I agree! AI is not ML; ML is just one component of it.

1

karriesully t1_j49debi wrote

Depends on who you’re talking to. Most people don’t know the difference. If you’re talking with those that do - you’d better acknowledge the difference or they think you’re dumb.

1

IcySnowy t1_j4filbx wrote

I agree. At first I thought AI consisted of ML and DL; after reading the book Artificial Intelligence by Oxford I realized there is much more to AI out there.

1

KerbalsFTW t1_j4wtp6e wrote

> Is it fair to say that AI and ML are synonymous now in 2023? Or are there people who are still actively working on non-ML techniques for building AI?

AI means "I am a lay person or a media person talking to lay people".

ML means "I know what I'm talking about".

AGI means "I know what I'm talking about but I don't know what it is or how to build it".

The term 'AI' followed a previous hype-then-disappointment curve and got a bad name. Researchers restricted themselves to "things that worked" and called it machine learning, implying that we are teaching models and they are learning (which is obviously true), rather than implying "this thing is intelligent" (which it probably isn't).

Side topic: humans keep moving the bar on what counts as intelligent. It used to be "can play chess", then it was "can play Go and hold a conversation", and then it was "can draw and show creativity". Humans will keep moving the bar on the definition of intelligence for as long as (humanly) possible.

1

Vyper4 t1_j46nhqo wrote

Machine learning is still a subset of AI. That being said, I think it's clear that ML is the most popular subset of AI at the moment, so when the average person hears of AI they likely think of ML, to the point that some people may just use the two terms interchangeably.

0

iLoveDelayPedals t1_j46ebmy wrote

I think the whole concept of AI is bizarre to be honest. What are people? We’re chemical and electrical connections responding in a closed system. Consciousness is just an illusory result of various stimuli coming together.

The only difference between a human brain and a computer algorithm is the complexity/amount of reaction. If an algorithm can learn and respond, what’s the difference?

Humanity’s obsession with ideas like the soul etc. colors the whole conversation around AI way too much.

That is to say I don’t know the answer to OP’s question ☠️

−1

evanthebouncy t1_j45eptc wrote

No.

AI is about problems. ML is a solution to these problems.

−2

JClub t1_j46sm7q wrote

If you see it on slides, it is AI. If you see it in Python, it is ML.

−2

rehrev t1_j45znkx wrote

It's just jargon.

It's not intelligence, and they are not learning.

−6

sabertoothedhedgehog t1_j465kq6 wrote

This is not correct.
These algorithms are definitely learning (i.e. improving performance at a task through experience, i.e. by observing more data).

Intelligence is hard to define. Something like 'efficiency at acquiring skills across a broad range of tasks' would be one definition. We're getting there. This is the weak vs. strong AI hypothesis: can we merely simulate intelligence, or are we creating actual intelligence?

2

[deleted] t1_j472mgw wrote

[removed]

−2

sabertoothedhedgehog t1_j473src wrote

Why are you both condescending and factually wrong?

(a) They ARE learning.
(b) The learning definition I used is the one that defines ML according to Tom Mitchell. You thinking it was some kind of useless definition tells more about you.

1

[deleted] t1_j474m07 wrote

[removed]

−2

[deleted] t1_j4751n4 wrote

[removed]

1

[deleted] t1_j481hcs wrote

[removed]

1

wind_dude t1_j43qjja wrote

I used to be of the mindset that everything called AI is just ML, and we aren't even close to achieving AI. That's still technically true; now AI is AGI, and LLMs are AI. I've given up that fight. But yes, they're basically synonyms in our current lexicon: just call ML "AI" to sound cooler to sci-fi fans and tech journalists.


To me...

AI is an abstract concept that a computer can achieve cognitive abilities, emotion, and problem solving at the same level as a human.

ML is statistical and mathematical models: basically the limit of what can be achieved on logic-based hardware.

−8

Sirisian t1_j44fz5x wrote

> now AI is AGI

It's not. Keep correcting people. AI is task or multi-task specific. It's perfectly fine for someone to say ChatGPT is a dialog AI, for example. It completes tasks (multiple in this case) using an artificial intelligence that happens to be created using machine learning techniques.

What you're describing in your second part is AGI: non-task-specific problem solving at the level of a human. The boundary between advanced AI, especially multi-task learning models, and AGI will get smaller and fuzzier in the coming decades, but there is a difference. There will be very advanced AIs that check a lot of the boxes researchers create. When an AGI is created, there are effectively no boxes left to check: the system can collect data, reason, and create a logical solution for any test well outside its training data.

8

wind_dude t1_j4767yf wrote

>ChatGPT is a dialog AI,

OpenAI doesn't even call ChatGPT AI or a dialog AI.


>Non task-specific problem solving at the level of a human. The boundary between advanced AI, especially multi-task learning models, and AGI will get smaller and fuzzy in the coming decades, but there is a difference.

That is closer to the original, older definition of AI; I don't think I've seen one credible definition that doesn't include problem solving. AGI just started being used because "AI" had been used too much to refer to stuff that isn't AI.

0

chief167 t1_j45eo3z wrote

AI is basically decision making: given information, how does a machine learn from its environment and take decisions without human oversight? How does a machine adapt itself with more experience?

ML is just a way to create models.

For example, the SLAM algorithm is an important algorithm in AI because it allows robots to map their environment. However, it is not ML at all.

Another example of AI is knowledge graphs, like the earliest chess engines. A perfect chess AI can be made without any machine learning at all.

It's important to keep making the distinction.

0

MustachedLobster t1_j46hfdz wrote

SLAM exactly fits the definition of ML.

The more data you give it, the better the map gets, and the better we expect localisation to be.

It has no generalisation at all, but it is learning something very specific about a particular environment.

1

chief167 t1_j46k3aa wrote

SLAM is pure mechatronics, which I don't consider ML.

1

MustachedLobster t1_j47oa1s wrote

It exactly matches Mitchell's definition of ml though.

> A computer program is said to learn from experience E with respect to some class of tasks T and performance measure P, if its performance at tasks in T, as measured by P, improves with experience E.

https://towardsdatascience.com/what-is-machine-learning-and-types-of-machine-learning-andrews-machine-learning-part-1-9cd9755bc647#:~:text=Tom%20Mitchell%20provides%20a%20more,simple%20example%20to%20understand%20better%20.

Localisation error decreases the more data you have.

1

happygilmore001 t1_j43tc51 wrote

AI is powerpoint. ML is python.

when AI.ppt becomes viable in .py, we recategorize it back to ML.

−18

sabertoothedhedgehog t1_j44f44k wrote

This is not true. It sounds like an educated statement - but it isn't true.

15

sabertoothedhedgehog t1_j44ff9b wrote

I should explain to be useful: AI is the vision and the effort to replicate human intelligence. Human intelligence includes learning from data (--> ML), but one could argue there is more to it, e.g. knowledge bases etc. This is not something typical ML algorithms consider (LLMs do it indirectly). Also, our current ML models are still super narrow; the idea of AI is general intelligence.

3

nicholsz t1_j442ph8 wrote

I think of AI as a subset of ML, focused on deep learning theory, architecture, and application.

−18

sabertoothedhedgehog t1_j44g7p9 wrote

There are many valid and different views on AI, ML, Deep Learning.

This is not one of them. It's plain wrong.

5

nicholsz t1_j44gxn1 wrote

I mean it's the current reality.

AI meant something very, very different 30 years ago than it means now. Massive commercial success in a few areas will do that.

−4

suflaj t1_j43gqqp wrote

They're not synonymous. E.g., DL is not considered ML, and of course there is other AI that is not a strict subset of ML, e.g. expert systems.

−19

TeamRocketsSecretary t1_j43moxu wrote

What? DL is very much considered a subset of ML which itself is a subset of AI

18

suflaj t1_j43urpb wrote

Sure, but they are not considered synonymous. When people say ML, they usually mean linear regression, Bayesian optimization, and gradient boosting, not necessarily artificial neural networks with backpropagation and some version of gradient descent.

Expert learning is also a subset of ML, yet they are not considered synonymous.

The same way we say ML is distinct from AI because it implies learning, we hold DL to be distinct from ML because these are not exactly statistical methods and it's mostly alchemy, and we hold expert systems as distinct from ML because it's just a fancy way of saying rule-based AI and it doesn't imply there's any learning involved.

One must realize that mathematical relations do not perfectly map onto human language and communication. Similarly to how a skirt is a part of a dress, yet we consider them different things, subsets of ML are not considered ML itself in language and communication.

−11

sabertoothedhedgehog t1_j44kn6r wrote

Your statement is incorrect. When people (in the field) say ML they mean the whole toolbox of learning algorithms, incl. Deep Learning, trees & forests, kernel methods, etc.

4