
neilk t1_j6lcux6 wrote

Think of it this way.

The CPU is like a chef in a restaurant. An order comes in for steak, potatoes, and salad, and the chef gets to work. The steak goes into a pan, where it has to be watched carefully and flipped at the right time. The potatoes have to be partially boiled in a pot, then finished in the same pan as the steak. Every step depends on the one before it.

Meanwhile, the CPU delegates the salad to the GPU. The GPU is a guy who operates an entire table full of salad-chopping machines. He can only do one thing: chop vegetables. But he can stuff carrots, lettuce, cucumbers, and everything else into all the machines at once, press the button, and watch them spit out perfect results far faster than a chef ever could.

Back to the programming world.

The CPU excels at processing the main logic of a computer program. The result of one computation usually feeds into the next one, so the work is largely sequential and the CPU can only do a few things at once.
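To make that concrete, here's a minimal sketch (my own example, not from the comment above) of a serially dependent loop. Each iteration needs the previous iteration's result, so there's no way to split it across thousands of workers:

```cuda
#include <cstdio>

int main() {
    // A running balance with compound interest: each year's result
    // depends on the previous year's, so the loop is inherently
    // sequential. This is exactly the kind of work a CPU handles well.
    double balance = 100.0;
    for (int year = 0; year < 10; ++year) {
        balance *= 1.05;  // this step needs the last step's output
    }
    printf("balance after 10 years: %.2f\n", balance);
    return 0;
}
```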

The GPU excels at taking in a ridiculous amount of data and doing the same processing on ALL of it at the same time. It is particularly good at the kind of math that arranges thousands of little triangles in just the right way to look like a 3D object.
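For contrast, here's a minimal CUDA sketch of the data-parallel side (illustrative only; the `scale` kernel and the launch sizes are made up for the example). Every thread applies the same operation to its own element, like all the salad machines chopping at once:

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Every thread runs the same operation on its own array element.
// No element depends on any other, so thousands can run at once.
__global__ void scale(float *data, float factor, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= factor;
}

int main() {
    const int n = 1 << 20;  // about a million elements
    float *data;
    cudaMallocManaged(&data, n * sizeof(float));  // visible to CPU and GPU
    for (int i = 0; i < n; ++i) data[i] = 1.0f;

    // Launch enough 256-thread blocks to cover all n elements.
    scale<<<(n + 255) / 256, 256>>>(data, 2.0f, n);
    cudaDeviceSynchronize();  // wait for the GPU to finish

    printf("data[0] = %.1f\n", data[0]);  // prints 2.0
    cudaFree(data);
    return 0;
}
```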


Gigantic_Idiot t1_j6ldrul wrote

Another analogy I've seen: MythBusters explained the difference by making a painting with paintball guns. A CPU is like shooting the same gun 1000 times in a row, but a GPU is like shooting 1000 guns all at once.


HappyGick t1_j6oszfu wrote

Oh my god, that's actually a genius analogy. It's as close to real life as you can get without going into the details.
