neilk t1_j6lf4hb wrote

The CPU says to the GPU: "here are the 30,000 triangles that represent this thing, and here is the angle I'd like to view it from. Please do the complex mathematical transformations that a) rotate all 30,000 triangles in space and b) project all 30,000 triangles from 3D space onto 2D triangles on the screen."

There's also work to figure out which parts of the model are hidden from view, plus reflections, textures, shadows, etc., but you get the idea.
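If you want to see roughly what steps (a) and (b) look like in code, here's a minimal sketch in Python/NumPy. The vertex data, viewing angle, and focal length are made-up placeholders, and a real GPU does this with dedicated parallel hardware rather than a NumPy call:

```python
import numpy as np

# Pretend model: 30,000 triangles, 3 vertices each, as one big (x, y, z) array.
vertices = np.random.rand(30_000 * 3, 3)

# a) Rotate every vertex around the Y axis by the viewing angle.
angle = np.radians(30)
rotation_y = np.array([
    [ np.cos(angle), 0, np.sin(angle)],
    [ 0,             1, 0            ],
    [-np.sin(angle), 0, np.cos(angle)],
])
rotated = vertices @ rotation_y.T

# b) Project from 3D to 2D with a simple perspective divide:
#    the farther away a point is, the closer to the screen center it lands.
focal_length = 1.0
depth = rotated[:, 2] + 5.0          # push the model out in front of the "camera"
screen_xy = focal_length * rotated[:, :2] / depth[:, None]
```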

24

neilk t1_j6lcux6 wrote

Think of it this way.

The CPU is like a chef in a restaurant. It sees an order coming in for a steak and potatoes and salad. It gets to work cooking those things. It starts the steak in a pan. It has to watch the steak carefully, and flip it at the right time. The potatoes have to be partially boiled in a pot, then finished in the same pan as the steak.

Meanwhile, the CPU delegates the salad to the GPU. The GPU is a guy who operates an entire table full of salad chopping machines. He can only do one thing: chop vegetables. But he can stuff carrots, lettuce, cucumbers, and everything else into all the machines at once, press the button, and watch them spit out perfect results, far faster than a chef could.

Back to the programming world.

The CPU excels at processing the main logic of a computer program. The result of one computation is often needed for the next step, so it can only do so many things at once.

The GPU excels at getting a ridiculous amount of data and then doing the same processing on ALL of it at the same time. It is particularly good at the kind of math that arranges thousands of little triangles in just the right way to look like a 3D object.
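A toy way to see that difference in code, using Python/NumPy as a stand-in (the numbers and operations here are made up, not real GPU code):

```python
import numpy as np

data = np.random.rand(1_000_000)

# CPU pattern: each step depends on the previous result, so the work is inherently serial.
running_total = 0.0
for value in data:
    running_total = running_total * 0.99 + value

# GPU pattern: the same operation applied to every element independently,
# so all million of them can be processed at the same time.
scaled = data * 2.0 + 1.0
```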

28