
Open-Dragonfly6825 OP t1_j72om7m wrote

Maybe I missed it, but the posts I read don't specify that. Some scientific works claim that FPGAs are better than GPUs for both training and inference.

Why would you say they are better only for inference? Wouldn't a GPU be faster at inference too? Or is it that inference doesn't require high speeds, and FPGAs are preferred for their energy efficiency?

1

alex_bababu t1_j73ofte wrote

You probably know much more than me. My thought was: for inference you don't need the computing power for backpropagation. The model is fixed, and you can find an efficient way to program an FPGA to run it.

Basically like an ASIC. And also more energy efficient.
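Something like this toy NumPy sketch shows the gap (the layer sizes and squared-error loss are made up for illustration): inference is just a forward pass through fixed weights, while a training step also has to keep intermediate activations alive for backpropagation.

```python
import numpy as np

# Hypothetical 2-layer MLP; the weights are fixed at inference time.
rng = np.random.default_rng(0)
W1 = rng.standard_normal((784, 256)) * 0.01
W2 = rng.standard_normal((256, 10)) * 0.01

def infer(x):
    # Forward pass only: nothing needs to be kept except the
    # current layer's output.
    h = np.maximum(x @ W1, 0.0)      # ReLU
    return h @ W2

def train_step(x, y, lr=1e-3):
    global W1, W2
    # Training must hold on to intermediate activations (z1, h)
    # so backpropagation can reuse them.
    z1 = x @ W1
    h = np.maximum(z1, 0.0)
    logits = h @ W2
    grad_logits = logits - y         # gradient of 0.5 * ||logits - y||^2
    grad_W2 = h.T @ grad_logits
    grad_h = grad_logits @ W2.T
    grad_z1 = grad_h * (z1 > 0.0)    # ReLU derivative needs the stored z1
    grad_W1 = x.T @ grad_z1
    W1 -= lr * grad_W1
    W2 -= lr * grad_W2

x = rng.standard_normal((32, 784))   # toy batch
y = rng.standard_normal((32, 10))
print(infer(x).shape)                # (32, 10): forward pass only
train_step(x, y)                     # forward + backward + weight update
```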

You could map the model onto the FPGA in such a way that you would not need to store intermediate results in memory.
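As a rough analogy (toy Python generators; real FPGA dataflow designs, e.g. HLS streaming kernels, work at a much finer grain), each stage can consume values as they arrive and push results straight to the next stage, so no full intermediate feature map ever goes out to external memory:

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.standard_normal((784, 256)) * 0.01
W2 = rng.standard_normal((256, 10)) * 0.01

def relu_layer(stream, W):
    # Consume one sample at a time and forward the result immediately,
    # like a streaming pipeline stage on an FPGA.
    for x in stream:
        yield np.maximum(x @ W, 0.0)

def linear_layer(stream, W):
    for x in stream:
        yield x @ W

inputs = (rng.standard_normal(784) for _ in range(4))   # toy input stream
pipeline = linear_layer(relu_layer(inputs, W1), W2)

for logits in pipeline:
    print(logits.argmax())   # only final results leave the pipeline
```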

2

Open-Dragonfly6825 OP t1_j74ntpw wrote

Hey, maybe it's true that I know a fair amount about acceleration devices. But until you mentioned it, I had actually forgotten about backpropagation, which is fundamental to deep learning. (Or rather than forgotten, I just hadn't thought about it.)

Now that you mention it, it makes a lot of sense why FPGAs might be better suited for inference in particular.

1