
alex_bababu t1_j73ofte wrote

You probably know much more than me. My thought was: for inference you don't need the compute power for backpropagation. The model is fixed, and you can find an efficient way to program an FPGA to run it.

Basically like an ASIC, and also more energy efficient.

You could map the model onto the FPGA in such a way that you would not need to store intermediate results in memory.
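The point about inference not needing backpropagation can be sketched in plain code (a hypothetical toy 2-layer model with made-up weights): inference is just a feed-forward pass, so each layer's output can flow straight into the next stage, the way a pipelined FPGA design would pass values through registers instead of writing activations back to memory for a later backward pass.

```python
# Toy sketch: inference with fixed (pretrained, here made-up) weights.
# No gradients, no saved activations -- intermediates only live long
# enough to feed the next layer, like stages in an FPGA pipeline.

def relu(x):
    return x if x > 0.0 else 0.0

def dot(w, x):
    return sum(wi * xi for wi, xi in zip(w, x))

# Fixed weights: on an FPGA these could be baked into the logic itself,
# much like an ASIC.
W1 = [[0.5, -0.2], [0.1, 0.3]]
W2 = [[0.7, -0.5]]

def infer(x):
    h = [relu(dot(row, x)) for row in W1]  # layer 1 output feeds straight...
    return [dot(row, h) for row in W2]     # ...into layer 2, never stored

print(infer([1.0, 2.0]))
```

Training, by contrast, would have to keep `h` (and every other layer's activations) around to compute gradients, which is exactly the memory traffic inference avoids.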


Open-Dragonfly6825 OP t1_j74ntpw wrote

Hey, maybe it's true that I know my fair share about acceleration devices. But, until you mentioned it, I had actually forgotten about backpropagation, which is fundamental to deep learning. (Or, rather than forgotten, I just hadn't thought about it.)

Now that you mention it, it makes so much sense why FPGAs might be better suited for inference only.
