You probably know much more than I do.
My thinking was: for inference you don't need the computational power required for backpropagation.
The model is fixed, so you can find an efficient way to program an FPGA to run it.
Basically like an ASIC.
And it's more energy efficient.
You could map the model onto the FPGA in such a way that you would not need to store intermediate results in memory.
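To make the dataflow idea concrete, here is a minimal software analogy (a hypothetical sketch, not FPGA code): a tiny fixed-weight MLP where each sample streams through the layers one at a time, so intermediates exist only briefly between stages, roughly like values held in on-chip registers instead of being written out to memory. The weights and shapes are made up for illustration.

```python
import numpy as np

# Fixed (pretrained) weights -- inference only, no backpropagation needed.
# Hypothetical tiny 2-layer MLP; values chosen purely for illustration.
W1 = np.array([[0.5, -0.2], [0.1, 0.4]])
b1 = np.array([0.0, 0.1])
W2 = np.array([[1.0], [-1.0]])
b2 = np.array([0.2])

def relu(x):
    return np.maximum(x, 0.0)

def stream_inference(samples):
    """Process one sample at a time, like a dataflow pipeline:
    each intermediate is consumed by the next stage immediately
    (analogous to on-chip registers on an FPGA, rather than DRAM)."""
    for x in samples:
        h = relu(x @ W1 + b1)   # intermediate stays local to this step
        y = h @ W2 + b2         # consumed right away by the next stage
        yield y

outputs = list(stream_inference(np.array([[1.0, 2.0], [0.5, -1.0]])))
```

On an actual FPGA the same effect comes from pipelining the layers in hardware, so activations flow directly from one stage to the next each clock cycle.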
alex_bababu t1_j73ofte wrote
Reply to comment by Open-Dragonfly6825 in Why are FPGAs better than GPUs for deep learning? by Open-Dragonfly6825