Submitted by Open-Dragonfly6825 t3_10s3u1s in deeplearning
alex_bababu t1_j73ofte wrote
Reply to comment by Open-Dragonfly6825 in Why are FPGAs better than GPUs for deep learning? by Open-Dragonfly6825
You probably know much more than I do. My thinking was: for inference you don't need the compute power required for backpropagation. The model is fixed, so you can find an efficient way to program an FPGA to run it.
Basically like an ASIC. And it's more energy efficient.
You could map the model onto the FPGA in such a way that you wouldn't need to store intermediate results in external memory.
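To make the inference-vs-training distinction concrete, here's a minimal PyTorch sketch (the framework and toy model are just illustrative assumptions, nothing from the thread): under `torch.inference_mode()` no autograd graph is built and no activations are retained for a backward pass, which is exactly the bookkeeping a fixed-function FPGA design would also drop.

```python
import torch
import torch.nn as nn

# A stand-in "fixed" model; any trained network would do here.
model = nn.Sequential(
    nn.Linear(128, 256),
    nn.ReLU(),
    nn.Linear(256, 10),
)
model.eval()  # inference behavior for layers like dropout/batch-norm

x = torch.randn(1, 128)  # dummy input

# inference_mode() skips autograd bookkeeping entirely: no computation
# graph is built and no intermediate activations are kept around for a
# backward pass, since the weights will never be updated.
with torch.inference_mode():
    y = model(x)

print(y.shape)  # torch.Size([1, 10])
```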
Open-Dragonfly6825 OP t1_j74ntpw wrote
Hey, maybe it's true that I know a fair amount about acceleration devices. But until you mentioned it, I had actually forgotten about backpropagation, which is fundamental to deep learning. (Or, rather than forgotten, I just hadn't thought about it.)
Now that you mention it, it makes so much sense why FPGAs might be better suited, but only for inference.