youregonnalovemynuts t1_j4pmvzs wrote

Hinton's original paper discusses the computational advantages this algorithm can provide; see page 14 of the PDF: https://www.cs.toronto.edu/~hinton/FFA13.pdf . There isn't hardware readily available today that exploits those advantages, though, so they'll stay theoretical for now. If the algorithm shows enough promise of converging anywhere close to backprop, we'll see attempts at that kind of hardware, probably starting with FPGAs and custom circuits.
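
For anyone who hasn't read the paper: the computational appeal is that each layer trains on a purely local objective, so no backward pass has to travel through the whole network. Here's a rough sketch of what one layer-local update could look like. The goodness measure (sum of squared activations), the per-layer threshold, and the positive/negative passes come from the paper, but the exact loss form, normalization, layer sizes, and learning rate below are my own illustrative choices, not Hinton's code.

```python
# Sketch of a Forward-Forward-style layer with a local update rule.
# Hyperparameters here are placeholders, not the paper's settings.
import torch
import torch.nn as nn
import torch.nn.functional as F

class FFLayer(nn.Module):
    def __init__(self, d_in, d_out, threshold=2.0, lr=0.03):
        super().__init__()
        self.linear = nn.Linear(d_in, d_out)
        self.threshold = threshold
        # Each layer owns its optimizer: updates never leave the layer.
        self.opt = torch.optim.Adam(self.parameters(), lr=lr)

    def forward(self, x):
        # Length-normalize the input so only its direction carries
        # information forward (as described in the paper).
        x = x / (x.norm(dim=1, keepdim=True) + 1e-8)
        return torch.relu(self.linear(x))

    def train_step(self, x_pos, x_neg):
        # Goodness = sum of squared activations per example.
        g_pos = self.forward(x_pos).pow(2).sum(dim=1)
        g_neg = self.forward(x_neg).pow(2).sum(dim=1)
        # Push goodness above the threshold for positive data and
        # below it for negative data (softplus/logistic-style loss).
        loss = F.softplus(torch.cat([
            self.threshold - g_pos,
            g_neg - self.threshold,
        ])).mean()
        self.opt.zero_grad()
        loss.backward()   # gradients stay inside this layer
        self.opt.step()
        # Detach before handing activations to the next layer, so no
        # gradient ever flows between layers.
        return self.forward(x_pos).detach(), self.forward(x_neg).detach()

# Illustrative usage: two layers trained greedily with random
# placeholder data, no backward pass between them.
layer1, layer2 = FFLayer(784, 500), FFLayer(500, 500)
x_pos = torch.rand(64, 784)
x_neg = torch.rand(64, 784)
h_pos, h_neg = layer1.train_step(x_pos, x_neg)
layer2.train_step(h_pos, h_neg)
```

Since every layer only needs its own forward pass and a local comparison against a threshold, you can see why people speculate this would map onto analog or low-precision hardware better than backprop does.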

What everyone should be asking is: why MNIST? It's a toy dataset at this point, and it should be trivial for someone like Hinton to scale these experiments to something closer to reality. MNIST is the mouse experiment of machine learning: maybe an early necessity, but it says very little about actual viability.
