Submitted by mrx-ai t3_zjud5l in MachineLearning
modeless t1_izzpcbe wrote
Reply to comment by IshKebab in [D] G. Hinton proposes FF – an alternative to Backprop by mrx-ai
He calls it "mortal computation". Like instead of loading identical pretrained weights into every robot brain, you actually train each brain individually, and then when they die their experience is lost. Just like humans! (Except you can probably train them in simulation, "The Matrix"-style.) But the advantage is that by relaxing the repeatability requirement you get hardware that is orders of magnitude cheaper and more efficient, so for any given budget it is much, much more capable. Maybe. I tend to think that won't be the case, but who knows.
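For readers who haven't seen the paper: the mortal-computation idea rides on FF training each layer locally with two forward passes instead of backprop. Below is a rough sketch of one FF layer, assuming PyTorch and the paper's sum-of-squared-activations "goodness"; the layer sizes, threshold, and learning rate are illustrative, not from the paper.

```python
# Minimal sketch of a single Forward-Forward layer (illustrative values).
# Each layer is trained locally: push "goodness" (sum of squared activations)
# above a threshold for positive (real) data and below it for negative data.
import torch
import torch.nn as nn

class FFLayer(nn.Module):
    def __init__(self, d_in, d_out, threshold=2.0, lr=0.03):
        super().__init__()
        self.linear = nn.Linear(d_in, d_out)
        self.threshold = threshold
        self.opt = torch.optim.Adam(self.linear.parameters(), lr=lr)

    def forward(self, x):
        # Normalize the input so only its direction (not the previous
        # layer's goodness) is passed on, as described in the FF paper.
        x = x / (x.norm(dim=1, keepdim=True) + 1e-4)
        return torch.relu(self.linear(x))

    def train_step(self, x_pos, x_neg):
        g_pos = self.forward(x_pos).pow(2).sum(dim=1)  # goodness on positive data
        g_neg = self.forward(x_neg).pow(2).sum(dim=1)  # goodness on negative data
        # Softplus loss: drive positive goodness above the threshold,
        # negative goodness below it.
        loss = torch.log1p(torch.exp(torch.cat([
            self.threshold - g_pos,
            g_neg - self.threshold,
        ]))).mean()
        self.opt.zero_grad()
        loss.backward()  # gradient stays local to this layer
        self.opt.step()
        # Detach before handing activations to the next layer,
        # so no error signal has to travel backwards through the stack.
        return self.forward(x_pos).detach(), self.forward(x_neg).detach()
```

Because nothing has to flow backwards, the learning rule doesn't need a precise digital model of the forward pass, which is what makes the analog-hardware argument below possible at all.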
ChuckSeven t1_j016rtg wrote
Why exactly would the hardware be cheaper and more efficient?
modeless t1_j02fiss wrote
Without the requirement for exact repeatability, you can use analog circuits instead of digital, and your manufacturing tolerances are greatly relaxed. You can use error-prone methods like self-assembly instead of EUV photolithography in ten-billion-dollar cleanrooms.
Again, I don't really buy it but there's an argument to be made.
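To make the repeatability point concrete, here's a toy illustration (my own, not from the paper): treat per-device analog variation as fixed multiplicative noise on the weights and watch how much the output drifts when the same "ideal" weights are copied onto devices with different tolerances.

```python
# Hypothetical toy: per-device analog variation modeled as fixed
# multiplicative noise on copied weights.
import torch

torch.manual_seed(0)
w = torch.randn(256, 256) / 16   # "ideal" trained weights
x = torch.randn(32, 256)         # a batch of inputs

def device_output(w, x, tolerance):
    # Each physical device realizes the weights with its own fixed error.
    w_device = w * (1 + tolerance * torch.randn_like(w))
    return torch.relu(x @ w_device.t())

y_exact = torch.relu(x @ w.t())
for tol in (0.01, 0.05, 0.20):
    y = device_output(w, x, tol)
    drift = (y - y_exact).norm() / y_exact.norm()
    print(f"tolerance {tol:.0%}: relative output drift {drift:.2f}")
```

At loose tolerances the copied weights no longer compute the same function, which is why the weights would have to be learned (or at least fine-tuned) on the device they'll actually run on, i.e. the "mortal" part.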