
ainap__ t1_j13ps2q wrote

Cool! Why do you think that for the base FF the memory requirement keeps increasing with the number of layers?

1

galaxy_dweller OP t1_j13sd41 wrote

>Cool! Why do you think that for the base FF the memory requirement keeps increasing with the number of layers?

Hi u/ainap__! The memory usage of the forward-forward algorithm does increase with the number of layers, but far more slowly than that of backpropagation. The reason is that the memory growth of the forward-forward algorithm comes only from the network's parameters: each layer has 2000x2000 weights, which, when trained with the Adam optimizer, occupy approximately 64 MB. The total memory difference between n_layers=2 and n_layers=47 is approximately 2.8 GB, which corresponds to 64 MB * 45 layers.
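
For what it's worth, here's a minimal Python sketch of that arithmetic (the helper name and the 4-copy assumption — float32 storage for the weights, their gradients, and Adam's two moment buffers — are mine, not from the original post):

```python
# Back-of-the-envelope memory estimate for one 2000x2000 dense layer
# trained with Adam: weights + gradients + 2 moment buffers, all float32.

def adam_layer_memory_mb(n_in=2000, n_out=2000,
                         bytes_per_float=4, copies=4):
    """Approximate training-time memory for one dense layer, in MB."""
    n_params = n_in * n_out            # 4,000,000 weights
    return n_params * bytes_per_float * copies / 1e6

per_layer_mb = adam_layer_memory_mb()  # -> 64.0 MB
extra_layers = 47 - 2                  # n_layers=47 vs n_layers=2
print(f"{per_layer_mb:.0f} MB per layer")
print(f"{per_layer_mb * extra_layers / 1e3:.2f} GB total")  # ~2.88 GB
```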

3