Submitted by Moose_a_Lini t3_yjwvav in MachineLearning
The specific application is orbital cameras: networks can be trained on Earth and then sent to orbital FPGAs for use in image-recognition systems. Both the Earth-based training system and the orbital FPGA have plenty of computational power, so there is no real need for reduction there, but transmission bandwidth is extremely limited.
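Not from the OP, but for concreteness: one standard bandwidth-reduction technique besides pruning is post-training weight quantization — storing each parameter as an int8 plus a shared scale factor instead of a float32. A minimal NumPy sketch (the array shape and names are illustrative, not from any real network):

```python
import numpy as np

# Stand-in for a trained layer's weights (illustrative shape).
rng = np.random.default_rng(0)
weights = rng.normal(0.0, 0.05, size=(256, 256)).astype(np.float32)

def quantize_int8(w):
    """Symmetric per-tensor quantization: float32 -> int8 plus one float scale."""
    scale = np.abs(w).max() / 127.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover an approximation of the original weights on the receiver."""
    return q.astype(np.float32) * scale

q, scale = quantize_int8(weights)

print(weights.nbytes)  # 262144 bytes uplinked as float32
print(q.nbytes)        # 65536 bytes as int8 (4x smaller, plus one scale)
print(np.abs(dequantize(q, scale) - weights).max())  # small reconstruction error
```

Unlike pruning, this shrinks only the transmitted payload; the on-orbit side dequantizes (or runs int8 inference directly, which FPGAs handle well) and compute cost is essentially unchanged.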
For context, I'm trying to find a PhD topic. I have a strong background in FPGAs, space-borne imaging systems, and comms, but I'm a machine learning noob (currently furiously trying to get my head around it).
I may not be using the right terminology, but my searches haven't turned up anything. (It may also be that, for some information-theoretic reason, pruning is the optimal solution to this problem.)
Any suggestions of papers, pointers in a direction, or other related tidbits would be highly appreciated.
bernhard-lehner t1_iuqc992 wrote
It would help if you explained what exactly you want to transmit: the model, results, gradients, ...? Btw, how would pruning not reduce the computational demand?