Submitted by Nerveregenerator t3_z0msvy in deeplearning
C0demunkee t1_ix85rbq wrote
Reply to comment by Star-Bandit in GPU QUESTION by Nerveregenerator
I did this with an M40 24GB: super cheap, no video out, lots of CUDA cores, and it does all the ML/AI stuff I want it to do.
Star-Bandit t1_ix9toom wrote
Interesting, I'll have to look into the specs of the M40. Have you had any issues with running out of VRAM? All my models seem to gobble it up, though I've done almost no optimization since I've only recently gotten into ML.
C0demunkee t1_ixd9fdq wrote
Yeah, you can easily use it all up through both image resolution and batch size. Some models are also heavy enough that the weights alone don't leave much VRAM for the actual generation.
Try "pruned" models; they are smaller.
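To get a feel for why resolution and batch size eat VRAM so fast, here's a rough back-of-the-envelope helper (my own illustrative sketch, not from any library): it only counts the raw image tensor in fp16 and ignores model weights and intermediate activations, which in practice dominate.

```python
def image_mem_mb(batch, height, width, channels=3, bytes_per_elem=2):
    """Rough fp16 memory (MB) for a batch of image tensors.

    Illustrative only: ignores model weights and activations,
    which are usually the real VRAM hogs.
    """
    return batch * height * width * channels * bytes_per_elem / 2**20

# Memory grows linearly with batch size, quadratically with resolution:
print(image_mem_mb(1, 512, 512))    # single 512x512 image
print(image_mem_mb(4, 1024, 1024))  # 4x batch at 2x resolution -> 16x more
```

The takeaway matches the comment: doubling resolution quadruples the per-image cost, so cranking both resolution and batch size multiplies quickly.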
Since the training sets are all 512x512 images, it makes the most sense to generate at that resolution and then upscale.
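In practice you'd upscale with an ML upscaler (e.g. ESRGAN) or an image library, but the generate-small-then-upscale idea itself is simple. A minimal pure-Python nearest-neighbor sketch, treating an image as a list of pixel rows (hypothetical toy code, not a real upscaler):

```python
def upscale_nearest(img, factor):
    """Nearest-neighbor upscale: repeat each pixel `factor` times
    horizontally and each row `factor` times vertically."""
    out = []
    for row in img:
        wide = [p for p in row for _ in range(factor)]  # stretch the row
        out.extend([wide[:] for _ in range(factor)])    # repeat it vertically
    return out

tiny = [[1, 2],
        [3, 4]]
print(upscale_nearest(tiny, 2))  # 2x2 -> 4x4
```

Generating at 512x512 and upscaling 2x this way touches far less VRAM during the diffusion step than generating at 1024x1024 directly.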