Submitted by Nerveregenerator t3_z0msvy in deeplearning
Star-Bandit t1_ix9toom wrote
Reply to comment by C0demunkee in GPU QUESTION by Nerveregenerator
Interesting, so I'll have to look into the specs of the M40. Have you had any issues with running out of VRAM? All my models seem to gobble it up, though I've done almost no optimization since I've only recently gotten into ML.
C0demunkee t1_ixd9fdq wrote
Yeah, you can easily use it all up through both image scale and batch size. Some models are also heavy enough that they don't leave any VRAM for the actual generation.
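A rough back-of-the-envelope sketch of why both knobs matter: activation memory grows linearly with batch size and with pixel count, so doubling the image side length quadruples usage at the same batch size. The channel count and bytes-per-element below are hypothetical placeholders, not numbers from any specific model.

```python
def estimate_activation_mb(batch_size: int, height: int, width: int,
                           channels: int = 320, bytes_per_elem: int = 2) -> float:
    """Very rough activation-memory estimate for one full-resolution feature map.

    Memory scales linearly with batch size and with the number of pixels;
    channels=320 and fp16 storage are illustrative assumptions only.
    """
    elems = batch_size * channels * height * width
    return elems * bytes_per_elem / 1024**2

# Same batch, double the side length -> 4x the memory.
base = estimate_activation_mb(1, 512, 512)
big = estimate_activation_mb(1, 1024, 1024)
print(base, big)
```

This ignores weights, optimizer state, and intermediate buffers, so real usage is higher, but the scaling behavior is the point.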
Try "pruned" models; they're smaller.
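One reason pruned checkpoints are smaller is that they typically store weights in half precision (and drop extras like EMA copies). A minimal sketch of the fp32-vs-fp16 size difference, using a dummy weight array rather than a real checkpoint:

```python
import numpy as np

# Dummy "weights": 1M parameters in fp32 vs fp16.
weights_fp32 = np.zeros(1_000_000, dtype=np.float32)
weights_fp16 = weights_fp32.astype(np.float16)

# Half precision halves the bytes needed to store (and load) the model.
print(weights_fp32.nbytes)  # 4,000,000 bytes
print(weights_fp16.nbytes)  # 2,000,000 bytes
```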
Since the training sets are all 512x512 images, it makes the most sense to generate at that resolution and then upscale.
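The generate-then-upscale idea can be sketched as below. A trivial nearest-neighbour 2x upscale stands in for a real upscaler (e.g. ESRGAN); the 512x512 array is a placeholder for a generated image, not actual model output.

```python
import numpy as np

def upscale_nearest(img: np.ndarray, factor: int = 2) -> np.ndarray:
    """Nearest-neighbour upscale: repeat each pixel `factor` times per axis."""
    return img.repeat(factor, axis=0).repeat(factor, axis=1)

# Placeholder for an image generated at the model's native 512x512.
img = np.zeros((512, 512, 3), dtype=np.uint8)

out = upscale_nearest(img, 2)
print(out.shape)  # (1024, 1024, 3)
```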