
Teotz t1_izjzdve wrote


LetterRip t1_izksf4k wrote

It is working, but I need to use prior preservation loss; otherwise the concept bleeds into all of the words in the phrase. So I'm generating photos for the preservation loss now.
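For context, prior preservation (from the DreamBooth method) adds a second loss term computed on class images generated by the original model, so the class words keep their original meaning instead of absorbing the new concept. A minimal sketch of how the combined loss is formed, using a plain-Python MSE stand-in for the real diffusion noise-prediction loss (`prior_loss_weight` and the helper names here are illustrative, not the trainer's actual API):

```python
def mse(pred, target):
    """Mean squared error over two equal-length lists of floats."""
    return sum((p - t) ** 2 for p, t in zip(pred, target)) / len(pred)

def dreambooth_loss(instance_pred, instance_target,
                    class_pred, class_target, prior_loss_weight=1.0):
    """Instance loss fits the new concept; the prior term, computed on
    images the *original* model generated for the bare class prompt,
    penalizes drift so the concept doesn't bleed into the class words."""
    instance_loss = mse(instance_pred, instance_target)
    prior_loss = mse(class_pred, class_target)
    return instance_loss + prior_loss_weight * prior_loss

# Toy numbers: perfect fit on the instance, slight drift on the class prior.
loss = dreambooth_loss([1.0, 2.0], [1.0, 2.0], [0.5, 0.5], [0.0, 0.0])
print(loss)  # 0.25 (all of it from the prior-preservation term)
```

The generated class photos only need to be produced once; they are reused every epoch as the targets for the prior term.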


LetterRip t1_izm8rkq wrote

It did work, but now I can no longer launch LoRA training even at 768 or 512 resolution (CUDA VRAM exceeded), only at 256. No idea what changed.
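Worth noting why resolution hits VRAM so hard: activation memory scales roughly with the number of pixels, i.e. quadratically in the training resolution, so 768 needs on the order of 9x the activation memory of 256. A back-of-the-envelope sketch (the quadratic scaling is the only claim here; absolute usage depends on model, batch size, and optimizer state):

```python
def relative_activation_memory(resolution, base_resolution=256):
    """Activation memory grows ~quadratically with image resolution,
    since each feature map holds (resolution/stride)**2 positions."""
    return (resolution / base_resolution) ** 2

for res in (256, 512, 768):
    print(res, relative_activation_memory(res))  # 1.0, 4.0, 9.0
```

So a run that just barely fits at 256 can be far over budget at 512, and anything leaving extra allocations on the GPU (a cached pipeline, a leftover process) can push the higher resolutions from "fits" to "CUDA VRAM exceeded".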


JanssonsFrestelse t1_j0l89ve wrote

Same here with 8GB VRAM, although it looks like I can't use mixed_precision=fp16 with my RTX 2070, so that might be why.
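For what it's worth, the RTX 2070 (Turing) does have fp16-capable tensor cores, so fp16 failures there are usually about numeric range rather than hardware: float16 overflows to inf above 65504 and underflows small gradients to zero, which is why mixed-precision trainers rely on loss scaling. A quick demonstration with NumPy's half-precision type (assumes NumPy is installed):

```python
import numpy as np

# float16's largest finite value is 65504; larger values become inf,
# and very small values round to 0.0 -- both break naive fp16 training.
assert np.finfo(np.float16).max == 65504.0

big = np.float16(70000.0)   # overflows: stored as inf
small = np.float16(1e-8)    # underflows: rounds to 0.0
print(np.isinf(big), small == 0.0)  # True True
```

Loss scaling multiplies the loss (and therefore the gradients) up into float16's representable range before the backward pass, then divides back out before the optimizer step.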
