LetterRip t1_izm8rkq wrote

It did work; now I can no longer launch LoRA training even at 768 or 512 resolution (CUDA VRAM exceeded), only at 256. No idea what changed.

1

JanssonsFrestelse t1_j0l89ve wrote

Same here with 8 GB VRAM, although it looks like I can't use mixed_precision=fp16 with my RTX 2070, so that might be why.

1
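
For context, in `accelerate`-based LoRA training scripts the mixed-precision setting is typically passed to the launcher. A minimal sketch, assuming a diffusers-style training script (the script name here is a placeholder, not from the thread above):

```shell
# Hypothetical launch command for an accelerate-based LoRA training script.
# --mixed_precision fp16 stores activations in half precision, roughly
# halving activation memory; if fp16 fails on a given card, "no" falls
# back to full fp32 at the cost of higher VRAM use.
accelerate launch --mixed_precision fp16 train_lora.py --resolution 512
```

If fp16 errors out, rerunning with `--mixed_precision no` (or lowering `--resolution`) is a common way to confirm whether precision or resolution is the source of the out-of-memory failure.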