
brucebay t1_j9g8al3 wrote

Thank you for this. I've never used LoRA except as part of Stable Diffusion training. You linked the MS LoRA lib too. What are the differences between yours and theirs?

3

cccntu OP t1_j9hsouu wrote

Theirs requires you to rewrite the whole model and replace every layer you want to apply LoRA to with its LoRA counterpart, or use monkey-patching. Mine uses PyTorch parametrizations to inject the LoRA logic into existing models. If your model has nn.Linear, you can call add_lora(model) to add LoRA to all the linear layers. And it's not limited to Linear; you can see how I extended it to Embedding and Conv2d in a couple of lines of code. https://github.com/cccntu/minLoRA/blob/main/minlora/model.py

9
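The parametrization approach described above can be sketched roughly as follows. This is a simplified illustration, not the actual minLoRA code: the class and function names (`LoRAParametrization`, `add_lora`) and the init/scaling choices are assumptions for the sketch; see the linked repo for the real implementation.

```python
import torch
import torch.nn as nn
import torch.nn.utils.parametrize as parametrize

class LoRAParametrization(nn.Module):
    """Reparametrizes a weight as W + (B @ A) * scale (low-rank update)."""
    def __init__(self, fan_out, fan_in, rank=4, alpha=1.0):
        super().__init__()
        # A gets a small random init; B starts at zero, so the
        # parametrized weight equals the original weight at first.
        self.lora_A = nn.Parameter(torch.randn(rank, fan_in) * 0.01)
        self.lora_B = nn.Parameter(torch.zeros(fan_out, rank))
        self.scale = alpha / rank

    def forward(self, W):
        # Called whenever module.weight is accessed.
        return W + (self.lora_B @ self.lora_A) * self.scale

def add_lora(model, rank=4):
    # No model rewrite needed: walk the existing modules and attach a
    # parametrization to every nn.Linear weight in place.
    for module in model.modules():
        if isinstance(module, nn.Linear):
            fan_out, fan_in = module.weight.shape
            parametrize.register_parametrization(
                module, "weight",
                LoRAParametrization(fan_out, fan_in, rank=rank),
            )
    return model
```

Because `register_parametrization` intercepts attribute access on `weight`, the original forward pass is untouched, which is what makes this work for any module type (Embedding, Conv2d) once you account for its weight shape.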

brucebay t1_j9i2kd6 wrote

Thank you for this clear explanation.

2