Indeed, there are many natural ways to interpolate between concepts, and I agree 100% that some are better than others at certain tasks.
Compared to the well-known Img2Img, I understood this as a "generalized" interpolation method: if you take \mu = 1.0, it reduces to plain Img2Img interpolation. You can read the paper to see the effect of \mu on the interpolation; it's quite interesting. Since this is a more general approach, there are more things to tweak and figure out, I guess...?
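For intuition, here is a toy sketch of what a \mu-weighted latent interpolation could look like. The weighting scheme below (scaling the interpolation weight by \mu, so that \mu = 1.0 recovers a plain linear blend) is my own illustration of the idea described above, not the formula from the paper:

```python
import numpy as np

def interpolate_latents(z0, z1, t, mu=1.0):
    """Blend two latents z0, z1 at position t in [0, 1].

    `mu` is a hypothetical knob: at mu = 1.0 this reduces to plain
    linear (Img2Img-style) interpolation, while smaller mu pulls the
    blend back toward z0.  Illustrative sketch only.
    """
    w = mu * t  # effective interpolation weight
    return (1.0 - w) * z0 + w * z1

z0 = np.zeros(4)
z1 = np.ones(4)
print(interpolate_latents(z0, z1, 0.5))          # mu=1.0: exact midpoint
print(interpolate_latents(z0, z1, 0.5, mu=0.5))  # weaker blend, closer to z0
```

With the extra parameter exposed, you get a whole family of interpolation curves instead of the single one Img2Img gives you, which is exactly why there is more to tweak.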

cloneofsimo (OP) t1_izdlve0 wrote, replying to a comment by LetterRip in "[P] Using LoRA to efficiently fine-tune diffusion models. Output model less than 4MB, two times faster to train, with better performance. (Again, with Stable Diffusion)" by cloneofsimo:

Glad it worked for you with such small memory constraints!