Comments


Dear-Acanthisitta698 t1_itjt7mg wrote

If you save the optimizer state as well, then resumed training will be almost the same (only almost, because random operations such as dropout will draw different values). If you don't save the optimizer, it will restart from the initial learning-rate setting, so the results might differ.
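For reference, a minimal sketch of what saving the optimizer alongside the model looks like in plain PyTorch (this assumes bare PyTorch rather than a higher-level framework; the "checkpoint.pt" path and epoch numbers are placeholders):

```python
import torch
import torch.nn as nn

# Toy model and optimizer, purely for illustration.
model = nn.Linear(10, 1)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# ... after training for some epochs, save model AND optimizer state.
torch.save({
    "epoch": 10,                                # hypothetical epoch count
    "model_state": model.state_dict(),
    "optimizer_state": optimizer.state_dict(),  # Adam moments / lr state live here
}, "checkpoint.pt")                             # hypothetical path

# On resume, restore both so the optimizer picks up where it left off.
# (If you use a learning-rate scheduler, save/load its state_dict() too.)
ckpt = torch.load("checkpoint.pt")
model.load_state_dict(ckpt["model_state"])
optimizer.load_state_dict(ckpt["optimizer_state"])
start_epoch = ckpt["epoch"] + 1  # continue from the next epoch
```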

0

Rediggo t1_itjua9p wrote

It sounds like you're using a somewhat high-level training interface. If that's the case, I can only help with these three points:

1. When asking for help, try to provide some details about the specific implementation (for example: are you using Hugging Face? PyTorch Lightning? Something else? Did you check that the usual suspects aren't causing trouble?)

2. It's important to read the documentation for the tool you're using. Are you sure training didn't stop because the loss wasn't improving and that's the tool's default behavior (e.g. early stopping)?

3. Stack Overflow usually has most of the basic questions answered. If your question was already asked by someone six years ago and has no replies, it's probably just a mistake you can solve by reading the docs a bit (but that last point is just my experience).

Good luck with your project :)

1

vedrano- t1_itjzfy3 wrote

Yes, it should be okay.

Moreover, some APIs (Keras, for example) have a parameter in the fit() function (initial_epoch) that takes the epoch from which to resume training, so subsequent models can be saved under the right name.
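For illustration, a minimal sketch of that Keras parameter; the toy model, data, and epoch numbers here are just placeholders:

```python
import numpy as np
from tensorflow import keras

# Toy model and data, purely for illustration.
model = keras.Sequential([keras.Input(shape=(4,)), keras.layers.Dense(1)])
model.compile(optimizer="adam", loss="mse")
x, y = np.random.rand(32, 4), np.random.rand(32, 1)

model.fit(x, y, epochs=10)  # first run: trains epochs 0-9

# Resume later: `epochs` is the index of the final epoch, so this call
# trains epochs 10-19, and checkpoint callbacks see the right epoch numbers.
model.fit(x, y, epochs=20, initial_epoch=10)
```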

1