suflaj t1_j71c8j2 wrote
Generally, no. It would be better to define all the classes you will eventually need up front, and then use masks to control which classes are active at any given moment. The thing you are suggesting, even when done correctly, would not let the model learn the relationships between the different classes.
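A minimal sketch of what that masking could look like, assuming a PyTorch classifier whose head is already sized for the full set of classes (the class count and which classes are "active" are placeholders):

```python
import torch

# Hypothetical setup: a head already sized for every class you will
# eventually need (here 10), even if only some are in use right now.
num_total_classes = 10
logits = torch.randn(4, num_total_classes)          # batch of 4 examples

# Mask marking which classes are currently active (e.g. the first 6).
active = torch.zeros(num_total_classes, dtype=torch.bool)
active[:6] = True

# Setting inactive logits to -inf drops them from the softmax, so both the
# training loss and the predictions only ever "see" the active classes.
masked_logits = logits.masked_fill(~active, float("-inf"))
probs = torch.softmax(masked_logits, dim=-1)
preds = probs.argmax(dim=-1)
```

When a new class comes online you just flip its entry in the mask; the architecture never changes.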
With neural network surgery, it's trivial to downscale a classifier head (drop classes), but fairly hard to upscale it (add classes).
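To illustrate why, here is a rough sketch on a plain linear head (the feature size and class counts are made up): dropping classes is just slicing the existing weights, while adding classes leaves the new rows randomly initialized and forces retraining.

```python
import torch
import torch.nn as nn

# Hypothetical 512-dim features with a 10-class head.
old_head = nn.Linear(512, 10)

# Downscaling is trivial: keep only the rows for the classes you retain.
keep = [0, 1, 2, 3, 4]
small_head = nn.Linear(512, len(keep))
with torch.no_grad():
    small_head.weight.copy_(old_head.weight[keep])
    small_head.bias.copy_(old_head.bias[keep])

# Upscaling is harder: the extra rows have no trained weights, so the new
# classes start from random init and the head needs retraining anyway.
big_head = nn.Linear(512, 12)
with torch.no_grad():
    big_head.weight[:10].copy_(old_head.weight)
    big_head.bias[:10].copy_(old_head.bias)
```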
One thing you could test, for example, is clustering your images using features from a vanilla pretrained ResNet. Then, once you need to add new classes, you can look at which images from the existing classes are the most similar to the new ones, and you can maybe get away with finetuning only on that subset instead of the whole dataset.
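A sketch of that idea, assuming a recent torchvision and cosine similarity over pooled ResNet-50 features (the image lists and the 0.7 threshold are placeholders):

```python
import torch
import torchvision.models as models
from torchvision.models import ResNet50_Weights

# Vanilla pretrained ResNet-50 as a feature extractor: drop the
# classification head, keep the pooled 2048-dim features.
weights = ResNet50_Weights.DEFAULT
backbone = models.resnet50(weights=weights)
backbone.fc = torch.nn.Identity()
backbone.eval()
preprocess = weights.transforms()

@torch.no_grad()
def embed(images):
    # images: list of PIL images -> (N, 2048) L2-normalised features
    batch = torch.stack([preprocess(img) for img in images])
    return torch.nn.functional.normalize(backbone(batch), dim=-1)

# old_images / new_images are hypothetical lists of PIL images.
# old_feats = embed(old_images)
# new_feats = embed(new_images)
#
# For each old image, take its highest cosine similarity to any new-class
# image, and finetune only on old images above the (tunable) threshold.
# sims = (old_feats @ new_feats.T).max(dim=1).values
# threshold = 0.7
# finetune_subset = [img for img, s in zip(old_images, sims) if s > threshold]
```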
Obviously, finalizing the model will still involve at least one epoch over the whole dataset, but doing that n times might not be viable, while the similarity method will be: you can just adjust the similarity threshold.