Comments


crt09 t1_j70nrk5 wrote

You could probably take the output before the classification layer, feed it into an SVM, and just train the SVM on the class you're looking for.
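
A minimal sketch of that, assuming a torchvision ResNet as the backbone and scikit-learn's SVC (the data variables are placeholders):

```python
import torch
import torchvision.models as models
from sklearn.svm import SVC

# Assumed backbone: any trained network with its classification head removed.
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
backbone.fc = torch.nn.Identity()  # outputs penultimate-layer features
backbone.eval()

@torch.no_grad()
def extract_features(images):
    # images: (N, 3, H, W) tensor, normalized as the backbone expects
    return backbone(images).numpy()

# X_imgs / y are placeholders for your images and labels (old + new class);
# the SVM is retrained cheaply on features while the network stays frozen.
svm = SVC(kernel="linear")
svm.fit(extract_features(X_imgs), y)
```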

8

SnooHesitations8849 t1_j70q9ld wrote

Look for something like a prototypical network, which uses distance to class prototypes to classify the new class.
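
A rough sketch of that idea: each class is the mean of its support embeddings, and a query is assigned to the nearest prototype, so adding a class only means computing one more mean (names and sizes here are illustrative):

```python
import torch

def nearest_prototype(query_emb, support_embs, support_labels):
    # Prototype = mean embedding of each class's support examples;
    # the query is assigned to the class with the closest prototype.
    classes = support_labels.unique()
    protos = torch.stack([support_embs[support_labels == c].mean(dim=0)
                          for c in classes])
    dists = torch.cdist(query_emb.unsqueeze(0), protos).squeeze(0)
    return classes[dists.argmin()].item()

# Illustrative usage with random 64-d embeddings for 3 classes:
embs = torch.randn(30, 64)
labels = torch.arange(3).repeat_interleave(10)
print(nearest_prototype(torch.randn(64), embs, labels))
```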

1

A_Again t1_j716fo0 wrote

You could always correlate the existing weights with the existing classes in the dataset and wipe the lowest-N correlated weights from each layer while adding a new output with new weights. This could catastrophically impact performance, but it would also guarantee you minimize the impact on existing classes ...

I work with AI, but I can't guarantee this works, since you have no notion of how weights earlier in the network impact later layers ...

1

suflaj t1_j71c8j2 wrote

Generally, no. It would be better to just use all the classes you need now, and then use masks to regulate which classes are being tested at a given moment. The thing you are suggesting, even when done correctly, would not let the model learn about the relationships between different classes.
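
One way to read the masking idea — a minimal sketch where room for future classes is reserved up front and inactive logits are masked to -inf (all sizes and class ids here are illustrative):

```python
import torch
import torch.nn.functional as F

NUM_TOTAL_CLASSES = 100  # reserve logits for future classes up front
head = torch.nn.Linear(512, NUM_TOTAL_CLASSES)  # stand-in for a real model

def masked_cross_entropy(features, targets, active_classes):
    # Mask out logits of classes with no data yet, so the softmax
    # only competes among the currently active classes.
    logits = head(features)
    mask = torch.full_like(logits, float("-inf"))
    mask[:, active_classes] = 0.0
    return F.cross_entropy(logits + mask, targets)

# Example: only classes 0-9 exist so far.
loss = masked_cross_entropy(torch.randn(8, 512),
                            torch.randint(0, 10, (8,)),
                            list(range(10)))
```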

With neural network surgery, it's trivial to downscale, but fairly hard to upscale.

One thing you could test, e.g., is to cluster your images with vanilla pretrained ResNet features. Then, once you need to add new classes, you can look at which images from the new class are the most similar to the ones from existing classes, and you can maybe get away with only fine-tuning on that subset instead of the whole dataset.

Obviously, finalization will include doing at least one epoch on the whole dataset, but that might not be viable to do n times, while the similarity method will be: you can just adjust the similarity threshold.
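
One possible sketch of that similarity filter, assuming a vanilla pretrained ResNet as the feature extractor (the function name and threshold are illustrative):

```python
import torch
import torchvision.models as models

backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
backbone.fc = torch.nn.Identity()  # keep features, drop the classifier
backbone.eval()

@torch.no_grad()
def select_finetune_subset(new_imgs, existing_imgs, threshold=0.8):
    # Cosine similarity between each new-class image and the existing data.
    new_f = torch.nn.functional.normalize(backbone(new_imgs), dim=1)
    old_f = torch.nn.functional.normalize(backbone(existing_imgs), dim=1)
    sims = new_f @ old_f.T  # (num_new, num_existing)
    # Keep the new-class images most similar to the existing classes;
    # lowering the threshold grows the subset, raising it shrinks it.
    keep = sims.max(dim=1).values >= threshold
    return new_imgs[keep]
```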

3

Meddhouib10 t1_j72529u wrote

Yes! You only need to change the last classifier layer (and initialize the added weights) to add more outputs, and then further train the model on data containing all the classes (including the new ones).
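
A minimal sketch of that surgery, assuming the final layer is a plain torch.nn.Linear (the helper name is made up):

```python
import torch

def expand_classifier(old_fc: torch.nn.Linear, num_new: int) -> torch.nn.Linear:
    # Widen the final layer: copy the trained weights for the old classes,
    # leave the rows for the new classes at their fresh initialization.
    new_fc = torch.nn.Linear(old_fc.in_features, old_fc.out_features + num_new)
    with torch.no_grad():
        new_fc.weight[:old_fc.out_features] = old_fc.weight
        new_fc.bias[:old_fc.out_features] = old_fc.bias
    return new_fc

# e.g. on a torchvision ResNet: model.fc = expand_classifier(model.fc, 1)
```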

4