omgitsjo t1_ivm2qw6 wrote

Wouldn't an autoencoder run into the same issue? If the dataset is mostly zeros, every loss function I can think of would collapse in the same way. PCA could be an option, but it's disappointing to introduce it into what is otherwise a pure UNet architecture.
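
To make that concrete, here's a rough sketch (assuming PyTorch and a made-up target where only ~1% of pixels carry signal) of why the trivial all-zeros output already looks nearly optimal to a plain pixel-wise loss:

```python
import torch
import torch.nn.functional as F

# Hypothetical two-channel UV-style target; roughly 1% of pixels are nonzero.
target = torch.zeros(1, 2, 256, 256)
target[..., :26, :26] = torch.rand(1, 2, 26, 26)

# Predicting all zeros everywhere already gives a tiny MSE (~0.003),
# so gradient descent has little incentive to move away from it.
all_zeros = torch.zeros_like(target)
print(F.mse_loss(all_zeros, target))
```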

1

omgitsjo t1_ive49sz wrote

Is there a good sparse loss function that also handles regression? I have what basically amounts to an image-to-image problem, but the output is a dense UV map (red channel goes from 0-255, green from 0-255). Most of the image is "no signal", so MSE tends to just predict all zeros after a while. I can't split the image into multiple channels, because a softmax over 256 values for red and 256 more for green would make me OOM. I might try narrowing it down to just 16 quantized channels each, but I'd really rather spit out a two-channel image and do clever losses on that. I'm sure masking has some clever tricks like intersection over union, but those don't seem to handle the regression case, only boolean masks.
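
Not something from the thread, but one masked-MSE sketch that seems to fit this setup. It assumes PyTorch, a two-channel UV target rescaled to [0, 1], and that "no signal" means both channels are exactly zero; the 0.1 background weight is an arbitrary placeholder:

```python
import torch

def masked_mse(pred, target, bg_weight=0.1, eps=1e-6):
    # pred, target: (N, 2, H, W); a pixel counts as "signal" if any target channel is nonzero.
    mask = (target.abs().sum(dim=1, keepdim=True) > 0).float()

    # Average the squared error over signal pixels only, so the empty
    # background can't swamp the gradient.
    signal = (((pred - target) ** 2) * mask).sum() / (mask.sum() + eps)

    # Keep a small penalty on the background so the net still learns to emit zeros there.
    background = ((pred * (1.0 - mask)) ** 2).sum() / ((1.0 - mask).sum() + eps)

    return signal + bg_weight * background
```

The idea is just that normalizing the error by the number of signal pixels keeps the loss informative even when most of the image is empty; the quantized-channel route mentioned above would sidestep the collapse too, but at the memory cost described.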

1