
Blutorangensaft OP t1_j632b2s wrote

Using slightly different sentences to be decoded to the same sentence exists as an idea in the form of denoising autoencoders, yes. I plan to use this down the road, but for now I am interested in thinking about measuring performance.

1

crt09 t1_j633u7c wrote

I think there's a miscommunication: it sounds like you think I'm proposing a training method, but I'm suggesting how to measure smoothness.

If you have the BLEU distances between input sentences and the distances between their latents, you can measure how the distances correspond between the two, which I *think* would indicate smoothness. Or you could do some other measurements on the latents to see how smoothly(?) they are distributed? tbh I'm not entirely sure what you mean by smooth, sorry.
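The distance-comparison idea above might look roughly like the sketch below. Everything in it is hypothetical: the toy sentences and latent vectors are made up, a crude unigram-overlap score stands in for real BLEU, and the rank correlation is computed by hand just to keep it self-contained.

```python
# Sketch: check whether latent distances track sentence distances.
# A high rank correlation between the two distance lists would suggest
# the latent space varies "smoothly" with the text.
from itertools import combinations
import math

def unigram_distance(a, b):
    """Crude stand-in for a BLEU-based distance: 1 minus unigram F1 overlap."""
    ta, tb = a.split(), b.split()
    if not ta or not tb:
        return 1.0
    overlap = len(set(ta) & set(tb))
    prec, rec = overlap / len(tb), overlap / len(ta)
    if prec + rec == 0:
        return 1.0
    return 1.0 - 2 * prec * rec / (prec + rec)

def euclidean(u, v):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(u, v)))

def spearman(xs, ys):
    """Spearman rank correlation (no tie correction; fine for a sketch)."""
    def ranks(vals):
        order = sorted(range(len(vals)), key=lambda i: vals[i])
        r = [0.0] * len(vals)
        for rank, i in enumerate(order):
            r[i] = float(rank)
        return r
    rx, ry = ranks(xs), ranks(ys)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = math.sqrt(sum((a - mx) ** 2 for a in rx))
    sy = math.sqrt(sum((b - my) ** 2 for b in ry))
    return cov / (sx * sy)

# Toy data: sentences paired with invented latent vectors (in practice
# the latents would come from your encoder).
sentences = ["the cat sat", "the cat slept",
             "dogs bark loudly", "the dogs bark loudly"]
latents = [[0.1, 0.2], [0.15, 0.25], [0.9, 0.8], [0.8, 0.7]]

text_d, latent_d = [], []
for i, j in combinations(range(len(sentences)), 2):
    text_d.append(unigram_distance(sentences[i], sentences[j]))
    latent_d.append(euclidean(latents[i], latents[j]))

print(round(spearman(text_d, latent_d), 3))
```

In practice you'd swap in a proper BLEU implementation (e.g. `nltk.translate.bleu_score.sentence_bleu`) and your actual encoder, and compute the correlation over many sentence pairs rather than four toy examples.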

If you're looking to measure performance, wouldn't the loss from the training method you mentioned be useful?

Or are you looking for measuring performance on decoding side?

1

Blutorangensaft OP t1_j6344jf wrote

Ahh, I get you now, my apologies. Yes, I'm more interested in performance on the decoding side, because I want to later generate sentences in that latent space with another neural net and have them decoded back into normal tokens.

1