
advstra t1_iw4in0h wrote

People are making fun of you, but this is exactly how CS papers sound (literally the first sentence of the abstract: "Neural networks embed the geometric structure of a data manifold lying in a high-dimensional space into latent representations."). And from what I could understand, you actually weren't that far off?

6

TheLastVegan t1_iw5a72q wrote

I was arguing that the paper's proposal could improve scaling by addressing the symptoms of lossy training methods, and suggested that weighted stochastics can already do this with style vectors.

6

advstra t1_iw6n49p wrote

So from a quick skim of the paper, they're suggesting a new method for data representation (pairwise similarities), and you suggest that adding style vectors (which, as far as I know, is essentially another representation method) can improve it for multimodal tasks? I think that makes sense. It reminds me of contextual word embeddings, if I didn't misunderstand anything.
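To make "pairwise similarities as a data representation" concrete: one common reading is that each data point gets represented by its row in a similarity (Gram-style) matrix over the dataset, rather than by its raw coordinates. This is a minimal sketch of that idea, assuming cosine similarity — the function name and the toy data are my own illustration, not taken from the paper.

```python
import numpy as np

def pairwise_cosine_similarity(X):
    """Return the n x n matrix of cosine similarities between rows of X."""
    norms = np.linalg.norm(X, axis=1, keepdims=True)
    Xn = X / np.clip(norms, 1e-12, None)  # guard against zero-norm rows
    return Xn @ Xn.T

# Toy data: three points in 2-D.
X = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])

S = pairwise_cosine_similarity(X)
# Each row of S describes a point purely by how similar it is to every
# other point (values in [-1, 1], diagonal entries equal to 1), which is
# the sense in which pairwise similarities can serve as a representation.
```

Under this reading, "adding style vectors" would amount to concatenating or conditioning on another learned representation alongside the similarity rows, which is why the comparison to contextual embeddings seems apt.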

2