Comments


Education-Sea t1_iu4txlg wrote

> PFGMs constitute an exciting foundation for new avenues of research, especially given that they are 10-20 times faster than Diffusion Models on image generation tasks, with comparable performance.

Oh this is great.

36

SleekEagle OP t1_iu569xx wrote

I don't think the paper explicitly says anything about this, but I would expect them to be similar; if anything, I would imagine they would require less memory, not more. That having been said, if you're thinking of e.g. DALL-E 2 or Stable Diffusion, those models also have components that PFGMs don't (like text-encoding networks), so it makes sense that they are larger!

4

SleekEagle OP t1_iu5h5tt wrote

I'm not sure how the curse of dimensionality would affect PFGMs relative to Diffusion Models, but at the very least PFGMs could be dropped in as the base model in Imagen while diffusion models are kept for the super-resolution chain (see the sketch below)! More info on that here, or more info on Imagen here (or how to build your own Imagen here ;) ).
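A rough sketch of what that swap could look like. Every name here is a hypothetical stand-in (random arrays instead of real networks); it only shows the cascade structure, under the assumption of an Imagen-style 64 → 256 → 1024 pipeline:

```python
import numpy as np

# Hypothetical stand-ins for the real networks: the base generator could be
# a PFGM (or a diffusion model), while the super-resolution stages are the
# diffusion models kept from the original cascade.
def base_model(text_embedding):
    return np.random.rand(64, 64, 3)      # 64x64 RGB sample (placeholder)

def super_resolve(image, factor):
    # Nearest-neighbor upscaling stands in for a diffusion super-res model.
    return image.repeat(factor, axis=0).repeat(factor, axis=1)

text_embedding = np.random.rand(512)      # placeholder text encoding
img_64 = base_model(text_embedding)       # swap a PFGM in here only
img_256 = super_resolve(img_64, 4)        # 64 -> 256
img_1024 = super_resolve(img_256, 4)      # 256 -> 1024
print(img_1024.shape)                     # (1024, 1024, 3)
```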

2

Llort_Ruetama t1_iu5wd9x wrote

Reading the title made me realize how insane it is that we're able to pass electricity through sand in such a way that it generates art (AI-generated art)

7

HydrousIt t1_iu6m0g8 wrote

A flow model is more VRAM-efficient and quicker at image generation, although this sometimes comes at the cost of lower image realism than GANs.

1

HydrousIt t1_iu6mdk9 wrote

A flow model is a type of generative AI. It is an unsupervised learning method, meaning no labels are used. A flow model uses a "flow", analogous to the flow of water, to transform an assumed simple distribution into the data it generates. Flow models are less VRAM-intensive and faster at generating images, even though GANs are generally more realistic, with more detail in the generated images. Anyone feel free to correct me, and feel free to ask more

8

SleekEagle OP t1_iued3mr wrote

To generate data, you need to know the probability distribution of a dataset, which is in general unknown. The method called "normalizing flows" starts with a simple distribution that we do know exactly and learns how to turn that simple distribution into the data distribution through a series of transformations. If we know these transformations, then we can generate data from the data distribution by sampling from the simple distribution and passing the samples through the transformations.
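As a toy illustration of that idea (not from the paper): a single fixed affine map stands in for the learned chain of transformations, and the change-of-variables formula recovers the exact density of the transformed samples:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simple base distribution we know exactly: a standard 1-D Gaussian.
z = rng.standard_normal(10_000)

# One invertible transformation x = f(z) = a*z + b. A learned flow chains
# many such (nonlinear) transformations; a and b are fixed here for clarity.
a, b = 2.0, 5.0
x = a * z + b                         # samples now follow N(5, 4)

def log_px(x):
    # Change of variables: p_x(x) = p_z(f^{-1}(x)) * |d f^{-1} / dx|
    z = (x - b) / a                   # invert the transformation
    log_pz = -0.5 * (z**2 + np.log(2 * np.pi))
    return log_pz - np.log(abs(a))    # log |det Jacobian| correction

print(x.mean(), x.std())              # ~5.0, ~2.0
```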

Normalizing flows are a general approach to generative AI; how the transformations are learned, and what they look like, depends on the particular method. With PFGMs, the authors find that the laws of physics define these transformations. If we start with a simple distribution, we can transform it into the data distribution by imagining the data points are electrons and moving samples according to the electric field those charges generate.
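A 2-D cartoon of that field-following idea, heavily simplified: the actual PFGM augments the data with an extra dimension and solves a principled ODE, whereas here the charges, step size, and step count are all arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(size=(200, 2))      # toy "data" points acting as charges

def field(x, charges):
    # Mean of (x - c) / ||x - c||^2: a 2-D Coulomb-like field that points
    # away from the data points.
    diff = x - charges
    dist = np.linalg.norm(diff, axis=1, keepdims=True) + 1e-6
    return (diff / dist**2).mean(axis=0)

# Sample from a "simple" distribution far away (a big circle) and follow
# the field backward with small Euler steps, drifting toward the data.
theta = rng.uniform(0, 2 * np.pi)
x = 50.0 * np.array([np.cos(theta), np.sin(theta)])
for _ in range(4000):
    x -= 0.5 * field(x, data)

print(x)    # ends up in the cloud of data points
```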

2