Submitted by manli29 t3_yb851q in deeplearning

Has anyone tried training a model with two GANs chained one after another? I realize it'll be computationally heavy, but I'm wondering if it gives good results. Do let me know. TIA.

0

Comments


TheRealSerdra t1_itfiwui wrote

What exactly do you want to do that requires two GANs? And are you planning on just chaining the generators?

1

Yeinstein20 t1_itfpan2 wrote

I feel like I've read a paper where they do something similar to this, but I'm not completely sure. I'll try to find it.

Edit: maybe remind me of that in case I forget about it

1

kaarrrlll t1_itfyuqj wrote

Having two GANs is not a problem in itself; the idea has been around for a while, albeit for a different purpose (CycleGAN). What's important is that your loss and/or other constraints are precise enough to prevent one GAN from learning both tasks while the other collapses to the identity mapping. It also doubles the concerns about training instability. Good luck!
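
A minimal PyTorch sketch of the chained setup (all names are illustrative, not from the thread; the intermediate L1 term is one way to keep the first generator from collapsing to the identity, assuming paired clean B&W targets exist):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Two toy generators: G1 restores a damaged B&W image, G2 colorizes it.
G1 = nn.Sequential(nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
                   nn.Conv2d(16, 1, 3, padding=1))   # grayscale -> grayscale
G2 = nn.Sequential(nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
                   nn.Conv2d(16, 3, 3, padding=1))   # grayscale -> RGB

damaged = torch.rand(8, 1, 64, 64)   # batch of damaged B&W inputs (dummy data)
clean_bw = torch.rand(8, 1, 64, 64)  # paired clean B&W targets (dummy data)

restored = G1(damaged)
colorized = G2(restored)

# Supervising the intermediate output constrains the split of labor: without a
# term like this, one generator can do all the work while the other drifts
# toward (near-)identity.
recon_loss = F.l1_loss(restored, clean_bw)
# ...add the adversarial terms from the discriminator(s) on `colorized` here.
```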

2

beingsubmitted t1_itht9o5 wrote

I don't see how you would train them that way: you can't use the output of a discriminator as the input of a generator, so chaining alone wouldn't get you what you want. You could train them in parallel, with one generator/discriminator pair doing only B&W restoration and the other doing only colorization.

Because of the way images and the eye work (part of the science behind why JPEG is so effective), we're much more sensitive to luminance information than to color information. You could take the output of the colorization generator in HSL color space and replace its luminance channel with that of the generated restored photo. Doing it this way, you could also force the separation of the two generators using only one discriminator: one generator affects only the hue and saturation of the final image, and the other only the luminance.
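
A minimal sketch of that luminance swap, using OpenCV's HLS color space (file names are hypothetical; assumes both generator outputs are the same size):

```python
import cv2

colorized = cv2.imread("colorized.png")                      # colorization output (BGR)
restored = cv2.imread("restored.png", cv2.IMREAD_GRAYSCALE)  # restoration output (luminance)

# In OpenCV's HLS layout the channels are (H, L, S), so index 1 is lightness.
hls = cv2.cvtColor(colorized, cv2.COLOR_BGR2HLS)
hls[:, :, 1] = restored
merged = cv2.cvtColor(hls, cv2.COLOR_HLS2BGR)
cv2.imwrite("merged.png", merged)
```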

That said, with the more recent breakthroughs, it seems that networks are proving more successful as generalists than as specialists. For example, it's believed that Whisper performs better on each individual language because it's trained on all languages, counter-intuitive as that may seem.

1