Train your GANs Faster and Achieve Better Results
GANs have been popular ever since they were introduced back in 2014, and they have produced some amazing results in domains ranging from images to video and even audio.
When you read about how GANs work, they seem very intuitive and not that hard to train. It is only when you start training one that you realize how hard it is to achieve good results with your GAN architecture, and I learned this the hard way.
I trained my first ever GAN as part of a contest on Kaggle, where the task was to generate new, unseen images of dogs from a given set of 20,000 images. I gladly entered the competition, thinking: how hard could it be? But once I trained my first model and analyzed the results, I realized it is not as simple as it looks. As I progressed through the competition, I participated in various discussions, read the kernels submitted by others, and tried out different approaches. This taught me a lot about training GANs the right way, and I have trained various GANs on several datasets since then.
So in this talk I want to share the tips and tricks that worked for me in achieving good results, so you can use them directly and not have to learn them the hard way as I did.
Outline/Structure of the Experience Report
I plan to cover the following topics in my talk:
- Background Story and information about the Kaggle contest (2 mins)
- Quick Introduction to GANs (2 mins)
- Types of convolutions used in GANs and their effects on final results (1 min)
- Effect of the shape of input noise vector (1 min)
- Effect of Batch Size during training (2 mins)
- Various loss functions (2 mins)
- Mode Collapse and ways to prevent it (4 mins)
- Activation Functions (2 mins)
- Optimizers (1 min)
- Other general tips and tricks (1 min)
- Question / Answers (2 mins)
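To give a flavor of two of the knobs in the outline above, the shape of the input noise vector and one-sided label smoothing (a common trick for stabilizing the discriminator), here is a minimal NumPy sketch. The function names, the latent dimension of 100, and the smoothing value of 0.9 are my own illustrative choices, not code from the talk:

```python
import numpy as np

def sample_latent(batch_size, latent_dim, seed=None):
    """Sample a batch of noise vectors; most GAN generators take a
    (batch_size, latent_dim) Gaussian input."""
    rng = np.random.default_rng(seed)
    return rng.standard_normal((batch_size, latent_dim))

def smoothed_labels(batch_size, smooth=0.9):
    """One-sided label smoothing: real labels become `smooth`
    (e.g. 0.9) instead of 1.0, while fake labels stay at 0.0."""
    real = np.full(batch_size, smooth)
    fake = np.zeros(batch_size)
    return real, fake

z = sample_latent(64, 100)       # 64 noise vectors of dimension 100
real, fake = smoothed_labels(64)
print(z.shape)                   # prints (64, 100)
print(real[0], fake[0])          # prints 0.9 0.0
```

Softening only the real labels keeps the discriminator from becoming overconfident without rewarding the generator for producing obviously fake samples.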
By the end of this talk, attendees will have better intuition about what can go wrong while training GANs, how to prevent it, how changing the parameters affects the final results, and how to achieve the desired results quickly.
Hopefully the participants will be motivated to use GANs for new creative applications and to participate in Kaggle contests. And when they later see some amazing GAN-based work published in a research paper, they will appreciate how much hard work the researchers put in to achieve those results, and be curious to learn how they did it.
Target Audience
Anyone who has a basic understanding of how GANs work, or who has worked with neural networks in any language or framework
Prerequisites for Attendees
To get the most out of the talk, it helps if attendees have a basic understanding of how GANs work, although it's not necessary.