Introduction

The key issue with generative tasks is deciding what a good cost function should be. GANs (Generative Adversarial Networks) address this with two networks: a generator network that creates fake samples, and a discriminator network that distinguishes them from real samples.
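To make the setup concrete, below is a minimal sketch of one adversarial training step, assuming PyTorch. The toy generator, discriminator, and the train_step helper are hypothetical stand-ins for illustration, not the talk's implementation.

```python
import torch
import torch.nn as nn

# Toy networks (hypothetical): the generator maps 16-d noise to a 2-d
# "sample"; the discriminator maps a sample to a real-vs-fake logit.
G = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 2))
D = nn.Sequential(nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, 1))

bce = nn.BCEWithLogitsLoss()
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)

def train_step(real):                      # real: (batch, 2) real samples
    fake = G(torch.randn(real.size(0), 16))
    ones = torch.ones(real.size(0), 1)
    zeros = torch.zeros(real.size(0), 1)

    # Discriminator step: label real samples 1 and generated samples 0.
    opt_d.zero_grad()
    loss_d = bce(D(real), ones) + bce(D(fake.detach()), zeros)
    loss_d.backward()
    opt_d.step()

    # Generator step: try to make the discriminator output 1 on fakes.
    opt_g.zero_grad()
    loss_g = bce(D(fake), ones)
    loss_g.backward()
    opt_g.step()

train_step(torch.randn(64, 2))             # one step on dummy "real" data
```

The two losses play a minimax game: the discriminator's score effectively becomes the generator's learned cost function.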

GANs have been applied predominantly to images, where they work well because pixel values are continuous. For the same reason, they cannot be used directly for text generation: text is a sequence of discrete tokens, and sampling a discrete token is not differentiable.
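The obstacle is easy to demonstrate: turning the generator's scores into an actual token requires a discrete sampling step, and no gradient flows through it. A minimal sketch, assuming PyTorch (the 10-word vocabulary is a hypothetical example):

```python
import torch

# Hypothetical generator output: scores over a 10-word vocabulary.
logits = torch.randn(1, 10, requires_grad=True)

# Producing a concrete token means sampling a discrete index.
token = torch.multinomial(logits.softmax(dim=-1), num_samples=1)

print(token.dtype)          # torch.int64 -- an integer index, not a float
print(token.requires_grad)  # False: the graph is cut at the sampling step,
                            # so the discriminator's gradient never reaches
                            # the generator's parameters.
```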

This talk will cover recent breakthroughs in applying adversarial networks to language generation.

 
 

Outline/Structure of the Talk

The talk will focus on the following major recent developments in adversarial networks for natural language generation:

  1. SeqGAN: policy-gradient reinforcement learning for sequence generation (a minimal sketch follows this list)
  2. LeakGAN: long text generation with leaked information
  3. The re-parameterization trick for latent variables
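As a taste of item 1, here is a minimal sketch of a SeqGAN-style policy-gradient (REINFORCE) update, assuming PyTorch. The tiny GRU generator and the constant stand-in reward (in place of the discriminator's score for the sampled sequence) are hypothetical illustrations:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

vocab, hidden = 10, 32                       # hypothetical sizes
cell = nn.GRUCell(vocab, hidden)             # toy autoregressive generator
to_logits = nn.Linear(hidden, vocab)
opt = torch.optim.Adam(list(cell.parameters()) + list(to_logits.parameters()))

h = torch.zeros(1, hidden)
x = torch.zeros(1, vocab)                    # start token (all-zeros input)
log_probs = []
for _ in range(5):                           # sample a 5-token sequence
    h = cell(x, h)
    dist = torch.distributions.Categorical(logits=to_logits(h))
    tok = dist.sample()                      # discrete, non-differentiable step
    log_probs.append(dist.log_prob(tok))     # but its log-prob IS differentiable
    x = F.one_hot(tok, vocab).float()        # feed the sample back in

reward = torch.tensor(1.0)                   # stand-in for D's score of the sequence

# REINFORCE: scale the sequence log-likelihood by the reward, so the
# generator is trained without backpropagating through the samples.
loss = -reward * torch.stack(log_probs).sum()
opt.zero_grad()
loss.backward()
opt.step()
```

This sidesteps the discreteness problem above: gradients flow through the token log-probabilities rather than the tokens themselves.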

Applications: the following tasks will be covered in the next section:

  • GANs for machine translation
  • GANs for dialogue generation
  • GANs for style transfer

Demo: a demonstration of a language generation application, with code.

Learning Outcome

Learn the state of the art in natural language generation and its practical applications.

Target Audience

Data scientists, researchers, and students.

Prerequisites for Attendees

Basic familiarity with deep learning and natural language processing.

Submitted 1 month ago

Public Feedback

  • Dr. Vikas Agrawal ~ 1 month ago

    Dear Rajib: This is a great topic. Could you please consider uploading a video of your previous talks or an intro to the topic as well? Warm Regards, Vikas

    • Rajib Biswas ~ 3 weeks ago

      Hi Vikas, the video has been uploaded. Thanks for the comment.