ARAE

TensorFlow implementation of Adversarially Regularized Autoencoders for Generating Discrete Structures (ARAE).

While the paper used the Stanford Natural Language Inference dataset for text generation, this implementation uses the MNIST dataset. The continuous version of the model is implemented here; the discrete version is included as a footnote in the code.
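The continuous ARAE setup alternates three updates: an autoencoder reconstruction step, a WGAN-style critic step on the latent codes, and a generator/encoder step that pushes the two code distributions together. A minimal NumPy sketch of these three losses (toy linear layers and made-up dimensions; the repo's actual model is a TensorFlow graph, and all variable names here are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (illustrative, not the repo's hyperparameters).
x_dim, z_dim, n = 784, 100, 8   # MNIST pixels, code size, batch size

# Hypothetical linear encoder, decoder, generator, and critic weights.
W_enc = rng.normal(0, 0.01, (x_dim, z_dim))
W_dec = rng.normal(0, 0.01, (z_dim, x_dim))
W_gen = rng.normal(0, 0.01, (z_dim, z_dim))
w_crit = rng.normal(0, 0.01, (z_dim, 1))

x = rng.random((n, x_dim))            # stand-in for a batch of images
noise = rng.normal(0, 1, (n, z_dim))  # noise fed to the generator

# 1) Autoencoder step: reconstruct x through the code c = enc(x).
c_real = x @ W_enc
x_hat = c_real @ W_dec
recon_loss = np.mean((x - x_hat) ** 2)

# 2) Critic step: score real codes against generated codes (WGAN-style,
#    critic tries to maximize the gap, i.e. minimize this loss).
c_fake = noise @ W_gen
critic_loss = np.mean(c_fake @ w_crit) - np.mean(c_real @ w_crit)

# 3) Generator step: fool the critic so fake codes look like real ones.
gen_loss = -np.mean(c_fake @ w_crit)
```

In the actual training loop these three losses are minimized in turn with separate optimizers, one per parameter group.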

Dependencies

  1. tensorflow == 1.0.0
  2. numpy == 1.12.0
  3. matplotlib == 1.3.1

Steps

Run the following command to train the model for image reconstruction.


python train.py

Results

  • The model was trained for 100,000 steps

  • Generated from fake (noise) data

Samples generated from noise tend to show several digit shapes overlapping in a single image.
  • Generated from real data

Notes

I did not multiply the critic gradient by a scaling factor before backpropagating it to the encoder, as the original paper does.
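To make the note above concrete, here is a NumPy sketch of what that scaling would look like: the critic's gradient with respect to the encoder weights is computed via the chain rule and then rescaled before the encoder update. Setting the scale to 1.0 recovers what this repo does. All shapes and names are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.random((4, 8))                 # toy batch
W_enc = rng.normal(0, 0.1, (8, 3))    # toy encoder weights
w_crit = rng.normal(0, 0.1, (3, 1))   # toy linear critic

# Encoder-side critic loss on real codes: L = -mean(c @ w_crit),
# where c = x @ W_enc. Chain rule down to the encoder weights:
grad_c = -np.ones((4, 1)) @ w_crit.T / 4   # dL/dc, shape (4, 3)
grad_W_enc = x.T @ grad_c                  # dL/dW_enc, shape (8, 3)

# The paper rescales this gradient before the encoder update;
# with scale = 1.0 the gradient passes through unchanged.
scale = 0.1
grad_W_enc_scaled = scale * grad_W_enc
```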