1.3.4
This release comes with several new features, alongside a significant push for better documentation, examples, and unit testing.
- `ed.KLqp`'s score function gradient now performs more intelligent (automatic) Rao-Blackwellization for variance reduction.
- Automated transformations are enabled for all inference algorithms that benefit from them [tutorial].
- Added the Wake-Sleep algorithm (`ed.WakeSleep`).
- Many minor bug fixes.
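The variance-reduction idea behind the first item above can be illustrated outside Edward. Below is a minimal NumPy sketch (not Edward's internals, and not its API) of Rao-Blackwellizing a score-function gradient: for a mean-field variational family, the gradient with respect to one factor's parameter only needs the terms of the objective in that factor's Markov blanket, and dropping the independent terms lowers the estimator's variance without introducing bias. All names in the snippet are illustrative.

```python
import numpy as np

# Target: d/dmu1 E_q[f(z)] with f(z) = z1 + z2 and
# q(z) = N(z1; mu1, 1) * N(z2; mu2, 1).
# Score of the z1 factor: d/dmu1 log N(z1; mu1, 1) = (z1 - mu1).
# The true gradient is Var(z1) = 1.0.
rng = np.random.default_rng(0)
mu1, mu2 = 0.5, -1.0
n = 100_000
z1 = rng.normal(mu1, 1.0, n)
z2 = rng.normal(mu2, 1.0, n)
score1 = z1 - mu1

# Naive score-function estimator: multiplies the score by the full
# f(z), including the z2 term, which is independent of z1 and only
# adds noise.
naive = (z1 + z2) * score1

# Rao-Blackwellized estimator: keeps only the part of f(z) that
# depends on z1 (its Markov blanket in this fully factorized model).
rb = z1 * score1

# Both sample means approach the true gradient 1.0, but the
# Rao-Blackwellized estimator has strictly lower variance.
print(naive.mean(), naive.var())
print(rb.mean(), rb.var())
```

With this seed and sample size the two means agree near 1.0 while the Rao-Blackwellized variance is noticeably smaller, which is exactly the effect the release note describes `ed.KLqp` now exploiting automatically.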
Examples
- All Edward examples now rely on the Observations library for data loading (no "official" public release yet; still in alpha).
- Added LSTM language model for text8 (`examples/lstm.py`).
- Added deep exponential family for modeling topics in NIPS articles (`examples/deep_exponential_family.py`).
- Added sigmoid belief network for Caltech-101 silhouettes (`examples/sigmoid_belief_network.py`).
- Added stochastic blockmodel on the Karate club network (`examples/stochastic_block_model.py`).
- Added Cox process on synthetic spatial data (`examples/cox_process.py`).
Documentation & Testing
- Sealed all undocumented functions and modules in Edward.
- Added a parser and BibTeX support to auto-generate API docs.
- Added unit tests for (nearly) all Jupyter notebooks.
Acknowledgements
- Thanks go to Matthew Feickert (@matthewfeickert), Alp Kucukelbir (@akucukelbir), Romain Lopez (@romain-lopez), Emile Mathieu (@emilemathieu), Stephen Ra (@stephenra), Kashif Rasul (@kashif), Philippe Rémy (@philipperemy), Charles Shenton (@cshenton), Yuto Yamaguchi (@yamaguchiyuto), @evahlis, @samnolen, @seiyab.
We are also grateful to all who filed issues or helped resolve them, asked and answered questions, and were part of inspiring discussions.