From a59d33421cc90c651a148cba2f0d2c3d6920d332 Mon Sep 17 00:00:00 2001 From: "Haoyu (Daniel) YANG" Date: Sun, 22 Sep 2024 22:28:56 +0800 Subject: [PATCH] Change `authors` to `list[dict['name' | 'url' | ..., str]]` (#70) * insert shebang and script check * pre-commit run * convert to dict of name and url * convert more to dict * convert publications * remove comma * Update data/packages.yml Co-authored-by: Meesum Qazalbash * fix __main__ loop and define class Author(TypedDict) TODO author validation * make script path absolute * revert reposition of module level vars * reapply somehow disappeared abs dir change * raise ValueError on non-https author URLs fix yaml whitespace --------- Co-authored-by: Meesum Qazalbash Co-authored-by: Janosh Riebesell --- data/applications.yml | 68 +++++++-- data/make_readme.py | 201 ++++++++++++++----------- data/packages.yml | 77 ++++++---- data/posts.yml | 17 ++- data/publications.yml | 336 ++++++++++++++++++++++++++++++++++-------- data/repos.yml | 72 +++++---- data/videos.yml | 36 +++-- readme.md | 4 +- 8 files changed, 577 insertions(+), 234 deletions(-) mode change 100644 => 100755 data/make_readme.py diff --git a/data/applications.yml b/data/applications.yml index 0bbade8..56eb4c0 100644 --- a/data/applications.yml +++ b/data/applications.yml @@ -1,47 +1,97 @@ - title: Latent Space Policies for Hierarchical Reinforcement Learning url: https://arxiv.org/abs/1804.02808 date: 2018-04-09 - authors: Tuomas Haarnoja, Kristian Hartikainen, Pieter Abbeel, Sergey Levine + authors: + - name: Tuomas Haarnoja + - name: Kristian Hartikainen + - name: Pieter Abbeel + - name: Sergey Levine description: Uses normalizing flows, specifically RealNVPs, as policies for reinforcement learning and also applies them for the hierarchical reinforcement learning setting. 
- title: Analyzing Inverse Problems with Invertible Neural Networks url: https://arxiv.org/abs/1808.04730 date: 2018-08-14 - authors: Lynton Ardizzone, Jakob Kruse, Sebastian Wirkert, Daniel Rahner, Eric W. Pellegrini, Ralf S. Klessen, Lena Maier-Hein, Carsten Rother, Ullrich Köthe + authors: + - name: Lynton Ardizzone + - name: Jakob Kruse + - name: Sebastian Wirkert + - name: Daniel Rahner + - name: Eric W. Pellegrini + - name: Ralf S. Klessen + - name: Lena Maier-Hein + - name: Carsten Rother + - name: Ullrich Köthe description: Normalizing flows for inverse problems. - title: NeuTra-lizing Bad Geometry in Hamiltonian Monte Carlo Using Neural Transport url: https://arxiv.org/abs/1903.03704 date: 2019-03-09 - authors: Matthew Hoffman, Pavel Sountsov, Joshua V. Dillon, Ian Langmore, Dustin Tran, Srinivas Vasudevan + authors: + - name: Matthew Hoffman + - name: Pavel Sountsov + - name: Joshua V. Dillon + - name: Ian Langmore + - name: Dustin Tran + - name: Srinivas Vasudevan description: Uses normalizing flows in conjunction with Monte Carlo estimation to have more expressive distributions and better posterior estimation. -- title: 'SRFlow: Learning the Super-Resolution Space with Normalizing Flow' +- title: "SRFlow: Learning the Super-Resolution Space with Normalizing Flow" url: https://arxiv.org/abs/2006.14200 date: 2020-06-25 - authors: Andreas Lugmayr, Martin Danelljan, Luc Van Gool, Radu Timofte + authors: + - name: Andreas Lugmayr + - name: Martin Danelljan + - name: Luc Van Gool + - name: Radu Timofte description: Uses normalizing flows for super-resolution. - title: Faster Uncertainty Quantification for Inverse Problems with Conditional Normalizing Flows url: https://arxiv.org/abs/2007.07985 date: 2020-07-15 - authors: Ali Siahkoohi, Gabrio Rizzuti, Philipp A. Witte, Felix J. Herrmann + authors: + - name: Ali Siahkoohi + - name: Gabrio Rizzuti + - name: Philipp A. Witte + - name: Felix J. 
Herrmann description: Uses conditional normalizing flows for inverse problems. [[Video](https://youtu.be/nPvZIKaRBkI)] - title: Targeted free energy estimation via learned mappings url: https://aip.scitation.org/doi/10.1063/5.0018903 date: 2020-10-13 - authors: Peter Wirnsberger, Andrew J. Ballard, George Papamakarios, Stuart Abercrombie, Sébastien Racanière, Alexander Pritzel, Danilo Jimenez Rezende, Charles Blundell + authors: + - name: Peter Wirnsberger + - name: Andrew J. Ballard + - name: George Papamakarios + - name: Stuart Abercrombie + - name: Sébastien Racanière + - name: Alexander Pritzel + - name: Danilo Jimenez Rezende + - name: Charles Blundell description: Normalizing flows used to estimate free energy differences. - title: On the Sentence Embeddings from Pre-trained Language Models url: https://aclweb.org/anthology/2020.emnlp-main.733 date: 2020-11-02 - authors: Bohan Li, Hao Zhou, Junxian He, Mingxuan Wang, Yiming Yang, Lei Li + authors: + - name: Bohan Li + - name: Hao Zhou + - name: Junxian He + - name: Mingxuan Wang + - name: Yiming Yang + - name: Lei Li description: Proposes to use flows to transform anisotropic sentence embedding distributions from BERT to a smooth and isotropic Gaussian, learned through an unsupervised objective. Demonstrates performance gains over SOTA sentence embeddings on semantic textual similarity tasks. Code available at .
- title: Normalizing Kalman Filters for Multivariate Time Series Analysis url: https://assets.amazon.science/ea/0c/88b7bdd54eae8c08983fa4cc3e06/normalizing-kalman-filters-for-multivariate-time-series-analysis.pdf date: 2020-12-06 - authors: Emmanuel de Bézenac, Syama Sundar Rangapuram, Konstantinos Benidis, Michael Bohlke-Schneider, Richard Kurle, Lorenzo Stella, Hilaf Hasson, Patrick Gallinari, Tim Januschowski + authors: + - name: Emmanuel de Bézenac + - name: Syama Sundar Rangapuram + - name: Konstantinos Benidis + - name: Michael Bohlke-Schneider + - name: Richard Kurle + - name: Lorenzo Stella + - name: Hilaf Hasson + - name: Patrick Gallinari + - name: Tim Januschowski description: Augments state space models with normalizing flows and thereby mitigates imprecisions stemming from idealized assumptions. Aimed at forecasting real-world data and handling varying levels of missing data. (Also available at [Amazon Science](https://amazon.science/publications/normalizing-kalman-filters-for-multivariate-time-series-analysis).) 
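The mechanical part of this migration (turning the old comma-separated `authors` strings into the new list-of-dicts schema) can be sketched with a small helper. `convert_authors` is illustrative only, not part of this patch:

```python
def convert_authors(authors: str) -> list[dict[str, str]]:
    # Split the old comma-separated author string into the new schema:
    # one dict per author, so optional keys (url, affiliation, github,
    # orcid) can later be attached to individual authors.
    return [{"name": name.strip()} for name in authors.split(",")]


print(convert_authors("Tuomas Haarnoja, Kristian Hartikainen"))
# → [{'name': 'Tuomas Haarnoja'}, {'name': 'Kristian Hartikainen'}]
```

This keeps the YAML diff purely additive in structure: every `name` value is byte-identical to a segment of the old string, which makes the conversion easy to review.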
diff --git a/data/make_readme.py b/data/make_readme.py old mode 100644 new mode 100755 index 33bb1f5..3a82632 --- a/data/make_readme.py +++ b/data/make_readme.py @@ -1,25 +1,36 @@ +#!/usr/bin/env python3 + """Script to generate readme.md from data/*.yml files.""" import datetime +import os import re -from os.path import dirname from typing import TypedDict import yaml -ROOT = dirname(dirname(__file__)) +ROOT_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__))) + + +class Author(TypedDict): + """An author of a paper or application.""" + + name: str + url: str | None + affiliation: str | None + github: str | None + orcid: str | None class Item(TypedDict): """An item in a readme section like a paper or package.""" title: str - authors: str + authors: list[Author] date: datetime.date lang: str url: str description: str - authors_url: str | None repo: str | None date_added: datetime.date | None @@ -44,7 +55,7 @@ class Section(TypedDict): def load_items(key: str) -> list[Item]: """Load list[Item] from YAML file.""" - with open(f"{ROOT}/data/{key}.yml", encoding="utf8") as file: + with open(f"{ROOT_DIR}/data/{key}.yml", encoding="utf8") as file: return yaml.safe_load(file.read()) @@ -53,10 +64,9 @@ def load_items(key: str) -> list[Item]: for key in titles # markdown is set below } - seen_titles: set[tuple[str, str]] = set() required_keys = {"title", "url", "date", "authors", "description"} -optional_keys = {"authors_url", "lang", "repo", "docs", "date_added", "last_updated"} +optional_keys = {"lang", "repo", "docs", "date_added", "last_updated"} valid_languages = {"PyTorch", "TensorFlow", "JAX", "Julia", "Other"} et_al_after = 2 @@ -72,7 +82,7 @@ def validate_item(itm: Item, section_title: str) -> None: else: seen_titles.add((title, section_title)) - if section_title in ("packages", "repos") and itm["lang"] not in valid_languages: + if section_title in {"packages", "repos"} and itm["lang"] not in valid_languages: errors += [ f"Invalid lang in {title}: 
{itm['lang']}, must be one of {valid_languages}" ] @@ -101,87 +111,100 @@ def validate_item(itm: Item, section_title: str) -> None: raise ValueError("\n".join(errors)) -for key, section in sections.items(): - # Keep lang_names inside sections loop to refill language subsections for each new - # section. Used by both repos and Packages. Is a list for order and mutability. - lang_names = ["PyTorch", "TensorFlow", "JAX", "Julia", "Other"] - - # sort first by language with order determined by lang_names (only applies to - # Package and repos sections), then by date - section["items"].sort(key=lambda x: x["date"], reverse=True) - if key in ("packages", "repos"): - section["items"].sort(key=lambda itm: lang_names.index(itm["lang"])) - - # add item count after section title - section["markdown"] += f" ({len(section['items'])})\n\n" - - for itm in section["items"]: - if (lang := itm.get("lang")) in lang_names: - lang_names.remove(lang) - # print language subsection title if this is the first item with that lang - section["markdown"] += ( - f'
\n\n### {lang}  {lang} {key.title()}\n\n' +if __name__ == "__main__": + for key, section in sections.items(): + # Keep lang_names inside sections loop to refill language + # subsections for each new section. Used by both repos and Packages. + # Is a list for order and mutability. + lang_names = ["PyTorch", "TensorFlow", "JAX", "Julia", "Other"] + + # sort first by language with order determined by lang_names (only applies to + # Package and repos sections), then by date + section["items"].sort(key=lambda x: x["date"], reverse=True) + if key in ("packages", "repos"): + section["items"].sort(key=lambda itm: lang_names.index(itm["lang"])) + + # add item count after section title + section["markdown"] += f" ({len(section['items'])})\n\n" + + for itm in section["items"]: + if (lang := itm.get("lang")) in lang_names: + lang_names.remove(lang) + # print language subsection title if this is the first item + # with that language + section["markdown"] += ( + f'
\n\n### {lang}  {lang} {key.title()}\n\n' + ) + + validate_item(itm, section["title"]) + + authors = itm["authors"] + date = itm["date"] + description = itm["description"] + title = itm["title"] + url = itm["url"] + + if key in ("publications", "applications"): + # only show people's last name for papers + authors = [ + auth | {"name": auth["name"].split(" ")[-1]} for auth in authors + ] + + def auth_str(auth: Author) -> str: + """Return a markdown string for an author.""" + auth_str = auth["name"] + if url := auth.get("url"): + if not url.startswith("https://"): + raise ValueError( + f"Invalid author {url=}, must start with https://" + ) + auth_str = f"[{auth_str}]({url})" + return auth_str + + authors_str = ", ".join(map(auth_str, authors[:et_al_after])) + if len(authors) > et_al_after: + authors_str += " et al." + + md_str = f"1. {date} - [{title}]({url}) by {authors_str}" + + if key in ("packages", "repos") and url.startswith("https://github.com"): + gh_login, repo_name = url.split("/")[3:5] + md_str += ( + f'\n \nGitHub repo stars' + ) + + md_str += "
\n " + description.removesuffix("\n") + if docs := itm.get("docs"): + md_str += f" [[Docs]({docs})]" + if repo := itm.get("repo"): + md_str += f" [[Code]({repo})]" + + section["markdown"] += md_str + "\n\n" + + with open(f"{ROOT_DIR}/readme.md", "r+", encoding="utf8") as file: + readme = file.read() + + for section in sections.values(): + # look ahead without matching + section_start_pat = f"(?<={section['title']})" + # look behind without matching + next_section_pat = "(?=
\n\n## )" + + # match everything up to next heading + readme = re.sub( + rf"{section_start_pat}[\s\S]+?\n\n{next_section_pat}", + section["markdown"], + readme, ) - validate_item(itm, section["title"]) - - authors = itm["authors"] - date = itm["date"] - description = itm["description"] - title = itm["title"] - url = itm["url"] - - author_list = authors.split(", ") - if key in ("publications", "applications"): - # only show people's last name for papers - author_list = [author.split(" ")[-1] for author in author_list] - authors = ", ".join(author_list[:et_al_after]) - if len(author_list) > et_al_after: - authors += " et al." - - if authors_url := itm.get("authors_url"): - authors = f"[{authors}]({authors_url})" - - md_str = f"1. {date} - [{title}]({url}) by {authors}" - - if key in ("packages", "repos") and url.startswith("https://github.com"): - gh_login, repo_name = url.split("/")[3:5] - md_str += ( - f'\n \nGitHub repo stars' - ) - - md_str += "
\n " + description.removesuffix("\n") - if docs := itm.get("docs"): - md_str += f" [[Docs]({docs})]" - if repo := itm.get("repo"): - md_str += f" [[Code]({repo})]" - - section["markdown"] += md_str + "\n\n" - - -with open(f"{ROOT}/readme.md", "r+", encoding="utf8") as file: - readme = file.read() - - for section in sections.values(): - # look ahead without matching - section_start_pat = f"(?<={section['title']})" - # look behind without matching - next_section_pat = "(?=
\n\n## )" - - # match everything up to next heading - readme = re.sub( - rf"{section_start_pat}[\s\S]+?\n\n{next_section_pat}", - section["markdown"], - readme, - ) - - file.seek(0) - file.write(readme) - file.truncate() + file.seek(0) + file.write(readme) + file.truncate() -section_counts = "\n".join( - f"- {key}: {len(sec['items'])}" for key, sec in sections.items() -) -print(f"finished writing {len(seen_titles)} items to readme:\n{section_counts}") # noqa: T201 + section_counts = "\n".join( + f"- {key}: {len(sec['items'])}" for key, sec in sections.items() + ) + print(f"finished writing {len(seen_titles)} items to readme:\n{section_counts}") # noqa: T201 diff --git a/data/packages.yml b/data/packages.yml index 504269b..cccc1ba 100644 --- a/data/packages.yml +++ b/data/packages.yml @@ -1,71 +1,81 @@ - title: FrEIA date: 2018-09-07 url: https://github.com/VLL-HD/FrEIA - authors: VLL Heidelberg - authors_url: https://hci.iwr.uni-heidelberg.de/vislearn + authors: + - name: VLL Heidelberg + url: https://hci.iwr.uni-heidelberg.de/vislearn lang: PyTorch description: The Framework for Easily Invertible Architectures (FrEIA) is based on RNVP flows. Easy to setup, it allows to define complex Invertible Neural Networks (INNs) from simple invertible building blocks. - title: nflows date: 2020-02-09 url: https://github.com/bayesiains/nflows - authors: Bayesiains - authors_url: https://homepages.inf.ed.ac.uk/imurray2/group + authors: + - name: Bayesiains + url: https://homepages.inf.ed.ac.uk/imurray2/group lang: PyTorch description: A suite of most of the SOTA methods using PyTorch. From an ML group in Edinburgh. They created the current SOTA spline flows. Almost as complete as you'll find from a single repo. 
- title: flowtorch date: 2020-12-07 url: https://github.com/facebookincubator/flowtorch - authors: Facebook / Meta - authors_url: https://opensource.fb.com + authors: + - name: Facebook / Meta + url: https://opensource.fb.com lang: PyTorch description: FlowTorch is a PyTorch library for learning and sampling from complex probability distributions using Normalizing Flows. - title: TensorFlow Probability date: 2018-06-22 url: https://github.com/tensorflow/probability - authors: Google - authors_url: https://tensorflow.org/probability + authors: + - name: Google + url: https://tensorflow.org/probability lang: TensorFlow description: Large first-party library that offers RNVP, MAF among other autoregressive models plus a collection of composable bijectors. - title: NuX date: 2020-03-09 url: https://github.com/Information-Fusion-Lab-Umass/NuX - authors: Information Fusion Labs (UMass) + authors: + - name: Information Fusion Labs (UMass) lang: JAX description: A library that offers normalizing flows using JAX as the backend. Has some SOTA methods. They also feature a surjective flow via quantization. - title: jax-flows date: 2020-03-23 url: https://github.com/ChrisWaites/jax-flows - authors: Chris Waites - authors_url: https://chriswaites.com + authors: + - name: Chris Waites + url: https://chriswaites.com lang: JAX description: Another library that has normalizing flows using JAX as the backend. Has some of the SOTA methods. - title: Distrax date: 2021-04-12 url: https://github.com/deepmind/distrax - authors: DeepMind - authors_url: https://deepmind.com + authors: + - name: DeepMind + url: https://deepmind.com + github: https://github.com/google-deepmind lang: JAX description: Distrax is a lightweight library of probability distributions and bijectors. It acts as a JAX-native re-implementation of a subset of TensorFlow Probability (TFP), with some new features and emphasis on extensibility. 
- title: pzflow date: 2021-06-17 url: https://github.com/jfcrenshaw/pzflow - authors: John Franklin Crenshaw - authors_url: https://jfcrenshaw.github.io + authors: + - name: John Franklin Crenshaw + url: https://jfcrenshaw.github.io lang: JAX description: A package that focuses on probabilistic modeling of tabular data, with a focus on sampling and posterior calculation. - title: InvertibleNetworks.jl date: 2020-02-07 url: https://github.com/slimgroup/InvertibleNetworks.jl - authors: SLIM - authors_url: https://slim.gatech.edu + authors: + - name: SLIM + url: https://slim.gatech.edu lang: Julia description: A Flux compatible library implementing invertible neural networks and normalizing flows using memory-efficient backpropagation. Uses manually implemented gradients to take advantage of the invertibility of building blocks, which allows for scaling to large-scale problem sizes. @@ -74,8 +84,9 @@ date_added: 2022-10-19 last_updated: 2022-10-19 url: https://github.com/francois-rozet/zuko - authors: François Rozet - authors_url: https://francois-rozet.github.io + authors: + - name: François Rozet + url: https://francois-rozet.github.io lang: PyTorch description: | Zuko is a Python package that implements normalizing flows in PyTorch. It relies heavily on PyTorch's built-in distributions and transformations, which makes the implementation concise, easy to understand and extend. The API is fully documented with references to the original papers. 
@@ -85,8 +96,9 @@ - title: Jammy Flows date: 2021-01-25 url: https://github.com/thoglu/jammy_flows - authors: Thorsten Glüsenkamp - authors_url: https://github.com/thoglu + authors: + - name: Thorsten Glüsenkamp + url: https://github.com/thoglu lang: PyTorch description: A package that models joint (conditional) PDFs on tensor products of manifolds (Euclidean, sphere, interval, simplex) - like inverse autoregressive flows, but connects manifolds, models conditional PDFs, and allows for arbitrary couplings instead of affine ones. Includes a few SOTA flows like Gaussianization flows. date_added: 2022-10-13 @@ -94,8 +106,9 @@ - title: normflows date: 2020-01-28 url: https://github.com/VincentStimper/normalizing-flows - authors: Vincent Stimper - authors_url: https://github.com/VincentStimper + authors: + - name: Vincent Stimper + url: https://github.com/VincentStimper lang: PyTorch description: The library provides most of the common normalizing flow architectures. It also includes stochastic layers, flows on tori and spheres, and other tools that are particularly useful for applications to the physical sciences. date_added: 2022-12-21 @@ -105,8 +118,9 @@ date_added: 2022-12-05 last_updated: 2023-05-31 url: https://github.com/impICNF/ContinuousNormalizingFlows.jl - authors: Hossein Pourbozorg - authors_url: https://github.com/prbzrg + authors: + - name: Hossein Pourbozorg + url: https://github.com/prbzrg description: Implementations of Infinitesimal Continuous Normalizing Flows Algorithms in Julia. 
lang: Julia docs: https://impicnf.github.io/ContinuousNormalizingFlows.jl @@ -116,8 +130,9 @@ date_added: 2024-06-22 last_updated: 2024-06-22 url: https://github.com/kazewong/flowMC - authors: Kaze Wong - authors_url: https://www.kaze-wong.com/ + authors: + - name: Kaze Wong + url: https://www.kaze-wong.com/ lang: JAX docs: https://flowmc.readthedocs.io/en/main/ description: Normalizing-flow enhanced sampling package for probabilistic inference @@ -127,7 +142,13 @@ date_added: 2024-09-21 last_updated: 2024-09-21 url: https://github.com/gwkokab/gwkokab - authors: Meesum Qazalbash, Muhammad Zeeshan, Richard O'Shaughnessy + authors: + - name: Meesum Qazalbash + url: https://github.com/Qazalbash + - name: Muhammad Zeeshan + url: https://ccrg.rit.edu/user/muhammad.zeeshan + - name: Richard O'Shaughnessy + url: https://ccrgpages.rit.edu/~oshaughn/Richard_OShaughnessy/Home.html lang: JAX docs: https://gwkokab.readthedocs.io description: A JAX-based gravitational-wave population inference toolkit for parametric models diff --git a/data/posts.yml b/data/posts.yml index a4f39b9..54ce721 100644 --- a/data/posts.yml +++ b/data/posts.yml @@ -1,26 +1,30 @@ - title: Normalizing Flows Tutorial date: 2018-01-17 url: https://blog.evjang.com/2018/01/nf1.html - authors: Eric Jang + authors: + - name: Eric Jang description: | [Part 1](https://blog.evjang.com/2018/01/nf1.html): Distributions and Determinants. [Part 2](https://blog.evjang.com/2018/01/nf2.html): Modern Normalizing Flows. Lots of great graphics. - title: Normalizing Flows date: 2018-04-03 url: https://akosiorek.github.io/norm_flows - authors: Adam Kosiorek + authors: + - name: Adam Kosiorek description: Introduction to flows covering change of variables, planar flow, radial flow, RNVP and autoregressive flows like MAF, IAF and Parallel WaveNet. 
- title: Flow-based Deep Generative Models date: 2018-10-13 url: https://lilianweng.github.io/lil-log/2018/10/13/flow-based-deep-generative-models - authors: Lilian Weng + authors: + - name: Lilian Weng description: Covers change of variables, NICE, RNVP, MADE, Glow, MAF, IAF, WaveNet, PixelRNN. - title: Change of Variables for Normalizing Flows date: 2018-10-21 url: https://nealjean.com/ml/change-of-variables - authors: Neal Jean + authors: + - name: Neal Jean description: Short and simple explanation of change of variables theorem i.t.o. probability mass conservation. - title: Chapter on flows from the book 'Deep Learning for Molecules and Materials' @@ -30,5 +34,8 @@ # NF chapter was added 2020-12-06 in commit # https://github.com/whitead/dmol-book/commit/e6d0b3295b73184423ab3a331feba3edbc103c0a # file url: https://github.com/whitead/dmol-book/blob/master/dl/flows.ipynb - authors: Andrew White + authors: + - name: Andrew White + url: https://thewhitelab.org + github: https://github.com/whitead description: A nice introduction starting with the change of variables formula (aka flow equation), going on to cover some common bijectors and finishing with a code example showing how to fit the double-moon distribution with TensorFlow Probability. diff --git a/data/publications.yml b/data/publications.yml index 8d7d427..aa3138d 100644 --- a/data/publications.yml +++ b/data/publications.yml @@ -1,19 +1,29 @@ - title: "Iterative Gaussianization: from ICA to Random Rotations" url: https://arxiv.org/abs/1602.00229 date: 2011-04-01 - authors: Valero Laparra, Gustavo Camps-Valls, Jesús Malo + authors: + - name: Valero Laparra + - name: Gustavo Camps-Valls + - name: Jesús Malo description: Normalizing flows in the form of Gaussianization in an iterative format. Also shows connections to information theory. 
- title: Non-linear Independent Components Estimation url: https://arxiv.org/abs/1410.8516 date: 2014-10-30 - authors: Laurent Dinh, David Krueger, Yoshua Bengio + authors: + - name: Laurent Dinh + - name: David Krueger + - name: Yoshua Bengio description: Introduces the additive coupling layer (NICE) and shows how to use it for image generation and inpainting. - title: Masked Autoencoder for Distribution Estimation url: https://arxiv.org/abs/1502.03509 date: 2015-02-12 - authors: Mathieu Germain, Karol Gregor, Iain Murray, Hugo Larochelle + authors: + - name: Mathieu Germain + - name: Karol Gregor + - name: Iain Murray + - name: Hugo Larochelle description: | Introduces MADE, a feed-forward network that uses carefully constructed binary masks on its weights to control the precise flow of information through the network. The masks ensure that each output unit receives signals only from input units that come before it in some arbitrary order. Yet all outputs can be computed in a single pass. @@ -29,13 +39,18 @@ - title: Variational Inference with Normalizing Flows url: https://arxiv.org/abs/1505.05770 date: 2015-05-21 - authors: Danilo Rezende, Shakir Mohamed + authors: + - name: Danilo Rezende + - name: Shakir Mohamed description: They show how to go beyond mean-field variational inference by using flows to increase the flexibility of the variational family. - title: Density estimation using Real NVP url: https://arxiv.org/abs/1605.08803 date: 2016-05-27 - authors: Laurent Dinh, Jascha Sohl-Dickstein, Samy Bengio + authors: + - name: Laurent Dinh + - name: Jascha Sohl-Dickstein + - name: Samy Bengio description: | They introduce the affine coupling layer (RNVP), a major improvement in terms of flexibility over the additive coupling layer (NICE) with unit Jacobian while keeping a single-pass forward and inverse transformation for fast sampling and density estimation, respectively. 
@@ -49,20 +64,31 @@ - title: Improving Variational Inference with Inverse Autoregressive Flow url: https://arxiv.org/abs/1606.04934 date: 2016-06-15 - authors: Diederik P. Kingma, Tim Salimans, Rafal Jozefowicz, Xi Chen, Ilya Sutskever, Max Welling + authors: + - name: Diederik P. Kingma + - name: Tim Salimans + - name: Rafal Jozefowicz + - name: Xi Chen + - name: Ilya Sutskever + - name: Max Welling description: Introduces inverse autoregressive flow (IAF), a new type of flow which scales well to high-dimensional latent spaces. repo: https://github.com/openai/iaf - title: Multiplicative Normalizing Flows for Variational Bayesian Neural Networks url: https://arxiv.org/abs/1703.01961 date: 2017-03-06 - authors: Christos Louizos, Max Welling + authors: + - name: Christos Louizos + - name: Max Welling description: They introduce a new type of variational Bayesian neural network that uses flows to generate auxiliary random variables which boost the flexibility of the variational family by multiplying the means of a fully-factorized Gaussian posterior over network parameters. This turns the usual diagonal covariance Gaussian into something that allows for multimodality and non-linear dependencies between network parameters. - title: Masked Autoregressive Flow for Density Estimation url: https://arxiv.org/abs/1705.07057 date: 2017-05-19 - authors: George Papamakarios, Theo Pavlakou, Iain Murray + authors: + - name: George Papamakarios + - name: Theo Pavlakou + - name: Iain Murray description: | Introduces MAF, a stack of autoregressive models forming a normalizing flow suitable for fast density estimation but slow at sampling. Analogous to Inverse Autoregressive Flow (IAF) except the forward and inverse passes are exchanged. Generalization of RNVP. @@ -76,338 +102,528 @@ - title: Sylvester Normalizing Flow for Variational Inference url: https://arxiv.org/abs/1803.05649 date: 2018-03-15 - authors: Rianne van den Berg, Leonard Hasenclever, Jakub M. 
Tomczak, Max Welling + authors: + - name: Rianne van den Berg + - name: Leonard Hasenclever + - name: Jakub M. Tomczak + - name: Max Welling description: Introduces Sylvester normalizing flows which remove the single-unit bottleneck from planar flows for increased flexibility in the variational posterior. - title: Neural Autoregressive Flows url: https://arxiv.org/abs/1804.00779 date: 2018-04-03 - authors: Chin-Wei Huang, David Krueger, Alexandre Lacoste, Aaron Courville + authors: + - name: Chin-Wei Huang + - name: David Krueger + - name: Alexandre Lacoste + - name: Aaron Courville description: Unifies and generalizes autoregressive and normalizing flow approaches, replacing the (conditionally) affine univariate transformations of MAF/IAF with a more general class of invertible univariate transformations expressed as monotonic neural networks. Also demonstrates that the proposed neural autoregressive flows (NAF) are universal approximators for continuous probability distributions. repo: https://github.com/CW-Huang/NAF - title: Deep Density Destructors url: https://proceedings.mlr.press/v80/inouye18a.html date: 2018-07-03 - authors: David Inouye, Pradeep Ravikumar + authors: + - name: David Inouye + - name: Pradeep Ravikumar description: Normalizing flows but from an iterative perspective. Features a tree-based density estimator. - title: "Glow: Generative Flow with Invertible 1x1 Convolutions" url: https://arxiv.org/abs/1807.03039 date: 2018-07-09 - authors: Diederik P. Kingma, Prafulla Dhariwal + authors: + - name: Diederik P. Kingma + - name: Prafulla Dhariwal description: They show that flows using invertible 1x1 convolutions achieve high likelihood on standard generative benchmarks and can efficiently synthesize realistic-looking, large images. - title: "FFJORD: Free-form Continuous Dynamics for Scalable Reversible Generative Models" url: https://arxiv.org/abs/1810.01367 date: 2018-10-02 - authors: Will Grathwohl, Ricky T. Q.
Chen, Jesse Bettencourt, Ilya Sutskever, David Duvenaud + authors: + - name: Will Grathwohl + - name: Ricky T. Q. Chen + - name: Jesse Bettencourt + - name: Ilya Sutskever + - name: David Duvenaud description: Uses Neural ODEs as a solver to produce continuous-time normalizing flows (CNF). - title: "FloWaveNet : A Generative Flow for Raw Audio" url: https://arxiv.org/abs/1811.02155 date: 2018-11-06 - authors: Sungwon Kim, Sang-gil Lee, Jongyoon Song, Jaehyeon Kim, Sungroh Yoon + authors: + - name: Sungwon Kim + - name: Sang-gil Lee + - name: Jongyoon Song + - name: Jaehyeon Kim + - name: Sungroh Yoon description: A flow-based generative model for raw audio synthesis. repo: https://github.com/ksw0306/FloWaveNet - title: Block Neural Autoregressive Flow url: https://arxiv.org/abs/1904.04676 date: 2019-04-09 - authors: Nicola De Cao, Ivan Titov, Wilker Aziz + authors: + - name: Nicola De Cao + - name: Ivan Titov + - name: Wilker Aziz description: Introduces B-NAF, a more efficient probability density approximator. Claims to be competitive with other flows across datasets while using orders of magnitude fewer parameters. - title: Integer Discrete Flows and Lossless Compression url: https://arxiv.org/abs/1905.07376 date: 2019-05-17 - authors: Emiel Hoogeboom, Jorn W.T. Peters, Rianne van den Berg, Max Welling + authors: + - name: Emiel Hoogeboom + - name: Jorn W.T. Peters + - name: Rianne van den Berg + - name: Max Welling description: A normalizing flow to be used for ordinal discrete data. They introduce a flexible transformation layer called integer discrete coupling. - title: Graph Normalizing Flows url: https://arxiv.org/abs/1905.13177 date: 2019-05-30 date_added: 2020-05-28 - authors: Jenny Liu, Aviral Kumar, Jimmy Ba, Jamie Kiros, Kevin Swersky + authors: + - name: Jenny Liu + - name: Aviral Kumar + - name: Jimmy Ba + - name: Jamie Kiros + - name: Kevin Swersky description: A new, reversible graph network for prediction and generation.
They perform similarly to message passing neural networks on supervised tasks, but at significantly reduced memory use, allowing them to scale to larger graphs. Combined with a novel graph auto-encoder for unsupervised learning, graph normalizing flows are a generative model for graph structures. - title: Noise Regularization for Conditional Density Estimation url: https://arxiv.org/abs/1907.08982 date: 2019-07-21 - authors: Jonas Rothfuss, Fabio Ferreira, Simon Boehm, Simon Walther, Maxim Ulrich, Tamim Asfour, Andreas Krause + authors: + - name: Jonas Rothfuss + - name: Fabio Ferreira + - name: Simon Boehm + - name: Simon Walther + - name: Maxim Ulrich + - name: Tamim Asfour + - name: Andreas Krause description: Normalizing flows for conditional density estimation. This paper proposes noise regularization to reduce overfitting. [[Blog](https://siboehm.com/articles/19/normalizing-flow-network)] - title: "Normalizing Flows: An Introduction and Review of Current Methods" url: https://arxiv.org/abs/1908.09257 date: 2019-08-25 - authors: Ivan Kobyzev, Simon J.D. Prince, Marcus A. Brubaker + authors: + - name: Ivan Kobyzev + - name: Simon J.D. Prince + - name: Marcus A. Brubaker description: Another very thorough and very readable review article going through the basics of NFs as well as some of the state-of-the-art. Also highly recommended. - title: Neural Spline Flows url: https://arxiv.org/abs/1906.04032 date: 2019-06-10 - authors: Conor Durkan, Artur Bekasov, Iain Murray, George Papamakarios + authors: + - name: Conor Durkan + - name: Artur Bekasov + - name: Iain Murray + - name: George Papamakarios description: Uses monotonic rational splines as a coupling layer. This is currently one of the state-of-the-art methods.
- title: Normalizing Flows for Probabilistic Modeling and Inference url: https://arxiv.org/abs/1912.02762 date: 2019-12-05 - authors: George Papamakarios, Eric Nalisnick, Danilo Jimenez Rezende, Shakir Mohamed, Balaji Lakshminarayanan + authors: + - name: George Papamakarios + - name: Eric Nalisnick + - name: Danilo Jimenez Rezende + - name: Shakir Mohamed + - name: Balaji Lakshminarayanan description: A thorough and very readable review article by some of the DeepMind researchers involved in the development of flows. Highly recommended. - title: Invertible Generative Modeling using Linear Rational Splines url: https://arxiv.org/abs/2001.05168 date: 2020-01-15 - authors: Hadi M. Dolatabadi, Sarah Erfani, Christopher Leckie + authors: + - name: Hadi M. Dolatabadi + - name: Sarah Erfani + - name: Christopher Leckie description: A successor to Neural Spline Flows that features an easy-to-compute inverse. - title: Training Normalizing Flows with the Information Bottleneck for Competitive Generative Classification url: https://arxiv.org/abs/2001.06448 date: 2020-01-17 - authors: Lynton Ardizzone, Radek Mackowiak, Carsten Rother, Ullrich Köthe + authors: + - name: Lynton Ardizzone + - name: Radek Mackowiak + - name: Carsten Rother + - name: Ullrich Köthe description: They introduce a class of conditional normalizing flows with an information bottleneck objective. repo: https://github.com/VLL-HD/exact_information_bottleneck - title: Stochastic Normalizing Flows (SNF) url: https://arxiv.org/abs/2002.06707 date: 2020-02-16 - authors: Hao Wu, Jonas Köhler, Frank Noé + authors: + - name: Hao Wu + - name: Jonas Köhler + - name: Frank Noé description: Introduces SNF, an arbitrary sequence of deterministic invertible functions (the flow) and stochastic processes such as MCMC or Langevin Dynamics.
The aim is to increase expressiveness of the chosen deterministic invertible function, while the trainable flow improves sampling efficiency over pure MCMC. [[Tweet](https://twitter.com/FrankNoeBerlin/status/1229734899034329103)] - title: Stochastic Normalizing Flows url: https://arxiv.org/abs/2002.09547 date: 2020-02-21 - authors: Liam Hodgkinson, Chris van der Heide, Fred Roosta, Michael W. Mahoney + authors: + - name: Liam Hodgkinson + - name: Chris van der Heide + - name: Fred Roosta + - name: Michael W. Mahoney description: "Name clash for a very different technique from the above SNF: an extension of continuous normalizing flows using stochastic differential equations (SDE). Treats Brownian motion in the SDE as a latent variable and approximates it by a flow. Aims to enable efficient training of neural SDEs which can be used for constructing efficient Markov chains." - title: Modeling Continuous Stochastic Processes with Dynamic Normalizing Flows url: https://arxiv.org/abs/2002.10516 date: 2020-02-24 - authors: Ruizhi Deng, Bo Chang, Marcus A. Brubaker, Greg Mori, Andreas Lehrmann + authors: + - name: Ruizhi Deng + - name: Bo Chang + - name: Marcus A. Brubaker + - name: Greg Mori + - name: Andreas Lehrmann description: They propose a normalizing flow using differential deformation of the Wiener process. Applied to time series. [[Tweet](https://twitter.com/r_giaquinto/status/1309648804824723464)] - title: Gradient Boosted Normalizing Flows url: https://arxiv.org/abs/2002.11896 date: 2020-02-27 - authors: Robert Giaquinto, Arindam Banerjee + authors: + - name: Robert Giaquinto + - name: Arindam Banerjee description: Augments traditional normalizing flows with gradient boosting. They show that training multiple models can achieve good results and it's not necessary to have more complex distributions.
repo: https://github.com/robert-giaquinto/gradient-boosted-normalizing-flows - title: Gaussianization Flows url: https://arxiv.org/abs/2003.01941 date: 2020-03-04 - authors: Chenlin Meng, Yang Song, Jiaming Song, Stefano Ermon + authors: + - name: Chenlin Meng + - name: Yang Song + - name: Jiaming Song + - name: Stefano Ermon description: Uses a repeated composition of trainable kernel layers and orthogonal transformations. Very competitive versus some of the SOTA like Real-NVP, Glow and FFJORD. repo: https://github.com/chenlin9/Gaussianization_Flows - title: Flows for simultaneous manifold learning and density estimation url: https://arxiv.org/abs/2003.13913 date: 2020-03-31 - authors: Johann Brehmer, Kyle Cranmer + authors: + - name: Johann Brehmer + - name: Kyle Cranmer description: Normalizing flows that learn the data manifold and probability density function on that manifold. [[Tweet](https://twitter.com/kylecranmer/status/1250129080395223040)] repo: https://github.com/johannbrehmer/manifold-flow - title: Normalizing Flows with Multi-Scale Autoregressive Priors url: https://arxiv.org/abs/2004.03891 date: 2020-04-08 - authors: Shweta Mahajan, Apratim Bhattacharyya, Mario Fritz, Bernt Schiele, Stefan Roth + authors: + - name: Shweta Mahajan + - name: Apratim Bhattacharyya + - name: Mario Fritz + - name: Bernt Schiele + - name: Stefan Roth description: Improves the representational power of flow-based models by introducing channel-wise dependencies in their latent space through multi-scale autoregressive priors (mAR). repo: https://github.com/visinf/mar-scf - title: "Equivariant Flows: exact likelihood generative learning for symmetric densities" url: https://arxiv.org/abs/2006.02425 date: 2020-06-03 - authors: Jonas Köhler, Leon Klein, Frank Noé + authors: + - name: Jonas Köhler + - name: Leon Klein + - name: Frank Noé description: Shows that distributions generated by equivariant NFs faithfully reproduce symmetries in the underlying density. 
Proposes building blocks for flows which preserve typical symmetries in physical/chemical many-body systems. Shows that symmetry-preserving flows can provide better generalization and sampling efficiency. - title: Why Normalizing Flows Fail to Detect Out-of-Distribution Data url: https://arxiv.org/abs/2006.08545 date: 2020-06-15 - authors: Polina Kirichenko, Pavel Izmailov, Andrew Gordon Wilson + authors: + - name: Polina Kirichenko + - name: Pavel Izmailov + - name: Andrew Gordon Wilson description: Studies how traditional normalizing flow models can fail to detect out-of-distribution data. They offer a solution to combat this issue by modifying the coupling layers. [[Tweet](https://twitter.com/polkirichenko/status/1272715634544119809)] repo: https://github.com/PolinaKirichenko/flows_ood - title: "SurVAE Flows: Surjections to Bridge the Gap between VAEs and Flows" url: https://arxiv.org/abs/2007.02731 date: 2020-07-06 - authors: Didrik Nielsen, Priyank Jaini, Emiel Hoogeboom, Ole Winther, Max Welling + authors: + - name: Didrik Nielsen + - name: Priyank Jaini + - name: Emiel Hoogeboom + - name: Ole Winther + - name: Max Welling description: They present a generalized framework that encompasses both Flows (deterministic maps) and VAEs (stochastic maps). By seeing deterministic maps `x = f(z)` as limiting cases of stochastic maps `x ~ p(x|z)`, the ELBO is reinterpreted as a change of variables formula for the stochastic maps. Moreover, they present a few examples of surjective layers using stochastic maps, which can be composed together with flow layers. [[Video](https://youtu.be/bXp8fk4MRXQ)] repo: https://github.com/didriknielsen/survae_flows - title: "AdvFlow: Inconspicuous Black-box Adversarial Attacks using Normalizing Flows" url: https://arxiv.org/abs/2007.07435 date: 2020-07-15 - authors: Hadi M. Dolatabadi, Sarah Erfani, Christopher Leckie + authors: + - name: Hadi M.
Dolatabadi + - name: Sarah Erfani + - name: Christopher Leckie description: A black-box adversarial attack method on image classifiers that uses normalizing flows. repo: https://github.com/hmdolatabadi/AdvFlow - title: Haar Wavelet based Block Autoregressive Flows for Trajectories url: https://arxiv.org/abs/2009.09878 date: 2020-09-21 - authors: Apratim Bhattacharyya, Christoph-Nikolas Straehle, Mario Fritz, Bernt Schiele + authors: + - name: Apratim Bhattacharyya + - name: Christoph-Nikolas Straehle + - name: Mario Fritz + - name: Bernt Schiele description: Introduces a Haar wavelet-based block autoregressive model. - title: E(n) Equivariant Normalizing Flows url: https://arxiv.org/abs/2105.09016 date: 2022-01-14 - authors: Victor Garcia Satorras, Emiel Hoogeboom, Fabian B. Fuchs, Ingmar Posner, Max Welling + authors: + - name: Victor Garcia Satorras + - name: Emiel Hoogeboom + - name: Fabian B. Fuchs + - name: Ingmar Posner + - name: Max Welling description: Introduces equivariant graph neural networks into the normalizing flow framework which combine to give invertible equivariant functions. Demonstrates their flow beats prior equivariant models and allows sampling of molecular configurations with positions, atom types and charges. - title: Convolutional Normalizing Flows url: https://arxiv.org/abs/1711.02255 date: 2017-11-17 - authors: Guoqing Zheng, Yiming Yang, Jaime Carbonell + authors: + - name: Guoqing Zheng + - name: Yiming Yang + - name: Jaime Carbonell description: Introduces normalizing flows that take advantage of convolutions (based on convolution over the dimensions of the random input vector) to improve the posterior in the variational inference framework. This also reduces the number of parameters due to the convolutions.
- title: Emerging Convolutions for Generative Normalizing Flows url: https://arxiv.org/abs/1901.11137 date: 2019-01-30 - authors: Emiel Hoogeboom, Rianne van den Berg, Max Welling + authors: + - name: Emiel Hoogeboom + - name: Rianne van den Berg + - name: Max Welling description: Introduces autoregressive-like convolutional layers that operate on the channel **and** spatial axes. This improves performance on image datasets compared to standard 1x1 convolutions. The trade-off is that the inverse operator is quite expensive; however, the authors provide a fast C++ implementation. repo: https://github.com/ehoogeboom/emerging - title: Fast Flow Reconstruction via Robust Invertible n x n Convolution url: https://arxiv.org/abs/1905.10170 date: 2019-05-24 - authors: Thanh-Dat Truong, Khoa Luu, Chi Nhan Duong, Ngan Le, Minh-Triet Tran + authors: + - name: Thanh-Dat Truong + - name: Khoa Luu + - name: Chi Nhan Duong + - name: Ngan Le + - name: Minh-Triet Tran description: Seeks to overcome the limitation of 1x1 convolutions and proposes invertible nxn convolutions via a clever convolutional _affine_ function. - title: "MaCow: Masked Convolutional Generative Flow" url: https://arxiv.org/abs/1902.04208 date: 2019-02-19 - authors: Xuezhe Ma, Xiang Kong, Shanghang Zhang, Eduard Hovy + authors: + - name: Xuezhe Ma + - name: Xiang Kong + - name: Shanghang Zhang + - name: Eduard Hovy description: Introduces a masked convolutional generative flow (MaCow) layer using a small kernel to capture local connectivity. They showed some improvement over the GLOW model while being fast and stable.
- title: "iUNets: Fully invertible U-Nets with Learnable Upand Downsampling" url: https://arxiv.org/abs/2005.05220 date: 2020-05-11 - authors: Christian Etmann, Rihuan Ke, Carola-Bibiane Schönlieb + authors: + - name: Christian Etmann + - name: Rihuan Ke + - name: Carola-Bibiane Schönlieb description: Extends the classical UNet to be fully invertible by enabling invertible, orthogonal upsampling and downsampling layers. It is rather efficient so it should be able to enable stable training of deeper and larger networks. - title: The Convolution Exponential and Generalized Sylvester Flows url: https://arxiv.org/abs/2006.01910 date: 2020-06-02 - authors: Emiel Hoogeboom, Victor Garcia Satorras, Jakub M. Tomczak, Max Welling + authors: + - name: Emiel Hoogeboom + - name: Victor Garcia Satorras + - name: Jakub M. Tomczak + - name: Max Welling description: Introduces exponential convolution to add the spatial dependencies in linear layers as an improvement of the 1x1 convolutions. It uses matrix exponentials to create cheap and invertible layers. They also use this new architecture to create _convolutional Sylvester flows_ and _graph convolutional exponentials_. repo: https://github.com/ehoogeboom/convolution_exponential_and_sylvester - title: "CInC Flow: Characterizable Invertible 3x3 Convolution" url: https://arxiv.org/abs/2107.01358 date: 2021-07-03 - authors: Sandeep Nagar, Marius Dufraisse, Girish Varma + authors: + - name: Sandeep Nagar + - name: Marius Dufraisse + - name: Girish Varma description: Seeks to improve expensive convolutions. They investigate the conditions for when 3x3 convolutions are invertible under which conditions (e.g. padding) and saw successful speedups. Furthermore, they developed a more expressive, invertible _Quad coupling_ layer. repo: https://github.com/Naagar/Normalizing_Flow_3x3_inv - title: Orthogonalizing Convolutional Layers with the Cayley Transform url: https://arxiv.org/abs/2104.07167 date: 2021-04-14 - authors: Asher Trockman, J. 
Zico Kolter + authors: + - name: Asher Trockman + - name: J. Zico Kolter description: Parametrizes the multichannel convolution to be orthogonal via the Cayley transform (skew-symmetric convolutions in the Fourier domain). This enables the inverse to be computed efficiently. repo: https://github.com/locuslab/orthogonal-convolutions - title: Improving Normalizing Flows via Better Orthogonal Parameterizations url: https://invertibleworkshop.github.io/INNF_2019/accepted_papers/pdfs/INNF_2019_paper_30.pdf date: 2021-04-14 - authors: Adam Goliński, Mario Lezcano-Casado, Tom Rainforth + authors: + - name: Adam Goliński + - name: Mario Lezcano-Casado + - name: Tom Rainforth description: Parametrizes the 1x1 convolution via the exponential map and the Cayley map. They demonstrate an improved optimization for the Sylvester normalizing flows. - title: Invertible Convolutional Flow url: https://proceedings.neurips.cc/paper/2019/hash/b1f62fa99de9f27a048344d55c5ef7a6-Abstract.html date: 2019-06-15 - authors: Mahdi Karami, Dale Schuurmans, Jascha Sohl-Dickstein, Laurent Dinh, Daniel Duckworth + authors: + - name: Mahdi Karami + - name: Dale Schuurmans + - name: Jascha Sohl-Dickstein + - name: Laurent Dinh + - name: Daniel Duckworth description: Introduces convolutional layers that are circular and symmetric. The layer is invertible and cheap to evaluate. They also showcase how one can design non-linear elementwise bijectors that induce special properties via constraining the loss function. 
repo: https://github.com/Karami-m/Invertible-Convolutional-Flow - title: Invertible Convolutional Networks url: https://invertibleworkshop.github.io/INNF_2019/accepted_papers/pdfs/INNF_2019_paper_26.pdf date: 2019-06-15 - authors: Marc Finzi, Pavel Izmailov, Wesley Maddox, Polina Kirichenko, Andrew Gordon Wilson + authors: + - name: Marc Finzi + - name: Pavel Izmailov + - name: Wesley Maddox + - name: Polina Kirichenko + - name: Andrew Gordon Wilson description: Showcases how standard convolutional layers can be made invertible via Fourier transformations. They also introduce activations that might be better suited to normalizing flows, e.g. SneakyReLU. - title: "MintNet: Building Invertible Neural Networks with Masked Convolutions" url: https://arxiv.org/abs/1907.07945 date: 2019-07-18 - authors: Yang Song, Chenlin Meng, Stefano Ermon + authors: + - name: Yang Song + - name: Chenlin Meng + - name: Stefano Ermon description: Creates an autoregressive-like coupling layer via masked convolutions which is fast and efficient to evaluate. repo: https://github.com/ermongroup/mintnet - title: Densely connected normalizing flows url: https://arxiv.org/abs/2106.04627 date: 2019-07-18 - authors: Matej Grcić, Ivan Grubišić, Siniša Šegvić + authors: + - name: Matej Grcić + - name: Ivan Grubišić + - name: Siniša Šegvić description: Creates a nested coupling structure to add more expressivity to standard coupling layers. They also utilize slicing/factorization for dimensionality reduction and Nystromer for the coupling layer conditioning network. They achieved SOTA results for normalizing flow models.
repo: https://github.com/matejgrcic/DenseFlow - title: Multi-scale Attention Flow for Probabilistic Time Series Forecasting url: https://arxiv.org/abs/2205.07493 date: 2022-05-16 - authors: Shibo Feng, Ke Xu, Jiaxiang Wu, Pengcheng Wu, Fan Lin, Peilin Zhao + authors: + - name: Shibo Feng + - name: Ke Xu + - name: Jiaxiang Wu + - name: Pengcheng Wu + - name: Fan Lin + - name: Peilin Zhao description: Proposes a novel non-autoregressive deep learning model, called Multi-scale Attention Normalizing Flow (MANF), which integrates multi-scale attention and relative position information; the multivariate data distribution is represented by a conditioned normalizing flow. repo: null - title: Multivariate Probabilistic Time Series Forecasting via Conditioned Normalizing Flows url: https://arxiv.org/abs/2002.06103 date: 2020-09-28 - authors: Kashif Rasul, Abdul-Saboor Sheikh, Ingmar Schuster, Urs M Bergmann, Roland Vollgraf + authors: + - name: Kashif Rasul + - name: Abdul-Saboor Sheikh + - name: Ingmar Schuster + - name: Urs M Bergmann + - name: Roland Vollgraf description: Models the multi-variate temporal dynamics of time series via an autoregressive deep learning model, where the data distribution is represented by a conditioned normalizing flow. [[OpenReview.net](https://openreview.net/forum?id=WiGQBFuVRv)] repo: https://github.com/zalandoresearch/pytorch-ts - title: "ManiFlow: Implicitly Representing Manifolds with Normalizing Flows" url: https://arxiv.org/abs/2208.08932 date: 2022-08-18 - authors: Janis Postels, Martin Danelljan, Luc Van Gool, Federico Tombari + authors: + - name: Janis Postels + - name: Martin Danelljan + - name: Luc Van Gool + - name: Federico Tombari description: The invertibility constraint of NFs imposes limitations on data distributions that reside on lower dimensional manifolds embedded in higher dimensional space. This is often bypassed by adding noise to the data which impacts generated sample quality.
This work generates samples from the original data distribution given full knowledge of perturbed distribution and noise model. They establish NFs trained on perturbed data implicitly represent the manifold in regions of maximum likelihood, then propose an optimization objective that recovers the most likely point on the manifold given a sample from the perturbed distribution. - title: Unconstrained Monotonic Neural Networks url: https://arxiv.org/abs/1908.05164 date: 2019-09-14 - authors: Antoine Wehenkel, Gilles Louppe + authors: + - name: Antoine Wehenkel + - name: Gilles Louppe description: UMNN relaxes the constraints on weights and activation functions of monotonic neural networks by setting the derivative of the transformation as the output of an unconstrained neural network. The transformation itself is computed by numerical integration (Clenshaw-Curtis quadrature) of the derivative. repo: https://github.com/AWehenkel/UMNN - title: Graphical Normalizing Flows url: https://arxiv.org/abs/2006.02548 date: 2022-06-03 - authors: Antoine Wehenkel, Gilles Louppe + authors: + - name: Antoine Wehenkel + - name: Gilles Louppe description: This work revisits coupling and autoregressive transformations as probabilistic graphical models showing they reduce to Bayesian networks with a pre-defined topology. From this new perspective, the authors propose the graphical normalizing flow, a new invertible transformation with either a prescribed or a learnable graphical structure. This model provides a promising way to inject domain knowledge into normalizing flows while preserving both the interpretability of Bayesian networks and the representation capacity of normalizing flows. 
repo: https://github.com/AWehenkel/Graphical-Normalizing-Flows - title: Block Neural Autoregressive Flow url: https://arxiv.org/abs/1904.04676 date: 2019-04-09 - authors: Antoine Wehenkel, Gilles Louppe + authors: + - name: Antoine Wehenkel + - name: Gilles Louppe description: As an alternative to hand-crafted bijections, Huang et al. (2018) proposed NAF, a universal approximator for density functions. Their flow is a neural net whose parameters are predicted by another NN. The latter grows quadratically with the size of the former which is inefficient. We propose block neural autoregressive flow (B-NAF), a much more compact universal approximator of density functions, where we model a bijection directly using a single feed-forward network. Invertibility is ensured by carefully designing affine transformations with block matrices that make the flow autoregressive and monotone. We compare B-NAF to NAF and show our flow is competitive across datasets while using orders of magnitude fewer parameters. repo: https://github.com/nicola-decao/BNAF - title: "FInC Flow: Fast and Invertible k×k Convolutions for Normalizing Flows" url: https://arxiv.org/abs/2301.09266 date: 2023-01-03 - authors: Aditya Kallapa, Sandeep Nagar, Girish Varma + authors: + - name: Aditya Kallapa + - name: Sandeep Nagar + - name: Girish Varma description: Proposes a k×k convolutional layer and Deep Normalizing Flow architecture which (i) has a fast parallel inversion algorithm with running time O(nk^2) (n is the height and width of the input image and k is the kernel size), (ii) masks the minimal amount of learnable parameters in a layer, and (iii) gives better forward pass and sampling times, comparable to other k×k convolution-based models on real-world benchmarks. We provide an implementation of the proposed parallel algorithm for sampling using our invertible convolutions on GPUs.
repo: https://github.com/aditya-v-kallappa/FInCFlow - title: Invertible Monotone Operators for Normalizing Flows url: https://arxiv.org/abs/2210.08176 date: 2022-10-15 - authors: Byeongkeun Ahn, Chiyoon Kim, Youngjoon Hong, Hyunwoo J. Kim + authors: + - name: Byeongkeun Ahn + - name: Chiyoon Kim + - name: Youngjoon Hong + - name: Hyunwoo J. Kim description: This work proposes the monotone formulation to overcome the issue of the Lipschitz constants in previous ResNet-based normalizing flows using monotone operators and provides an in-depth theoretical analysis. Furthermore, this work constructs an activation function called Concatenated Pila (CPila) to improve gradient flow. The resulting model, Monotone Flows, exhibits an excellent performance on multiple density estimation benchmarks (MNIST, CIFAR-10, ImageNet32, ImageNet64). repo: https://github.com/mlvlab/MonotoneFlows - title: Efficient Bayesian Sampling Using Normalizing Flows to Assist Markov Chain Monte Carlo Methods url: https://arxiv.org/abs/2107.08001 date: 2021-07-16 - authors: Marylou Gabrié, Grant M. Rotskoff, Eric Vanden-Eijnden + authors: + - name: Marylou Gabrié + - name: Grant M. Rotskoff + - name: Eric Vanden-Eijnden description: Normalizing flows have potential in Bayesian statistics as a complementary or alternative method to MCMC for sampling posteriors. However, their training via reverse KL divergence may be inadequate for complex posteriors. This research proposes a new training approach utilizing direct KL divergence, which involves augmenting a local MCMC algorithm with a normalizing flow to enhance mixing rate and utilizing the resulting samples to train the flow. This method requires minimal prior knowledge of the posterior and can be applied for model validation and evidence estimation, offering a promising strategy for efficient posterior sampling. 
- title: Adaptive Monte Carlo augmented with normalizing flows url: https://doi.org/10.1073/pnas.2109420119 date: 2022-03-02 - authors: Marylou Gabrié, Grant M. Rotskoff, Eric Vanden-Eijnden + authors: + - name: Marylou Gabrié + - name: Grant M. Rotskoff + - name: Eric Vanden-Eijnden description: Markov Chain Monte Carlo (MCMC) algorithms struggle with sampling from high-dimensional, multimodal distributions, requiring extensive computational effort or specialized importance sampling strategies. To address this, an adaptive MCMC approach is proposed, combining local updates with nonlocal transitions via normalizing flows. This method blends standard transition kernels with generative model moves, adapting the generative model using generated data to improve sampling efficiency. Theoretical analysis and numerical experiments demonstrate the algorithm's ability to equilibrate quickly between metastable modes, sampling effectively across large free energy barriers and achieving significant accelerations over traditional MCMC methods. repo: https://zenodo.org/records/4783701#.Yfv53urMJD8 - title: Transferable Boltzmann Generators url: https://arxiv.org/abs/2406.14426 date: 2024-06-20 - authors: Leon Klein, Frank Noé + authors: + - name: Leon Klein + - name: Frank Noé description: Boltzmann Generators, a machine learning method, generate equilibrium samples of molecular systems by learning a transformation from a simple prior distribution to the target Boltzmann distribution via normalizing flows. Recently, flow matching has been used to train Boltzmann Generators for small systems in Cartesian coordinates. This work extends this approach by proposing a framework for transferable Boltzmann Generators that can predict Boltzmann distributions for unseen molecules without retraining. This allows for approximate sampling and efficient reweighting to the target distribution. 
The framework is tested on dipeptides, demonstrating efficient generalization to new systems and improved efficiency compared to single-system training. repo: https://osf.io/n8vz3/?view_only=1052300a21bd43c08f700016728aa96e diff --git a/data/repos.yml b/data/repos.yml index b59938d..5f2df07 100644 --- a/data/repos.yml +++ b/data/repos.yml @@ -1,29 +1,33 @@ - title: pytorch-flows date: 2018-09-01 url: https://github.com/ikostrikov/pytorch-flows - authors: Ilya Kostrikov + authors: + - name: Ilya Kostrikov lang: PyTorch description: "PyTorch implementations of density estimation algorithms: MAF, RNVP, Glow." - title: normalizing_flows date: 2018-12-30 url: https://github.com/kamenbliznashki/normalizing_flows - authors: Kamen Bliznashki + authors: + - name: Kamen Bliznashki lang: PyTorch description: "PyTorch implementations of density estimation algorithms: BNAF, Glow, MAF, RealNVP, planar flows." - title: pytorch_flows date: 2019-02-06 url: https://github.com/acids-ircam/pytorch_flows - authors: acids-ircam - authors_url: https://github.com/acids-ircam + authors: + - name: acids-ircam + url: https://github.com/acids-ircam lang: PyTorch description: A great repo with some basic PyTorch implementations of normalizing flows from scratch. - title: pytorch-normalizing-flows date: 2019-12-09 url: https://github.com/karpathy/pytorch-normalizing-flows - authors: Andrej Karpathy + authors: + - name: Andrej Karpathy lang: PyTorch description: "A Jupyter notebook with PyTorch implementations of the most commonly used flows: NICE, RNVP, MAF, Glow, NSF." @@ -31,67 +35,76 @@ date: 2020-07-03 url: https://git.io/JiWaG # Jupyter notebook 1 # Jupyter notebook 2: https://git.io/JiW2H - authors: torchdyn - authors_url: https://torchdyn.readthedocs.io + authors: + - name: torchdyn + url: https://torchdyn.readthedocs.io lang: PyTorch description: Example of how to use FFJORD as a continuous normalizing flow (CNF).
Based on the PyTorch suite `torchdyn` which offers continuous neural architectures. - title: Normalizing Flows - Introduction (Part 1) date: 2020-07-19 url: https://pyro.ai/examples/normalizing_flows_i - authors: pyro.ai - authors_url: https://pyro.ai + authors: + - name: pyro.ai + url: https://pyro.ai lang: PyTorch description: A tutorial on using the `pyro-ppl` library (based on PyTorch) for normalizing flows. They provide some SOTA methods including NSF and MAF. [Parts 2 and 3 coming later](https://github.com/pyro-ppl/pyro/issues/1992). - title: "NICE: Non-linear Independent Components Estimation" date: 2021-08-21 url: https://github.com/MaximeVandegar/Papers-in-100-Lines-of-Code/tree/main/NICE_Non_linear_Independent_Components_Estimation - authors: Maxime Vandegar + authors: + - name: Maxime Vandegar lang: PyTorch description: PyTorch implementation that reproduces results from the paper NICE in about 100 lines of code. - title: Neural Transport date: 2020-06-12 url: https://pyro.ai/numpyro/examples/neutra - authors: numpyro - authors_url: https://num.pyro.ai + authors: + - name: numpyro + url: https://num.pyro.ai lang: JAX description: Features an example of how normalizing flows can be used to get more robust posteriors from Monte Carlo methods. Uses the `numpyro` library which is a PPL with JAX as the backend. The NF implementations include the basic ones like IAF and BNAF. - title: Variational Inference using Normalizing Flows (VINF) date: 2020-11-02 url: https://github.com/pierresegonne/VINF - authors: Pierre Segonne + authors: + - name: Pierre Segonne lang: TensorFlow description: This repository provides a hands-on TensorFlow implementation of Normalizing Flows as presented in the [paper](https://arxiv.org/pdf/1505.05770.pdf) introducing the concept (D. Rezende & S. Mohamed).
- title: BERT-flow date: 2019-07-19 url: https://github.com/bohanli/BERT-flow - authors: Bohan Li + authors: + - name: Bohan Li lang: TensorFlow description: TensorFlow implementation of "On the Sentence Embeddings from Pre-trained Language Models" (EMNLP 2020). - title: NormFlows date: 2017-03-21 url: https://github.com/andymiller/NormFlows - authors: Andy Miller + authors: + - name: Andy Miller lang: Other description: Simple didactic example using [`autograd`](https://github.com/HIPS/autograd), so pretty low-level. - title: Normalizing Flows Overview date: 2017-07-11 url: https://www.pymc.io/projects/examples/en/2022.12.0/variational_inference/normalizing_flows_overview.html - authors: PyMC3 + authors: + - name: PyMC3 lang: Other description: A very helpful notebook showcasing how to work with flows in practice and comparing it to PyMC3's NUTS-based HMC kernel. Based on [Theano](https://github.com/Theano/Theano). - title: Destructive Deep Learning (ddl) date: 2018-06-11 url: https://github.com/davidinouye/destructive-deep-learning - authors: David Inouye - authors_url: https://davidinouye.com + authors: + - name: David Inouye + url: https://davidinouye.com lang: Other description: | Code base for the paper [Deep Density Destructors](https://proceedings.mlr.press/v80/inouye18a.html) by Inouye & Ravikumar (2018). An entire suite of iterative methods including tree-based as well as Gaussianization methods which are similar to normalizing flows except they converge iteratively instead of fully parametrized. That is, they still use bijective transforms, compute the Jacobian, check the likelihood and you can still sample and get probability density estimates. 
The only difference is you repeat the following two steps until convergence: @@ -104,15 +117,17 @@ - title: Unconstrained Monotonic Neural Networks (UMNN) date: 2019-09-19 url: https://github.com/AWehenkel/UMNN - authors: Antoine Wehenkel + authors: + - name: Antoine Wehenkel lang: PyTorch description: Official implementation of "Unconstrained Monotonic Neural Networks" and the experiments presented in the paper. - title: Graphical Normalizing Flows date: 2020-02-04 url: https://github.com/AWehenkel/Graphical-Normalizing-Flows - authors: Antoine Wehenkel - authors_url: https://awehenkel.github.io + authors: + - name: Antoine Wehenkel + url: https://awehenkel.github.io lang: PyTorch description: Official implementation of "Graphical Normalizing Flows" and the experiments presented in the paper. @@ -121,8 +136,9 @@ date_added: 2022-11-10 last_updated: 2022-11-10 url: https://github.com/deeprob-org/deeprob-kit - authors: Lorenzo Loconte - authors_url: https://github.com/loreloc + authors: + - name: Lorenzo Loconte + url: https://github.com/loreloc lang: PyTorch description: | A general-purpose Python library providing a collection of deep probabilistic models (DPMs) which are easy to use and extend. @@ -134,8 +150,9 @@ last_updated: 2022-11-11 url: https://github.com/RameenAbdal/StyleFlow docs: https://rameenabdal.github.io/StyleFlow - authors: Rameen Abdal - authors_url: https://twitter.com/AbdalRameen + authors: + - name: Rameen Abdal + url: https://twitter.com/AbdalRameen lang: PyTorch description: | Attribute-conditioned Exploration of StyleGAN-generated Images using Conditional Continuous Normalizing Flows. 
@@ -145,8 +162,9 @@
   date_added: 2022-11-14
   last_updated: 2022-11-14
   url: https://github.com/LukasRinder/normalizing-flows
-  authors: Lukas Rinder
-  authors_url: https://github.com/LukasRinder
+  authors:
+    - name: Lukas Rinder
+      url: https://github.com/LukasRinder
   lang: TensorFlow
   description: |
     Implementation of normalizing flows (Planar Flow, Radial Flow, Real NVP, Masked Autoregressive Flow (MAF), Inverse Autoregressive Flow (IAF), Neural Spline Flow) in TensorFlow 2 including a small tutorial.
diff --git a/data/videos.yml b/data/videos.yml
index 94a7570..50d0987 100644
--- a/data/videos.yml
+++ b/data/videos.yml
@@ -1,56 +1,64 @@
 - title: Sylvester Normalizing Flow for Variational Inference
   url: https://youtu.be/VeYyUcIDVHI
   date: 2018-10-04
-  authors: Rianne van den Berg
+  authors:
+    - name: Rianne van den Berg
   description: Introduces Sylvester normalizing flows which remove the single-unit bottleneck from planar flows for increased flexibility in the variational posterior.

 - title: Graph Normalizing Flows
   url: https://youtu.be/frMPP30QQgY
   date: 2019-09-24
   date_added: 2020-05-28
-  authors: Jenny Liu
+  authors:
+    - name: Jenny Liu
   description: Introduces a new graph generating model for use e.g. in drug discovery, where training on molecules that are known to bind/dissolve/etc. may help to generate novel, similarly effective molecules.

 - title: A primer on normalizing flows
   url: https://youtu.be/P4Ta-TZPVi0
   date: 2019-10-09
   date_added: 2020-05-28
-  authors: Laurent Dinh
-  authors_url: https://laurent-dinh.github.io
+  authors:
+    - name: Laurent Dinh
+      url: https://laurent-dinh.github.io
   description: The first author on both the NICE and RNVP papers and one of the first in this field gives an introductory talk at "Machine Learning for Physics and the Physics Learning of, 2019".

 - title: What are normalizing flows?
   url: https://youtu.be/i7LjDvsLWCg
   date: 2019-12-06
   date_added: 2020-05-28
-  authors: Ari Seff
-  authors_url: https://scholar.google.com/citations?user=IxBGctYAAAAJ
+  authors:
+    - name: Ari Seff
+      url: https://scholar.google.com/citations?user=IxBGctYAAAAJ
   description: A great 3blue1brown-style video explaining the basics of normalizing flows.

 - title: Flow Models
   url: https://youtu.be/JBb5sSC0JoY
   date: 2020-02-06
-  authors: Pieter Abbeel
-  authors_url: https://sites.google.com/view/berkeley-cs294-158-sp20/home
+  authors:
+    - name: Pieter Abbeel
+      url: https://sites.google.com/view/berkeley-cs294-158-sp20/home
   description: A really thorough explanation of normalizing flows. Also includes some sample code.

 - title: Introduction to Normalizing Flows
   url: https://youtu.be/u3vVyFVU_lI
   date: 2020-11-23
-  authors: Marcus Brubaker
-  authors_url: https://mbrubake.github.io
+  authors:
+    - name: Marcus Brubaker
+      url: https://mbrubake.github.io
   description: A great introduction to normalizing flows by one of the creators of [Stan](https://mc-stan.org) presented at ECCV 2020. The tutorial also provides an excellent review of various practical implementations.

 - title: Normalizing Flows - Motivations, The Big Idea & Essential Foundations
   url: https://youtu.be/IuXU2dBOJyw
   date: 2021-01-16
-  authors: Kapil Sachdeva
-  authors_url: https://github.com/ksachdeva
+  authors:
+    - name: Kapil Sachdeva
+      url: https://github.com/ksachdeva
   description: A comprehensive tutorial on flows explaining the challenges addressed by this class of algorithm. Provides intuition on how to address those challenges, and explains the underlying mathematics using a simple step by step approach.
 - title: Normalizing Flows
   url: https://youtu.be/7TOvhz93G9o
   date: 2020-12-07
-  authors: Marc Deisenroth
-  authors_url: https://mml-book.github.io/slopes-expectations.html
+  authors:
+    - name: Marc Deisenroth
+      url: https://mml-book.github.io/slopes-expectations.html
   description: 'Part of a NeurIPS 2020 tutorial series titled "There and Back Again: A Tale of Slopes and Expectations". Link to [full series](https://youtube.com/playlist?list=PL93aLKqThq4h7UpgeNhkOtEeCnX3DMseS).'
diff --git a/readme.md b/readme.md
index 23c234c..9752f37 100644
--- a/readme.md
+++ b/readme.md
@@ -359,7 +359,7 @@ Zuko is used in [LAMPE](https://github.com/francois-rozet/lampe) to enable Likel
 ### JAX

 JAX Packages

-1. 2024-07-05 - [GWKokab](https://github.com/gwkokab/gwkokab) by Meesum Qazalbash, Muhammad Zeeshan et al.
+1. 2024-07-05 - [GWKokab](https://github.com/gwkokab/gwkokab) by [Meesum Qazalbash](https://github.com/Qazalbash), [Muhammad Zeeshan](https://ccrg.rit.edu/user/muhammad.zeeshan) et al.
    GitHub repo stars
    A JAX-based gravitational-wave population inference toolkit for parametric models [[Docs](https://gwkokab.readthedocs.io)]

@@ -515,7 +515,7 @@ Table 1 in the paper has a good comparison with traditional NFs.

 ## 🌐 Blog Posts (5)

-1. 2020-08-19 - [Chapter on flows from the book 'Deep Learning for Molecules and Materials'](https://dmol.pub/dl/flows) by Andrew White
+1. 2020-08-19 - [Chapter on flows from the book 'Deep Learning for Molecules and Materials'](https://dmol.pub/dl/flows) by [Andrew White](https://thewhitelab.org)
    A nice introduction starting with the change of variables formula (aka flow equation), going on to cover some common bijectors and finishing with a code example showing how to fit the double-moon distribution with TensorFlow Probability.

 1. 2018-10-21 - [Change of Variables for Normalizing Flows](https://nealjean.com/ml/change-of-variables) by Neal Jean
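The schema change driving all of these hunks — `authors` as a list of `{name, url}` dicts rather than a comma-joined string — pairs with the PR's `Author(TypedDict)` and the "raise ValueError on non-https author URLs" fix in `make_readme.py`. A minimal sketch of how that validation and the readme's `by [Name](url)` rendering might look (illustrative names; the actual implementation in `make_readme.py` may differ):

```python
from typing import TypedDict


class Author(TypedDict, total=False):
    name: str  # required in practice
    url: str  # optional homepage; must be https per this PR


def validate_authors(authors: list[Author]) -> None:
    """Raise ValueError on malformed author entries."""
    for author in authors:
        if not author.get("name"):
            raise ValueError(f"author entry missing 'name': {author}")
        url = author.get("url")
        if url is not None and not url.startswith("https://"):
            raise ValueError(f"non-https author URL: {url}")


def format_authors(authors: list[Author]) -> str:
    """Render the 'by ...' credit line used in readme.md list items."""
    parts = [
        f"[{a['name']}]({a['url']})" if "url" in a else a["name"]
        for a in authors
    ]
    return ", ".join(parts)
```

With this shape, entries that only have a `name` (e.g. most paper authors above) render as plain text, while entries with a `url` become markdown links, matching the GWKokab readme change in this diff.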