Conference Paper (published)
Details
Citation
Crescimanna V & Graham B (2020) The Variational InfoMax AutoEncoder. In: 2020 International Joint Conference on Neural Networks (IJCNN 2020), Glasgow, UK, 19.07.2020-24.07.2020. Piscataway, NJ: IEEE. https://doi.org/10.1109/IJCNN48605.2020.9207048
Abstract
The Variational AutoEncoder (VAE) learns an inference model and a generative model simultaneously, but only one of the two can be learned at the optimum; this behaviour is a consequence of the ELBO learning objective, which is optimised by a non-informative generator. To address this issue, we provide a learning objective, the Variational InfoMax (VIM), that learns a maximally informative generator while keeping the network capacity bounded. The contribution of the VIM derivation is twofold: an objective that learns both an optimal inference model and an optimal generative model, and an explicit definition of the network capacity, an estimate of the network's robustness.
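For context on the ELBO that the abstract critiques, the sketch below shows the standard VAE objective in PyTorch. It is illustrative only: the function name, the Gaussian-likelihood reconstruction term, and all variable names are our assumptions, and this is not the VIM objective derived in the paper.

```python
# Minimal sketch of the (negative) ELBO for a VAE with a Gaussian encoder
# q(z|x) = N(mu, diag(exp(logvar))) and a standard normal prior p(z) = N(0, I).
# Illustrative only; not the VIM objective from the paper.
import torch
import torch.nn.functional as F

def negative_elbo(x, x_recon, mu, logvar):
    # Reconstruction term: -E_q[log p(x|z)], here a Gaussian likelihood up to a constant.
    recon = F.mse_loss(x_recon, x, reduction="sum")
    # KL(q(z|x) || p(z)) in closed form for diagonal Gaussians. This term limits
    # how much information the latent code carries about the data, which is why
    # an ELBO-optimal solution can be reached by a non-informative generator.
    kl = -0.5 * torch.sum(1.0 + logvar - mu.pow(2) - logvar.exp())
    return recon + kl
```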
Keywords
Generators; Entropy; Mutual information; Robustness; Task analysis; Encoding; Data models
Status | Published |
---|---|
Title of series | IEEE International Joint Conference on Neural Networks (IJCNN) |
Publication date | 31/12/2020 |
Publication date online | 19/11/2020 |
Publisher | IEEE |
Place of publication | Piscataway, NJ |
ISSN of series | 2161-4407 |
ISBN | 978-1-7281-6927-9 |
eISBN | 978-1-7281-6926-2 |
Conference | IJCNN 2020 - International Joint Conference on Neural Networks |
Conference location | Glasgow, UK |
Dates | 19/07/2020 - 24/07/2020 |
People (1)
Bruce Graham, Emeritus Professor, Computing Science