
Conference Paper (published)

The Variational InfoMax AutoEncoder

Citation

Crescimanna V & Graham B (2020) The Variational InfoMax AutoEncoder. In: 2020 International Joint Conference on Neural Networks. IEEE International Joint Conference on Neural Networks (IJCNN) IJCNN 2020 - International Joint Conference on Neural Networks, Glasgow, UK, 19.07.2020-24.07.2020. Piscataway, NJ: IEEE. https://doi.org/10.1109/IJCNN48605.2020.9207048

Abstract
The Variational AutoEncoder (VAE) learns an inference model and a generative model simultaneously, but only one of the two can be optimal at convergence. This behaviour is a consequence of the ELBO learning objective, which is optimised by a non-informative generator. To address this issue, we propose a learning objective, the Variational InfoMax (VIM), that learns a maximally informative generator while keeping the network capacity bounded. The contribution of the VIM derivation is twofold: an objective that learns both an optimal inference model and an optimal generative model, and an explicit definition of network capacity, an estimate of network robustness.
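The trade-off described in the abstract can be illustrated with a toy calculation. Below is a minimal, hedged sketch (not the paper's actual derivation): `elbo` is the standard ELBO for a diagonal-Gaussian posterior, and `vim_style_objective` is a hypothetical "bounded capacity" variant that penalises only the latent rate in excess of a fixed budget, so the encoder remains informative up to that budget. All function names and the exact form of the capacity constraint are illustrative assumptions.

```python
import math

def gaussian_kl(mu, logvar):
    """KL( N(mu, diag(exp(logvar))) || N(0, I) ) for one diagonal Gaussian."""
    return 0.5 * sum(math.exp(lv) + m * m - 1.0 - lv
                     for m, lv in zip(mu, logvar))

def elbo(recon_loglik, mu, logvar):
    """Standard ELBO: reconstruction log-likelihood minus KL to the prior.
    Maximising this can drive the KL term to zero, i.e. a non-informative
    latent code, which is the pathology the abstract describes."""
    return recon_loglik - gaussian_kl(mu, logvar)

def vim_style_objective(recon_loglik, mu, logvar, capacity):
    """Hypothetical VIM-flavoured objective (illustrative assumption):
    penalise only the latent rate above a fixed `capacity` budget,
    leaving the model free to use an informative code up to that bound."""
    excess = max(gaussian_kl(mu, logvar) - capacity, 0.0)
    return recon_loglik - excess

# With a zero-mean, unit-variance posterior the KL vanishes and both
# objectives reduce to the reconstruction term alone.
print(elbo(-1.0, [0.0, 0.0, 0.0], [0.0, 0.0, 0.0]))
print(vim_style_objective(-1.0, [1.0, 1.0, 1.0], [0.0, 0.0, 0.0], capacity=5.0))
```

Under the capacity budget, an informative posterior (non-zero mean) incurs no penalty, whereas the plain ELBO would subtract its full KL; this is one hedged reading of "maximally informative generator with bounded capacity".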

Keywords
Generators; Entropy; Mutual information; Robustness; Task analysis; Encoding; Data models

Status: Published
Title of series: IEEE International Joint Conference on Neural Networks (IJCNN)
Publication date: 31/12/2020
Publication date online: 19/11/2020
Publisher: IEEE
Place of publication: Piscataway, NJ
ISSN of series: 2161-4407
ISBN: 978-1-7281-6927-9
eISBN: 978-1-7281-6926-2
Conference: IJCNN 2020 - International Joint Conference on Neural Networks
Conference location: Glasgow, UK
Conference dates: 19.07.2020-24.07.2020

People (1)

Professor Bruce Graham
Emeritus Professor, Computing Science
