Information-Preserving Networks and the Mirrored Transform
Palmieri F.; Di Gennaro G.
2019
Abstract
A deep network with rectifier units (ReLUs) is used to build a multi-layer transform under the constraint that the input can be reconstructed in the backward flow. The structure is analysed with reference to the number of units needed in each layer to preserve information. Specific networks with random and fixed weights are presented, and the Mirrored Transform is proposed. The resulting network generalizes classical linear transforms and provides a progressive unfolding of the input space into embeddings that can serve as a basis for classification and filtering in an unsupervised manner.
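As a rough illustration of the reconstruction constraint described above, the sketch below pairs each linear unit with a negated copy, so that a layer of 2N rectified units preserves an N-dimensional input exactly. The function names, the layer width, and the use of a random square weight matrix are assumptions made for illustration, not the paper's exact construction.

```python
import numpy as np

def relu(a):
    return np.maximum(a, 0.0)

rng = np.random.default_rng(0)
N = 4
W = rng.standard_normal((N, N))  # random square weights, invertible with probability 1

def mirrored_forward(x):
    # Hypothetical mirrored layer: each unit w.x is paired with its
    # negated copy -w.x, so no sign information is lost by the ReLUs.
    a = W @ x
    return np.concatenate([relu(a), relu(-a)])  # 2N units for an N-dim input

def mirrored_backward(y):
    # Undo the rectification: relu(a) - relu(-a) = a recovers the
    # pre-activation exactly, then the linear map is inverted.
    a = y[:N] - y[N:]
    return np.linalg.solve(W, a)

x = rng.standard_normal(N)
x_hat = mirrored_backward(mirrored_forward(x))
print(np.allclose(x, x_hat))  # True: the input is reconstructed in the backward flow
```

Under these assumptions, doubling the number of units per layer is what makes the rectified layer information preserving, consistent with the abstract's remark on the number of units necessary in each layer.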