Information-Preserving Networks and the Mirrored Transform

Palmieri F.; Di Gennaro G.
2019

Abstract

A deep network with rectifier units (ReLUs) is used to build a multi-layer transform under the constraint that the input can be reconstructed in the backward flow. The structure is analysed with respect to the number of units needed in each layer to preserve information. Specific networks with random and fixed weights are presented, and the Mirrored Transform is proposed. The resulting network generalizes classical linear transforms and provides a progressive unfolding of the input space into embeddings that can serve as a basis for classification and filtering in an unsupervised manner.
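
The paper itself is not reproduced here, but the reconstruction constraint the abstract describes can be illustrated with the elementwise identity ReLU(z) - ReLU(-z) = z: a layer that applies both W and -W (doubling the number of units) loses no information whenever W has full column rank. The NumPy sketch below only illustrates that idea under stated assumptions; the dimensions, the random weight choice, and the function names are hypothetical and are not the authors' implementation.

    import numpy as np

    # Hypothetical sizes: n inputs, a fixed random m x n weight matrix W,
    # and a mirrored layer with 2m ReLU units (m >= n keeps W full column rank).
    rng = np.random.default_rng(0)
    n, m = 4, 6
    W = rng.standard_normal((m, n))

    def mirrored_layer(x):
        # Forward flow: stack ReLU(Wx) and ReLU(-Wx).
        z = W @ x
        return np.concatenate([np.maximum(z, 0.0), np.maximum(-z, 0.0)])

    def reconstruct(y):
        # Backward flow: ReLU(z) - ReLU(-z) = z recovers Wx exactly,
        # and the pseudoinverse of W then recovers x.
        z = y[:m] - y[m:]
        return np.linalg.pinv(W) @ z

    x = rng.standard_normal(n)
    y = mirrored_layer(x)
    assert np.allclose(reconstruct(y), x)  # lossless round trip

Note that the mirrored layer needs 2m >= 2n units to make the round trip exact, which is in the spirit of the abstract's analysis of how many units each layer requires to preserve information.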
ISBN: 978-1-7281-0824-7
Files for this item:
No files are associated with this item.

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this item: https://hdl.handle.net/11591/429175
Citations
  • PMC: n/a
  • Scopus: 0
  • Web of Science: n/a