Volume 10, Number 10, July 2020
Coding with Logistic Softmax Sparse Units
Authors
Gustavo A. Lado and Enrique C. Segura, Universidad de Buenos Aires, Argentina
Abstract
This paper presents a new technique for the efficient coding of high-dimensional vectors that overcomes the typical drawbacks of the classical approaches, both local representations and distributed codings. The main advantages and disadvantages of these classical approaches are reviewed, and a novel, fully parameterized strategy is introduced to obtain representations with intermediate levels of locality and sparsity, according to the needs of the particular problem at hand. The proposed method, called COLOSSUS (COding with LOgistic Softmax Sparse UnitS), is based on an algorithm that permits a smooth transition between the two extreme behaviours, local and distributed, via a parameter that regulates the sparsity of the representation. The activation function is of the logistic type. We propose an appropriate cost function and derive a learning rule that turns out to be similar to Oja's Hebbian learning rule. Experiments are reported showing the efficiency of the proposed technique.
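A minimal sketch of the mechanism the abstract describes, under stated assumptions: logistic unit activations are gated by a softmax whose inverse-temperature parameter beta acts as the sparsity knob (large beta yields near-local, one-hot-like codes; small beta yields distributed codes), and weights are trained with a per-unit Oja-style Hebbian update. The gating form, the function names, and all parameter values here are illustrative assumptions, not the paper's exact formulation.

import numpy as np

rng = np.random.default_rng(0)

def logistic(x):
    # Standard logistic activation.
    return 1.0 / (1.0 + np.exp(-x))

def encode(W, x, beta):
    # Logistic pre-activations gated by a softmax over units.
    # beta regulates sparsity: beta -> 0 gives a uniform gate
    # (distributed code), large beta concentrates the gate on the
    # most active unit (local code). This interpolation is an
    # assumed reading of the abstract.
    a = logistic(W @ x)
    g = np.exp(beta * a)
    g /= g.sum()
    return a * g

def oja_step(W, x, h, lr=0.01):
    # Oja-style Hebbian update: Hebbian term minus a decay term
    # that keeps each weight row from growing without bound.
    return W + lr * (np.outer(h, x) - (h ** 2)[:, None] * W)

# Toy run: 16 coding units on 64-dimensional normalized inputs.
W = rng.normal(scale=0.1, size=(16, 64))
for _ in range(1000):
    x = rng.normal(size=64)
    x /= np.linalg.norm(x)
    h = encode(W, x, beta=5.0)   # beta is the sparsity parameter
    W = oja_step(W, x, h)

Sweeping beta in the sketch traces the smooth transition the abstract claims: the same network moves from broadly distributed codes toward one-hot-like local codes without any change to the learning rule.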
Keywords
Neural Networks, Sparse Coding, Autoencoders.