Stable improved softmax using constant normalisation

S. Lim, D. Lee

Research output: Contribution to journal › Article › peer-review

10 Citations (Scopus)

Abstract

In deep learning architectures, rectified linear unit (ReLU) based functions are widely used as activation functions in hidden layers, while the softmax is used in output layers. Two critical problems of the softmax are identified, and an improved softmax method is proposed to resolve them. The proposed method minimises the instability of the softmax while reducing its losses. Moreover, the method is straightforward, so its computational complexity is low, yet it is well founded and operates robustly. The proposed method can therefore replace existing softmax functions.
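The abstract does not spell out the method's details, but the instability it refers to is the well-known overflow/underflow of the exponential inside the softmax. The standard remedy, which constant-normalisation approaches build on, exploits the shift invariance of the softmax: subtracting a constant from every logit leaves the output unchanged while keeping the exponentials in range. Below is a minimal NumPy sketch of that stabilisation; the function names are illustrative and not taken from the paper.

```python
import numpy as np

def naive_softmax(z):
    """Naive softmax: exp() overflows for large logits (e.g. z ~ 1000)."""
    e = np.exp(z)
    return e / e.sum()

def stable_softmax(z):
    """Shift-stabilised softmax.

    Softmax is invariant to adding a constant to all logits, so
    subtracting the maximum makes the largest exponent exactly 0
    and prevents overflow without changing the result.
    """
    e = np.exp(z - np.max(z))
    return e / e.sum()

if __name__ == "__main__":
    logits = np.array([1000.0, 1001.0, 1002.0])
    print(naive_softmax(logits))   # [nan nan nan] -- exp() overflows
    print(stable_softmax(logits))  # [0.0900 0.2447 0.6652]
```

Both functions compute the same mathematical quantity on well-scaled inputs; only the stabilised version survives large logits, which is the failure mode the paper targets.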

Original language: English
Pages (from-to): 1504-1506
Number of pages: 3
Journal: Electronics Letters
Volume: 53
Issue number: 23
DOIs
Publication status: Published - 9 Nov 2017

Bibliographical note

Publisher Copyright:
© The Institution of Engineering and Technology 2017.
