We construct a single-layer feedforward network and analyze it using information-theoretic tools, namely mutual information and the data processing inequality. We derive a threshold on the number of hidden nodes required to achieve good classification performance; performance is expected to saturate once the number of hidden nodes exceeds this threshold. The threshold is further verified by experimental studies on benchmark datasets.

Index Terms-Neural networks, mutual information, extreme learning machine, invertible function.
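The kind of network the abstract refers to can be sketched in the extreme-learning-machine style: random hidden-layer weights followed by a least-squares fit of the output weights. The dataset, activation function, and hidden-node counts below are illustrative assumptions, not the paper's actual setup; accuracy on this toy task tends to saturate once the hidden layer is large enough, echoing the claimed threshold behavior.

```python
import numpy as np

def elm_train(X, Y, n_hidden, rng):
    """Single-layer feedforward network, ELM-style:
    random hidden weights, least-squares output weights."""
    W = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    H = np.tanh(X @ W + b)                         # hidden-layer activations
    beta, *_ = np.linalg.lstsq(H, Y, rcond=None)   # output weights (least squares)
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Toy two-class problem (XOR-like labels); purely illustrative.
rng = np.random.default_rng(0)
X = rng.standard_normal((400, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(float)
Y = np.stack([1 - y, y], axis=1)                   # one-hot targets

accuracies = {}
for n_hidden in (2, 20, 200):
    W, b, beta = elm_train(X, Y, n_hidden, rng)
    accuracies[n_hidden] = float(
        np.mean(elm_predict(X, W, b, beta).argmax(axis=1) == y)
    )
print(accuracies)
```

A few hidden nodes cannot separate the XOR-like classes, while a sufficiently wide hidden layer can; adding nodes beyond that point yields diminishing returns, which is the saturation the abstract describes.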