Maximum A Posteriori Decision and Evaluation of Class Probabilities by Boltzmann Perceptron Classifiers
Abstract
Neural networks offer a valuable alternative to Bayesian classifiers for evaluating a posteriori class probabilities when classifying stochastic patterns. In contrast to the Bayesian classifier, the “neural” classifier makes no assumptions about the probabilistic nature of the problem, and is thus universal in the sense that it is not restricted to an underlying probabilistic model. Instead, it adjusts itself to a given training set by a learning algorithm, and can therefore learn the stochastic properties of the specific problem. The a posteriori probabilities can, in principle, be computed by stochastic networks such as the Boltzmann machine. However, these networks are computationally extremely inefficient. In this paper we show that the a posteriori class probabilities can be efficiently computed by a deterministic feedforward network which we call the Boltzmann Perceptron Classifier (BPC). Maximum a posteriori (MAP) classifiers are also constructed as a special case of the BPC. Structural relationships between the BPC and a conventional multilayer Perceptron (MLP) are given, and it is demonstrated that rather intricate boundaries between classes can be formed even with a relatively modest number of network units. Simulation results show that the BPC is comparable in performance to a Bayesian classifier, although the BPC makes no assumptions about the probabilistic model of the problem. © 1990, IEEE
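The core idea of the abstract, posterior class probabilities computed by a deterministic feedforward network with a MAP decision taken as the argmax, can be sketched as follows. This is a minimal illustrative sketch, not the paper's exact formulation: the parameter names (`W`, `b`, `U`, `c`), shapes, and the soft-plus form of the per-class free-energy scores are all assumptions standing in for the BPC's actual architecture.

```python
import numpy as np

def bpc_class_probabilities(x, W, b, U, c):
    """Posterior class probabilities from a hypothetical BPC-style network.

    Assumed parameterization: each class k gets a free-energy score
    F_k(x) = c_k + sum_j log(1 + exp(w_j . x + b_j + u_jk)),
    where the soft-plus terms stand in for summing over binary hidden-unit
    states; posteriors are then a softmax over the scores.
    """
    pre = W @ x + b  # (H,) hidden pre-activations shared across classes
    # (H, K) class-dependent terms, summed over hidden units -> (K,) scores
    scores = c + np.logaddexp(0.0, pre[:, None] + U).sum(axis=0)
    scores -= scores.max()  # numerical stability before the softmax
    p = np.exp(scores)
    return p / p.sum()  # a posteriori class probabilities, summing to 1

def map_decision(x, W, b, U, c):
    """MAP classifier: choose the class with the largest posterior."""
    return int(np.argmax(bpc_class_probabilities(x, W, b, U, c)))
```

Because the network is deterministic and feedforward, a single pass yields all class posteriors, avoiding the sampling that makes stochastic Boltzmann machines so costly.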