Time Series Modeling by Perceptrons: A Likelihood Approach
Authors:
Sonmez, Kemal M.
Conference: The World Congress on Neural Networks, Vol. IV, pp. 601-604
Date: July 1, 1993
We consider neural network learning problems in which the objective is to learn the relationship between the inputs and the probability distribution of a proposition. We regard the successive truth values of the proposition as a dependent binary time series whose instantaneous probability of truth is a function of the past behavior of the joint process of the analog inputs and the binary output truth values. In this context, we identify the gradient-descent learning algorithm that uses the Kullback-Leibler relative entropy cost function on a perceptron with a Maximum Partial Likelihood (MPL) estimator of the perceptron model for the probability of a binary event in terms of its covariates. This result has two implications: (i) the neural network models obtained by relative entropy learning inherit the desirable large-sample (i.e., training-set-size) properties of MPL estimates, namely consistency and asymptotic normality; (ii) logistic regression, an important and widely used statistical inference technique, can be implemented efficiently on analog perceptrons for time series modeling and prediction.