Article
Details
Citation
Swingler K & Smith L (2014) Training and making calculations with mixed order hyper-networks. Neurocomputing, 141, pp. 65-75. https://doi.org/10.1016/j.neucom.2013.11.041
Abstract
A neural network with mixed order weights, n neurons and a modified Hebbian learning rule can learn any function f: {-1,1}^n → R and reproduce its output as the network's energy function. The network weights are equal to Walsh coefficients, the fixed point attractors are local maxima in the function, and partial sums across the weights of the network calculate averages for hyperplanes through the function. If the network is trained on data sampled from a distribution, then marginal and conditional probability calculations may be made and samples from the distribution generated from the network. These qualities make the network ideal for optimisation fitness function modelling, and make the relationships amongst variables explicit in a way that architectures such as the MLP do not.
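To make the Walsh-coefficient relationship concrete, here is a minimal Python sketch (not from the paper): it enumerates all 2^n inputs, estimates one weight per subset of neurons as the mean of f(x) times the product of the corresponding inputs, and checks that the resulting energy function reproduces f exactly. The function names and the brute-force enumeration are illustrative assumptions; the paper trains the weights from data with a modified Hebbian rule.

```python
from itertools import combinations, product

def subset_product(x, S):
    """Product of the inputs indexed by the subset S (the Walsh basis function)."""
    p = 1
    for i in S:
        p *= x[i]
    return p

def walsh_coefficients(f, n):
    """One weight per subset S of the n neurons: w_S = mean over x of f(x) * prod_{i in S} x_i."""
    inputs = list(product([-1, 1], repeat=n))
    weights = {}
    for k in range(n + 1):
        for S in combinations(range(n), k):
            weights[S] = sum(f(x) * subset_product(x, S) for x in inputs) / len(inputs)
    return weights

def energy(weights, x):
    """Network energy: sum over subsets S of w_S * prod_{i in S} x_i."""
    return sum(w * subset_product(x, S) for S, w in weights.items())

if __name__ == "__main__":
    n = 3
    f = lambda x: x[0] * x[1] + 0.5 * x[2]          # any f: {-1,1}^3 -> R
    w = walsh_coefficients(f, n)
    for x in product([-1, 1], repeat=n):
        assert abs(energy(w, x) - f(x)) < 1e-9      # the energy reproduces f exactly
```

Because the Walsh basis functions are orthogonal over {-1,1}^n, the recovered energy matches f at every input; in the paper this exhaustive average is replaced by learning from sampled data.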
Keywords
High order neural networks;
Optimisation;
Walsh functions
Journal
Neurocomputing: Volume 141
Status | Published
---|---
Publication date | 31/10/2014
Publication date online | 08/04/2014
Date accepted by journal | 27/11/2013
URL |
Publisher | Elsevier
ISSN | 0925-2312
People (2)
Emeritus Professor, Computing Science
Professor, Computing Science