
Article

Training and making calculations with mixed order hyper-networks

Details

Citation

Swingler K & Smith L (2014) Training and making calculations with mixed order hyper-networks. Neurocomputing, 141, pp. 65-75. https://doi.org/10.1016/j.neucom.2013.11.041

Abstract
A neural network with mixed order weights, n neurons and a modified Hebbian learning rule can learn any function f: {-1,1}^n → R and reproduce its output as the network's energy function. The network weights are equal to Walsh coefficients, the fixed point attractors are local maxima in the function, and partial sums across the weights of the network calculate averages for hyperplanes through the function. If the network is trained on data sampled from a distribution, then marginal and conditional probability calculations may be made and samples from the distribution generated from the network. These qualities make the network ideal for optimisation fitness function modelling and make the relationships amongst variables explicit in a way that architectures such as the MLP do not.
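The abstract's identification of network weights with Walsh coefficients can be illustrated with a short sketch. This is not the paper's training algorithm, only the standard Walsh (Fourier) expansion on {-1,1}^n it refers to: each coefficient is the average of f(x) multiplied by the corresponding Walsh basis function, and summing the weighted basis functions reproduces f exactly, as the paper's energy function does. All function and variable names here are illustrative.

```python
import itertools

def walsh_coefficients(f, n):
    """Walsh coefficient for each subset S of {0..n-1}: the average of
    f(x) * chi_S(x) over all 2^n points of {-1,1}^n, where
    chi_S(x) = product of x_i for i in S."""
    points = list(itertools.product([-1, 1], repeat=n))
    coeffs = {}
    for r in range(n + 1):
        for S in itertools.combinations(range(n), r):
            total = 0.0
            for x in points:
                chi = 1
                for i in S:
                    chi *= x[i]
                total += f(x) * chi
            coeffs[S] = total / len(points)
    return coeffs

def energy(coeffs, x):
    """Reconstruct f(x) as the weighted sum of Walsh functions -- the
    role the paper assigns to the network's energy function."""
    value = 0.0
    for S, w in coeffs.items():
        chi = 1
        for i in S:
            chi *= x[i]
        value += w * chi
    return value

# Example: f(x) = x0*x1 + x2 is recovered exactly at every point.
f = lambda x: x[0] * x[1] + x[2]
c = walsh_coefficients(f, 3)
for x in itertools.product([-1, 1], repeat=3):
    assert abs(energy(c, x) - f(x)) < 1e-9
```

In this example the only non-zero coefficients are those for the subsets {0,1} and {2}, mirroring the claim that the weights make the relationships amongst variables explicit.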

Keywords
High order neural networks; Optimisation; Walsh functions

Journal
Neurocomputing: Volume 141

Status: Published
Publication date: 31/10/2014
Publication date online: 08/04/2014
Date accepted by journal: 27/11/2013
URL
Publisher: Elsevier
ISSN: 0925-2312

People (2)

Professor Leslie Smith

Emeritus Professor, Computing Science

Professor Kevin Swingler

Professor, Computing Science