An efficient dilution strategy for constructing sparsely connected neural networks

Montemurro, Marcelo A. and Tamarit, Francisco A. (2001). An efficient dilution strategy for constructing sparsely connected neural networks. Physica A: Statistical Mechanics and its Applications, 294(3-4) pp. 340–350.

DOI: https://doi.org/10.1016/S0378-4371(01)00123-6

Abstract

In this paper we present a modification of the strongly diluted Hopfield model in which the dilution scheme, instead of being random, is biologically inspired. We show that the resulting sparsely connected neural network displays striking performance as an associative memory device, even for noisy, correlated patterns. We show analytically that, under certain conditions, the model has no upper value of the parameter α = p/C (where p is the number of patterns stored in the system and C is the coordination number of each neuron) beyond which it becomes useless as a memory device, as is found for other models. By means of numerical simulations we demonstrate that, under much weaker assumptions, the network still performs highly efficiently, in good agreement with the analytical results.
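
The quantities in the abstract can be made concrete with a small numerical sketch. The Python code below is not the paper's biologically inspired dilution scheme (which the abstract does not describe); it assumes a plain Hebbian rule on a randomly diluted network with a fixed coordination number C per neuron, and measures the retrieval overlap for a noisy cue, so that the load parameter α = p/C has a directly observable meaning. All names and parameter values are illustrative.

    # Minimal sketch of a strongly diluted Hopfield network (illustrative only).
    # Random dilution with fixed coordination number C is assumed here as a
    # baseline; it is NOT the biologically inspired scheme of the paper.
    import numpy as np

    rng = np.random.default_rng(0)

    N = 1000   # number of neurons
    C = 20     # coordination number: incoming connections per neuron
    p = 5      # number of stored patterns, so alpha = p / C = 0.25

    # Random binary (+1/-1) patterns to store.
    patterns = rng.choice([-1, 1], size=(p, N))

    # For each neuron i, pick C presynaptic neurons j != i (random dilution).
    neighbors = np.array([
        rng.choice(np.delete(np.arange(N), i), size=C, replace=False)
        for i in range(N)
    ])

    # Hebbian couplings restricted to the diluted connectivity:
    # J[i, k] couples neuron i to its k-th chosen neighbor.
    J = np.einsum("mi,mik->ik", patterns, patterns[:, neighbors]) / C

    def recall(state, steps=10):
        """Parallel dynamics: s_i <- sign(sum_k J[i, k] * s_{neighbor(i, k)})."""
        for _ in range(steps):
            h = np.einsum("ik,ik->i", J, state[neighbors])
            state = np.where(h >= 0, 1, -1)
        return state

    # Start from a noisy version of the first pattern (about 10% of bits flipped)
    # and measure the final overlap with the stored pattern.
    noisy = patterns[0] * rng.choice([1, -1], size=N, p=[0.9, 0.1])
    retrieved = recall(noisy)
    print("overlap with stored pattern:", retrieved @ patterns[0] / N)

An overlap close to 1 indicates successful retrieval; sweeping p at fixed C gives a rough, simulation-level picture of how performance depends on α in this simplified random-dilution setting.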
