Extensive parallel processing on scale-free networks

Barra, Adriano
2014-01-01

Abstract

We adapt belief-propagation techniques to study the equilibrium behavior of a bipartite spin glass, with interactions between two sets of N and P=αN spins, each having an arbitrary degree, i.e., number of interaction partners in the opposite set. An equivalent view is then of a system of N neurons storing P diluted patterns via Hebbian learning, in the high storage regime. Our method allows analysis of parallel pattern processing on a broad class of graphs, including those with pattern asymmetry and heterogeneous dilution; previous replica approaches assumed homogeneity. We show that in a large part of the parameter space of noise, dilution, and storage load, delimited by a critical surface, the network behaves as an extensive parallel processor, retrieving all P patterns in parallel without falling into spurious states due to pattern cross talk, as would be typical of the structural glassiness built into the network. Parallel extensive retrieval is more robust for homogeneous degree distributions, and is not disrupted by asymmetric pattern distributions. For scale-free pattern degree distributions, Hebbian learning induces modularity in the neural network; thus, our Letter gives the first theoretical description of extensive information processing on modular and scale-free networks.
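To make the objects mentioned in the abstract concrete, the short Python sketch below builds diluted Hebbian couplings from P = αN sparse patterns and checks parallel retrieval by measuring the Mattis magnetization of each pattern. It is only an illustrative finite-size simulation under assumed parameter values (N, α, the dilution d, the zero-temperature asynchronous dynamics, and the mixture initialization are all choices made here for illustration); the Letter's actual analysis uses belief propagation on the bipartite graph at equilibrium, which this sketch does not implement.

import numpy as np

# Illustrative sketch (not the belief-propagation analysis of the Letter):
# N neurons store P = alpha*N diluted patterns via Hebbian learning, and we
# check whether several patterns are retrieved in parallel by measuring the
# Mattis magnetization m_mu of each pattern. All numbers are assumptions.

rng = np.random.default_rng(0)

N = 2000          # number of neurons (sigma spins)
alpha = 0.05      # storage load, P = alpha * N
d = 0.05          # dilution: probability that a pattern entry is nonzero
P = int(alpha * N)

# Diluted binary patterns: xi in {-1, 0, +1}, nonzero with probability d.
xi = rng.choice([-1, 0, 1], size=(P, N), p=[d / 2, 1 - d, d / 2])

# Hebbian couplings J_ij = (1/N) * sum_mu xi_i^mu xi_j^mu, no self-coupling.
J = xi.T @ xi / N
np.fill_diagonal(J, 0.0)

# Start from a noisy mixture of all patterns and relax with zero-temperature
# asynchronous dynamics (each neuron aligns with its local field).
sigma = np.sign(xi.sum(axis=0) + 1e-9 * rng.standard_normal(N))

for sweep in range(20):
    for i in rng.permutation(N):
        h = J[i] @ sigma              # local field on neuron i
        if h != 0.0:
            sigma[i] = np.sign(h)

# Mattis magnetizations, normalized by the mean support size d*N of a pattern:
# m_mu = (1 / (d*N)) * sum_i xi_i^mu sigma_i
m = xi @ sigma / (d * N)
print("patterns with |m| > 0.5:", int(np.sum(np.abs(m) > 0.5)), "of", P)

In the parallel-retrieval region of the (noise, dilution, load) space described in the abstract, many of the overlaps |m_mu| should come out close to 1; locating the critical surface that bounds this region is what the belief-propagation analysis of the Letter provides.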

Use this identifier to cite or link to this document: https://hdl.handle.net/11587/413881