Abstract: A model of neural networks on the small-world
topology, combining metric (local) and random connectivity, is investigated.
The synaptic weights are random, driving the neural activity towards
a chaotic state. An ordered macroscopic neuron
state is induced by a bias in the network connections. When the
connections are mainly local, the network exhibits a block-like
structure. The topology and the bias are found to compete in driving
the network towards either a global or a block activity
ordering, depending on the initial conditions.
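The small-world connectivity with biased random weights described above can be sketched as follows. This is a minimal illustration assuming a Watts-Strogatz-style construction (ring lattice with probabilistic rewiring); the function name, parameter values, and the Gaussian-plus-bias weight choice are illustrative assumptions, not details taken from the paper.

```python
import random

def small_world_weights(n=100, k=4, p=0.1, bias=0.5, seed=0):
    """Build a small-world synaptic weight matrix (illustrative sketch).

    Each neuron links to its k nearest neighbours on a ring (the metric,
    local part); each local link is rewired to a random neuron with
    probability p (the long-range part).  Weights are random Gaussian
    values shifted by a positive bias, which favours an ordered
    macroscopic activity state over a chaotic one.
    """
    rng = random.Random(seed)
    w = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for d in range(1, k // 2 + 1):
            j = (i + d) % n
            if rng.random() < p:              # rewire: long-range shortcut
                j = rng.randrange(n)
                while j == i or w[i][j] != 0.0:
                    j = rng.randrange(n)
            weight = rng.gauss(0.0, 1.0) + bias
            w[i][j] = weight
            w[j][i] = weight                  # symmetric synapses
    return w
```

With `p` near 0 the connectivity stays mostly local (block-like regime); raising `p` introduces the long-range links that compete with the local structure.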
Abstract: An attractor neural network on the small-world topology
is studied. A learning pattern is presented to the network, then
a stimulus carrying local information is applied to the neurons, and
the retrieval of the block-like structure is investigated. Synaptic noise
decreases the memory capacity. The change of stability from local
to global attractors is shown to depend on the long-range character
of the network connectivity.
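The retrieval setup above can be sketched as a Hopfield-like network on a small-world graph: one pattern is stored with a Hebbian rule on the sparse links, a local stimulus clamps a block of neurons to the pattern, and the overlap measures retrieval. This is a hedged toy sketch; the function name, the single-pattern Hebbian rule, the zero-temperature asynchronous dynamics, and all parameter values are assumptions for illustration, not the paper's actual model.

```python
import random

def retrieve_block(n=100, k=6, p=0.05, block=20, steps=4000, seed=0):
    """One-pattern retrieval on a small-world graph (illustrative sketch).

    A random pattern xi is stored via a Hebbian rule restricted to the
    small-world links; a stimulus clamps a local block of neurons to xi,
    and the overlap of the final state with xi measures retrieval.
    """
    rng = random.Random(seed)
    xi = [rng.choice((-1, 1)) for _ in range(n)]
    # small-world neighbour lists: ring lattice plus random rewiring
    nbrs = [set() for _ in range(n)]
    for i in range(n):
        for d in range(1, k // 2 + 1):
            j = rng.randrange(n) if rng.random() < p else (i + d) % n
            if j != i:
                nbrs[i].add(j)
                nbrs[j].add(i)
    s = [rng.choice((-1, 1)) for _ in range(n)]
    for i in range(block):                    # local stimulus clamps a block
        s[i] = xi[i]
    for _ in range(steps):                    # asynchronous dynamics
        i = rng.randrange(n)
        if i < block:
            continue                          # clamped neurons stay fixed
        h = sum(xi[i] * xi[j] * s[j] for j in nbrs[i])  # Hebbian local field
        if h != 0:
            s[i] = 1 if h > 0 else -1
    return sum(s[i] * xi[i] for i in range(n)) / n
```

Raising `p` strengthens the long-range character of the connectivity, which in this sketch lets the locally stimulated block spread the pattern globally rather than remain a local attractor.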