
The winnerless competition paradigm in cellular nonlinear networks: Models and applications

Patanè L.
;
2009-01-01

Abstract

Starting from the biological background on the olfactory architecture of both insects and mammals, different nonlinear systems able to respond to spatially distributed external stimuli with spatio-temporal dynamics have been investigated in the last decade. Among these is a class of neural networks that produces quasi-periodic trajectories passing near heteroclinic contours, which prove to be global attractors of the system. For this reason, these networks are called winnerless competition (WLC) networks. The sequence of saddle points crossed by each trajectory depends on the spatial input presented to the network and can be used as a ‘code’ representing a specific class of stimuli. Thanks to this intrinsic discrimination capability, WLC networks are often used for classification. In this paper, this capability is exploited within a framework for action-oriented perception: WLC networks are used as bio-inspired architectures for the association between stimuli and ‘percepts’. After presenting the theoretical basis of the WLC network in the classic Lotka–Volterra system, we investigate how WLC networks can be formalized in terms of cellular nonlinear networks (CNNs) hosting different kinds of cells: the FitzHugh–Nagumo neuron, the Izhikevich neuron and the single-layer CNN standard cell. In order to find efficient ways to code environmental stimuli for action generation, we analyze and compare these WLC-based CNNs in terms of the number of generated classes and robustness to initial conditions. Based on the simulation results, we apply the best-performing system to a perceptual task involving navigation and obstacle avoidance. We demonstrate how the large memory capacity shown by the WLC–CNN contributes to the new perceptual framework for autonomous artificial agents, where the association between stimuli and sequences is learned through experience.
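As background for the Lotka–Volterra formulation mentioned in the abstract, the following is a minimal sketch of a three-neuron generalized Lotka–Volterra WLC network. The inhibition matrix, growth rates and the Euler integration scheme are illustrative assumptions (they are not taken from the paper); the asymmetric lateral inhibition satisfies the usual heteroclinic condition (ρ_ij < 1 < ρ_ji for each competing pair), so activity cycles from one transient ‘winner’ to the next, and the resulting switching sequence is the kind of stimulus-dependent code the abstract refers to.

```python
import numpy as np

# Generalized Lotka-Volterra rate model (illustrative parameters):
#   da_i/dt = a_i * (sigma_i - sum_j rho_ij * a_j)
# An asymmetric inhibition matrix rho produces a heteroclinic cycle:
# no neuron wins permanently; activity visits the saddle points in sequence.

N = 3
rho = np.array([[1.0, 0.5, 2.0],
                [2.0, 1.0, 0.5],
                [0.5, 2.0, 1.0]])   # asymmetric lateral inhibition
sigma = np.ones(N)                  # stimulus-dependent growth rates

def simulate(a0, dt=0.01, steps=20000):
    """Integrate with forward Euler and record the sequence of winners."""
    a = np.array(a0, dtype=float)
    winners = []
    for _ in range(steps):
        a += dt * a * (sigma - rho @ a)
        a = np.clip(a, 1e-9, None)      # keep activities strictly positive
        w = int(np.argmax(a))
        if not winners or winners[-1] != w:
            winners.append(w)           # a switch: the 'code' grows by one symbol
    return winners

seq = simulate([0.6, 0.3, 0.1])
print(seq)   # cyclic switching sequence, e.g. starting with neuron 0
```

In a WLC classifier the stimulus enters through the growth rates (here `sigma`), so different inputs select different switching sequences, which serve as the class code.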
Files in this item:
There are no files associated with this item.

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11570/3148729
Warning: the displayed data have not been validated by the University.

Citations
  • Scopus: 18
  • Web of Science: 13