
Simple sensors provide inputs for cognitive robots

Patanè L.
2009-01-01

Abstract

Machines and robots able to solve an ever-increasing number of tasks will continue to be integrated into our everyday lives. What is still lacking for a real breakthrough is a suitable degree of flexibility and adaptability that would allow a cognitive robot to deal with dynamically changing environments and situations that cannot be designed for a priori. In this paper, we review a series of “show-case cognitive robots” in which sensors measure distance, contact, or visual data to provide suitable input for emergent behaviors that solve specific goals. In most of our simulations and experiments we created models that follow biological principles found in insects. First, we review two paradigms for animal locomotion: the Central Pattern Generator (CPG) and a reflex-based approach. We show how simple contact sensors can provide efficient feedback for sophisticated and adaptive locomotion strategies. Next, we show how simple (lower-level) sensors can be used to train more complex (higher-level) ones with data (initially nothing more than clusters of pixels) that associate “meanings” with visual details through bio-inspired processing algorithms. The last example shows the emergence of cognitive schemes in spatio-temporal nonlinear neural lattices induced by sensory events.
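The Central Pattern Generator (CPG) paradigm named in the abstract is commonly modeled as a ring of coupled oscillators that settle into a rhythmic gait. Below is a minimal sketch of that idea, assuming antiphase nearest-neighbor phase coupling and Euler integration; the function names, gains, and four-leg layout are illustrative assumptions, not taken from the paper.

```python
import math

def cpg_step(phases, dt=0.01, omega=2 * math.pi, coupling=2.0):
    """Advance a ring of phase oscillators by one Euler step.

    Nearest neighbors are pulled toward antiphase (a pi offset),
    which yields the alternating rhythm used for stepping legs.
    """
    n = len(phases)
    advanced = []
    for i, theta in enumerate(phases):
        left = phases[(i - 1) % n]
        right = phases[(i + 1) % n]
        # sin(neighbor - self - pi) = -sin(neighbor - self),
        # which makes the antiphase-locked state stable.
        dtheta = omega + coupling * (
            math.sin(left - theta - math.pi) + math.sin(right - theta - math.pi)
        )
        advanced.append((theta + dtheta * dt) % (2 * math.pi))
    return advanced

def motor_outputs(phases):
    """Map oscillator phases to joint drive signals in [-1, 1]."""
    return [math.sin(theta) for theta in phases]

# Four "legs" started near the alternating pattern lock into it.
phases = [0.1, 3.0, 0.2, 3.3]
for _ in range(5000):
    phases = cpg_step(phases)
```

A reflex-based controller, by contrast, would replace the internal oscillators with rules triggered directly by contact sensors (for example, lifting a leg when it loses ground contact); the paper reviews both paradigms.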

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11570/3148494
Warning: the displayed data have not been validated by the university.

Citations
  • PMC: not available
  • Scopus: 14
  • Web of Science: 10