Implementation of a CNN-based perceptual framework on a roving robot
Patanè L.
2008-01-01
Abstract
In this paper, a recently introduced and promising methodology for robot perception is applied to autonomously learn robot navigation in an unstructured environment. Perception is here considered as the spontaneous, environmentally mediated emergence of Turing patterns in CNNs, which act as perceptual states. These patterns, plastically associated with suitable actions, lead the robot to solve its task autonomously. Following this concept, robot behavior (in this case navigation) is reflected, within the robot control neural network, in a virtual navigation through the different basins of attraction of the generated patterns. The whole architecture was implemented in FPGA-based hardware embedded on a roving robot. The paper reports the perceptual architecture together with experimental results on the roving robot. ©2008 IEEE.
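The abstract describes perception as the emergence of Turing patterns in a CNN (in this line of work, a Cellular Nonlinear Network), with each pattern's basin of attraction associated with a robot action. The sketch below is a minimal, hypothetical illustration of that loop, not the authors' FPGA design: a two-layer reaction-diffusion CNN is initialized from sensor readings, relaxed toward a steady pattern, and the resulting pattern is matched against stored prototypes to select an action. The grid size, reaction terms, parameters, and the three actions are illustrative assumptions.

```python
# Minimal sketch (not the authors' implementation): a two-layer reaction-diffusion
# cellular nonlinear network whose steady-state pattern is treated as a perceptual
# state and mapped to a robot action.  Grid size, templates, gains and the action
# set are illustrative assumptions.
import numpy as np

GRID = 8      # assumed 8x8 CNN grid
STEPS = 400   # Euler integration steps
DT = 0.05     # time step

def cnn_output(x):
    """Standard CNN piecewise-linear output nonlinearity, saturating at +/-1."""
    return 0.5 * (np.abs(x + 1.0) - np.abs(x - 1.0))

def laplacian(z):
    """Discrete Laplacian with replicated (zero-flux) boundaries."""
    zp = np.pad(z, 1, mode="edge")
    return zp[:-2, 1:-1] + zp[2:, 1:-1] + zp[1:-1, :-2] + zp[1:-1, 2:] - 4.0 * z

def evolve(u0, v0, du=0.1, dv=1.0, mu=0.7):
    """Integrate a two-layer reaction-diffusion CNN until it settles on a pattern."""
    u, v = u0.copy(), v0.copy()
    for _ in range(STEPS):
        yu, yv = cnn_output(u), cnn_output(v)
        # Reaction terms form a simple activator-inhibitor pair (illustrative choice).
        u += DT * (-u + (1.0 + mu) * yu - yv + du * laplacian(yu))
        v += DT * (-v + yu + (1.0 + mu) * yv + dv * laplacian(yv))
    return cnn_output(u)

# Hypothetical prototype patterns, each standing for a basin of attraction that has
# been plastically associated (e.g. through learning) with an action command.
rng = np.random.default_rng(0)
prototypes = {
    "turn_left":  np.sign(rng.standard_normal((GRID, GRID))),
    "turn_right": np.sign(rng.standard_normal((GRID, GRID))),
    "go_forward": np.sign(rng.standard_normal((GRID, GRID))),
}

def perceive_and_act(sensor_frame):
    """Map a sensor-driven initial condition to the action of the nearest prototype."""
    pattern = evolve(sensor_frame, 0.1 * rng.standard_normal(sensor_frame.shape))
    action = min(prototypes, key=lambda a: np.linalg.norm(pattern - prototypes[a]))
    return action, pattern

if __name__ == "__main__":
    # On the robot, distance-sensor readings would set the initial CNN state.
    fake_sensors = rng.uniform(-1.0, 1.0, size=(GRID, GRID))
    action, _ = perceive_and_act(fake_sensors)
    print("selected action:", action)
```

In the paper's framework the pattern-to-action association is learned rather than fixed, and the whole loop runs in FPGA hardware on board the rover; the prototype matching above only stands in for that basin-of-attraction mapping.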