In this paper, a recently introduced and promising methodology for robot perception is applied to the autonomous learning of robot navigation in an unstructured environment. Perception is here regarded as the spontaneous, environmentally mediated emergence of Turing patterns in Cellular Neural Networks (CNNs), which serve as perceptual states. These states, plastically associated with suitable actions, lead the robot to solve its task autonomously. Following this concept, robot behavior (in this case navigation) is reflected in a virtual navigation through the basins of attraction of the generated patterns within the robot's neural control network. The whole architecture was implemented in FPGA-based hardware embedded on a roving robot. The paper reports the perceptual architecture together with experimental results obtained on the roving robot. ©2008 IEEE.
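The pattern-formation mechanism the abstract alludes to can be illustrated with a minimal reaction-diffusion sketch. The Gray-Scott model below is a generic stand-in, not the paper's actual CNN templates or FPGA implementation; the grid size, diffusion coefficients, and feed/kill rates are all illustrative assumptions. A localized perturbation (standing in for a sensory stimulus) drives an initially near-uniform medium toward a spatially structured Turing pattern.

```python
import numpy as np

def laplacian(Z):
    # 5-point discrete Laplacian with periodic boundaries,
    # i.e. nearest-neighbor coupling on the cell grid
    return (np.roll(Z, 1, 0) + np.roll(Z, -1, 0)
            + np.roll(Z, 1, 1) + np.roll(Z, -1, 1) - 4 * Z)

def turing_pattern(n=64, steps=5000, Du=0.16, Dv=0.08,
                   f=0.035, k=0.060, dt=1.0, seed=0):
    """Gray-Scott reaction-diffusion on an n x n grid.

    All parameters are illustrative; the paper's perceptual states
    arise from CNN dynamics, for which this is only an analogy.
    """
    rng = np.random.default_rng(seed)
    U = np.ones((n, n))   # substrate concentration
    V = np.zeros((n, n))  # activator concentration
    # Perturb a central patch (a stand-in for an external stimulus)
    r = n // 8
    U[n//2 - r:n//2 + r, n//2 - r:n//2 + r] = 0.50
    V[n//2 - r:n//2 + r, n//2 - r:n//2 + r] = 0.25
    V += 0.01 * rng.random((n, n))
    for _ in range(steps):
        uvv = U * V * V
        U += dt * (Du * laplacian(U) - uvv + f * (1 - U))
        V += dt * (Dv * laplacian(V) + uvv - (f + k) * V)
    return U, V

U, V = turing_pattern()
# After relaxation, V is spatially structured rather than uniform
print(float(V.std()))
```

In the paper's architecture, distinct stimuli would drive the network into different basins of attraction, and the resulting pattern (here, the final V field) would be plastically mapped to an action; this sketch only shows the pattern-formation step.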