Decision and feature fusion over the fractal inference network using camera and range sensors

1998-11-03
Erkmen, İsmet
Erkmen, Aydan Müşerref
Ucar, E.
The objective of this ongoing work is to fuse information from uncertain environmental data gathered by cameras and short-range sensors, including infrared and ultrasound sensors, for strategic target recognition and task-specific action in mobile robot applications. Our present goal in this paper is to demonstrate target recognition for a service robot in a simple office environment. We propose fusing all sensory signals obtained from multiple sensors over a fully layer-connected sensor network that provides an equal-opportunity competitive environment for sensory data, in which signals bearing less uncertainty, less complexity, and fewer inconsistencies with the overall goal survive while others fade out. In our work, this task is achieved as a decision fusion using the Fractal Inference Network (FIN), in which information patterns, or units, modelled as textured belief functions whose fractal dimension reflects their uncertainty, propagate while being processed at the nodes of the network. Each local process at a node performs a multiresolutional feature fusion. In this model, the environment is observed by multiple sensors of different types, resolutions, and spatial locations, without a prescheduled sensing scenario for data gathering. Node activation and the flow of information over the FIN are controlled by a neuro-controller, a concept developed recently as an improvement over the classical Fractal Inference Network. In this paper, a closed-form mathematical representation of decision fusion over the FIN is developed in a form suitable for analysis and applied to a NOMAD mobile robot servicing an office environment.
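The paper develops the FIN fusion mathematics in closed form, which is not reproduced here. As a rough illustration of the competitive idea described in the abstract (signals bearing less uncertainty survive while others fade out), the following minimal Python sketch weights per-sensor belief vectors by exp(-uncertainty); the sensor names, target classes, numbers, and the exponential weighting are illustrative assumptions, not the authors' FIN formulation.

import numpy as np

def fuse_beliefs(beliefs, uncertainties):
    # beliefs:       (n_sensors, n_classes) array; each row sums to 1.
    # uncertainties: (n_sensors,) array; higher values mean noisier data.
    # High-uncertainty sensors receive exponentially smaller weights,
    # so their contribution "fades out" of the fused decision.
    weights = np.exp(-np.asarray(uncertainties, dtype=float))
    weights /= weights.sum()
    fused = weights @ np.asarray(beliefs, dtype=float)
    return fused / fused.sum()

# Hypothetical readings over three office targets (door, desk, person):
beliefs = [
    [0.70, 0.20, 0.10],  # camera: confident
    [0.40, 0.40, 0.20],  # infrared: ambiguous
    [0.30, 0.30, 0.40],  # ultrasound: noisy
]
uncertainties = [0.2, 1.0, 2.5]
print(fuse_beliefs(beliefs, uncertainties))  # the camera's vote dominates

In the paper itself, the weighting emerges from processing textured belief functions at the FIN nodes under neuro-controller flow control, rather than from a fixed exponential rule as assumed in this sketch.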
Citation Formats
İ. Erkmen, A. M. Erkmen, and E. Ucar, “Decision and feature fusion over the fractal inference network using camera and range sensors,” 1998, vol. 3523, Accessed: 00, 2020. [Online]. Available: https://hdl.handle.net/11511/40649.