Color-based object tracking to improve mobile robot navigation

2002-07-18
Keskinpala, T
Koku, Ahmet Buğra
Keskinpala, HK
Recognition and tracking of objects of interest is an important capability for a robotic system for a variety of reasons. One use of tracking arises in behavior-based robotics, where certain objects or object distributions serve as triggers for different behaviors. Landmark-based navigation is another example. Although tracking can be performed with different sensors, vision is usually preferred. In this paper, a color-based object tracking system is introduced.
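
The abstract does not detail the tracking method itself. As an illustration only, a minimal color-based tracking loop of the kind such a system might use is sketched below with OpenCV; the HSV color range, camera index, and centroid-based output are assumptions for the example, not the paper's actual design.

    # Minimal color-based tracking sketch (illustrative only; not the paper's method).
    import cv2
    import numpy as np

    # Hypothetical HSV range for the tracked color (e.g., a red landmark).
    LOWER = np.array([0, 120, 70])
    UPPER = np.array([10, 255, 255])

    cap = cv2.VideoCapture(0)                      # robot-mounted camera (assumed index 0)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, LOWER, UPPER)      # keep pixels of the target color
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
        m = cv2.moments(mask)
        if m["m00"] > 0:                           # target found: report its centroid
            cx, cy = int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])
            print("target centroid:", cx, cy)      # could feed a navigation behavior
    cap.release()

The centroid of the color mask gives a single image-plane coordinate per frame, which is typically enough to drive a simple steering behavior or to confirm a landmark sighting.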

Suggestions

Face detection in active robot vision
Önder, Murat; Halıcı, Uğur; Department of Electrical and Electronics Engineering (2004)
The main task in this thesis is to design a robot vision system with face detection and tracking capability. The thesis therefore comprises two main parts. First, the face must be detected in an image taken from the camera on the robot. This is a demanding real-time image processing task, so timing constraints are critical. A processing rate of 1 frame/second is targeted, which requires a fast face detection algorithm. The Eigenfa...
Visual detection and tracking of moving objects
Ergezer, Hamza; Leblebicioğlu, Mehmet Kemal (2007-06-13)
In this paper, the primary steps of a visual surveillance system are presented: moving object detection and tracking of these moving objects. A running-average method has been used to detect the moving objects in the video, which is taken from a static camera. Tracking of foreground objects has been realized by using a Kalman filter. After background subtraction, morphological operators are used to remove noise detected as foreground; a minimal illustrative sketch of this detection-and-tracking pipeline is given after these suggestions. Active contour models (snakes) are the segmentation tools for the extracted fo...
The learning and use of traversability affordance using range images on a mobile robot
Ugur, Emre; Dogar, Mehmet R.; Cakmak, Maya; Şahin, Erol (2007-04-14)
We are interested in how the concept of affordances can affect our view of autonomous robot control, and how the results obtained from autonomous robotics can be reflected back on the discussion and study of the concept of affordances. In this paper, we studied how a mobile robot, equipped with a 3D laser scanner, can learn to perceive the traversability affordance and use it to wander in a room filled with spheres, cylinders and boxes. The results showed that after learning, the robot can wander around...
Integration of 2D images and range data for object segmentation and recognition
Bayramoglu, Neslihan; Akman, Oytun; Alatan, Abdullah Aydın; Jonker, Pieter (2009-09-11)
In the field of vision-based robot actuation, in order to manipulate objects in an environment, background separation and object selection are fundamental tasks that should be carried out in a fast and efficient way. In this paper, we propose a method to segment possible object locations in the scene and recognize them via a local-point-based representation. Exploiting the resulting 3D structure of the scene via a time-of-flight camera, background regions are eliminated with the assumption that the objects a...
Camera motion blur and its effect on feature detectors
Üzer, Ferit; Saranlı, Afşar; Department of Electrical and Electronics Engineering (2010)
Perception, and hence the use of visual sensors, is indispensable in mobile and autonomous robotics. Visual sensors such as cameras rigidly mounted on a robot frame are the most common usage scenario. In this case, the motion of the camera due to the motion of the platform, as well as the resulting shocks or vibrations, causes a number of distortions in video frame sequences. The two most important ones are the frame-to-frame changes of the line-of-sight (LOS) and the presence of motion blur in individual fr...
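
The "Visual detection and tracking of moving objects" entry above outlines a running-average background model, morphological cleanup, and Kalman-filter tracking of the foreground blob. The sketch below illustrates that general pipeline with OpenCV; the video file name, learning rate, threshold, and constant-velocity state model are assumptions and do not reproduce the authors' implementation.

    # Illustrative sketch: running-average background subtraction + Kalman tracking.
    import cv2
    import numpy as np

    cap = cv2.VideoCapture("surveillance.avi")     # hypothetical static-camera video
    background = None

    # Constant-velocity Kalman filter: state (x, y, vx, vy), measurement (x, y).
    kalman = cv2.KalmanFilter(4, 2)
    kalman.measurementMatrix = np.eye(2, 4, dtype=np.float32)
    kalman.transitionMatrix = np.array(
        [[1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0], [0, 0, 0, 1]], np.float32)
    kalman.processNoiseCov = np.eye(4, dtype=np.float32) * 1e-2
    kalman.measurementNoiseCov = np.eye(2, dtype=np.float32) * 1e-1

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float32)
        if background is None:
            background = gray.copy()
        cv2.accumulateWeighted(gray, background, 0.05)   # running-average background
        diff = cv2.absdiff(gray, background)
        _, fg = cv2.threshold(diff.astype(np.uint8), 25, 255, cv2.THRESH_BINARY)
        fg = cv2.morphologyEx(fg, cv2.MORPH_OPEN, np.ones((3, 3), np.uint8))

        prediction = kalman.predict()
        m = cv2.moments(fg)
        if m["m00"] > 0:                                 # foreground found: correct filter
            cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
            kalman.correct(np.array([[np.float32(cx)], [np.float32(cy)]]))
        print("predicted position:", float(prediction[0, 0]), float(prediction[1, 0]))
    cap.release()

The slow learning rate lets the background model adapt to gradual lighting changes, while the Kalman filter smooths the measured blob centroid and keeps predicting through short detection gaps.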
Citation Formats
T. Keskinpala, A. B. Koku, and H. Keskinpala, “Color-based object tracking to improve mobile robot navigation,” 2002, Accessed: 00, 2020. [Online]. Available: https://hdl.handle.net/11511/54215.