Human action recognition for mobile robots

Bayram, Ulya
This thesis presents a study of human action recognition with an application to mobile robots. The robot's movements depend on human actions estimated after human detection, tracking, and action recognition, so appropriate algorithms are proposed for each of these tasks. Moving regions are detected using optical flow vectors computed from consecutive frames. People are then detected and tracked; each person is assigned a unique ID, and the coordinates of his/her location are collected at each frame. The extracted action recognition features are separated using the coordinates belonging to each person, so that action recognition is performed for each person independently. Furthermore, to distinguish between action cycles, recognition is performed over short frame sequences. Two feature extraction methods are selected and compared for action recognition: cuboids and tracklets. The results demonstrate the advantages of tracklets over cuboids. A literature survey is then conducted on the classification of tracklet features. The codebook-based method, the most widely used approach in the action recognition literature, is evaluated experimentally, and its weaknesses are shown for classifying features extracted from short frame sequences. Therefore, a novel classification method based on iterative matching is proposed. Various tests show that the proposed method outperforms the codebook-based method in both accuracy and computation time. With the final design of the tracking and action recognition system, the mobile robot, tested on a Pioneer 2 platform, is issued commands: move forward, turn right or left, or raise an alarm to warn the authorities.
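The codebook-based baseline mentioned in the abstract can be illustrated with a toy bag-of-features pipeline: local motion descriptors are quantized against a small learned codebook, and each frame sequence is represented by a normalized histogram of codeword assignments. The sketch below is for illustration only, not the thesis implementation; the random feature data, codebook size, and k-means settings are all hypothetical.

```python
import numpy as np

def build_codebook(features, k, iters=10, seed=0):
    """Learn a toy k-means codebook over local motion features (n x d array)."""
    rng = np.random.default_rng(seed)
    centers = features[rng.choice(len(features), k, replace=False)]
    for _ in range(iters):
        # assign each feature to its nearest codeword
        d = np.linalg.norm(features[:, None] - centers[None], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            pts = features[labels == j]
            if len(pts):
                centers[j] = pts.mean(axis=0)
    return centers

def encode(features, centers):
    """Bag-of-features descriptor: normalized histogram of codeword hits."""
    d = np.linalg.norm(features[:, None] - centers[None], axis=2)
    h = np.bincount(d.argmin(axis=1), minlength=len(centers)).astype(float)
    return h / h.sum()
```

In the thesis's setting, such histograms would be computed over short frame sequences and fed to a classifier; the proposed iterative-matching method is presented as an alternative to this quantization-based scheme.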


Human arm mimicking using visual data
Uskarcı, Algan; Ersak, Aydın; Department of Electrical and Electronics Engineering (2004)
This thesis analyzes the concept of robot mimicking in the field of Human-Machine Interaction (HMI). Gestures are investigated for HMI applications, and preliminary work on mimicking a model joint with markers is presented. Finally, two separate systems are proposed that are capable of detecting a moving human arm in a video sequence and calculating the orientation of the arm. The computed orientation angle is passed to the robot arm in order to realize robot mimicking. The simulations show that it is...
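In the simplest marker-based case, the orientation computation described above reduces to the angle of the line through two tracked image points. A minimal sketch (the marker names and coordinates are assumed for illustration, not taken from the thesis):

```python
import math

def arm_orientation(shoulder, wrist):
    """Angle (degrees) of the shoulder-to-wrist segment in the image plane."""
    dx = wrist[0] - shoulder[0]
    dy = wrist[1] - shoulder[1]
    return math.degrees(math.atan2(dy, dx))
```

For example, an arm segment running diagonally up and to the right from the shoulder marker yields an angle of about 45 degrees.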
Reshaping human intentions by autonomous sociable robot moves through intention transients generated by elastic networks considering human emotions
Görür, Orhan Can; Erkmen, Aydan Müşerref; Department of Electrical and Electronics Engineering (2014)
This thesis focuses on reshaping a previously detected human intention into a desired one using contextual motions of mobile robots, which in our applications are autonomous mobile 2-steps and a chair. Our system first estimates the current intention from the human trajectory, represented as location, and detects the person's body-mood from proxemics behaviors. Our previous reshaping applications have shown that the current human intention has to be deviated towards the new desired one in phases according...
Robot planning based on learned affordances
Çakmak, Maya; Şahin, Erol; Department of Computer Engineering (2007)
This thesis studies how an autonomous robot can learn affordances from its interactions with the environment and use these affordances in planning. It is based on a new formalization of the concept, which proposes that affordances are relations that pertain to the interactions of an agent with its environment. The robot interacts with environments containing different objects by executing its atomic actions and learns the different effects it can create, as well as the invariants of the environments that aff...
Steering self-organized robot flocks through externally guided individuals
Celikkanat, Hande; Şahin, Erol (Springer Science and Business Media LLC, 2010-09-01)
In this paper, we study how a self-organized mobile robot flock can be steered toward a desired direction through externally guiding some of its members. Specifically, we propose a behavior by extending a previously developed flocking behavior to steer self-organized flocks in both physical and simulated mobile robots. We quantitatively measure the performance of the proposed behavior under different parameter settings using three metrics, namely, (1) the mutual information metric, adopted from Information ...
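The first metric named above, mutual information, can be estimated empirically by discretizing two heading time series and comparing their joint and marginal distributions. The sketch below assumes simple histogram binning; the bin count and input data are illustrative, and the paper's exact estimator may differ.

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """Empirical mutual information (bits) between two discretized series."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal distribution of x
    py = pxy.sum(axis=0, keepdims=True)   # marginal distribution of y
    nz = pxy > 0                          # skip empty cells to avoid log(0)
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())
```

Intuitively, a flock member whose heading perfectly tracks a guided individual's heading shares maximal mutual information with it, while a member moving independently shares none.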
Software development for man-machine interface for an industrial robot
Cengiz, Mahir Cihan; Kaftanoğlu, Bilgin; Department of Mechanical Engineering (2003)
In this study, robotic software that controls the robot is developed. The robot considered is a six-degree-of-freedom robot designed and manufactured at METU. The user can send the robot anywhere within its workspace, in any orientation. Forward and inverse kinematics can be executed according to the needs. A simulation framework is embedded into the software for 3D visualisation of the robot, so any movement can be simulated on the screen. The software also generates the path for the given ...
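The forward and inverse kinematics mentioned above can be illustrated on a planar two-link arm, a deliberate simplification: the METU robot has six degrees of freedom, and its link parameters are not given here. The link lengths and the choice of elbow configuration below are assumptions for the sketch.

```python
import math

def fk_2link(l1, l2, t1, t2):
    """Forward kinematics: joint angles (rad) -> end-effector (x, y)."""
    x = l1 * math.cos(t1) + l2 * math.cos(t1 + t2)
    y = l1 * math.sin(t1) + l2 * math.sin(t1 + t2)
    return x, y

def ik_2link(l1, l2, x, y):
    """Closed-form inverse kinematics for a reachable target
    (returns one of the two elbow configurations)."""
    c2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    t2 = math.acos(max(-1.0, min(1.0, c2)))  # clamp for numerical safety
    t1 = math.atan2(y, x) - math.atan2(l2 * math.sin(t2),
                                       l1 + l2 * math.cos(t2))
    return t1, t2
```

Round-tripping a target through `ik_2link` and back through `fk_2link` recovers the original position, which is a convenient self-check for either routine.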
Citation Formats
U. Bayram, “Human action recognition for mobile robots,” M.S. - Master of Science, Middle East Technical University, 2013.