Toward perception-based navigation using EgoSphere

2001-10-30
Kawamura, K.
Peters II, R.A.
Wilkes, D.M.
Koku, Ahmet Buğra
Sekmen, A.
A method for perception-based egocentric navigation of mobile robots is described. Each robot has a local short-term memory structure called the Sensory EgoSphere (SES), which is indexed by azimuth, elevation, and time. Directional sensory processing modules write information on the SES at the location corresponding to the source direction. Each robot has a partial map of its operational area that it has received a priori. The map is populated with landmarks and is not necessarily metrically accurate. Each robot is given a goal location and a route plan. The route plan is a set of via-points that are not used directly. Instead, a robot uses each point to construct a Landmark EgoSphere (LES), a circular projection of the landmarks from the map onto an EgoSphere centered at the via-point. Under normal circumstances, the LES will be mostly unaffected by slight variations in the via-point location. Thus, the route plan is transformed into a set of via-regions, each described by an LES. A robot navigates by comparing the next LES in its route plan to the current contents of its SES. It heads toward the indicated landmarks until its SES matches the LES sufficiently to indicate that the robot is near the suggested via-point. The proposed method is particularly useful for enabling the exchange of robust route information between robots under low data-rate communication constraints. An example of such an exchange is given.
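
As an illustration of the data structures involved, the sketch below shows one way an LES could be built from a 2-D landmark map and compared against the current SES. It is a minimal Python sketch under simplifying assumptions (azimuth-only entries, bearings expressed in a common frame, a fixed angular tolerance for matching); the names EgoSphereEntry, landmark_egosphere, and match_score are illustrative and not taken from the paper.

import math
from dataclasses import dataclass

@dataclass
class EgoSphereEntry:
    # One directional entry: an observation (SES) or a prediction (LES) of a landmark.
    landmark_id: str
    azimuth_deg: float        # bearing from the robot, in degrees
    elevation_deg: float = 0.0
    timestamp: float = 0.0    # SES entries carry a time stamp; LES entries need none

def landmark_egosphere(via_point, landmarks):
    # Project the landmarks of a 2-D map onto an egosphere centered at the via-point (the LES).
    vx, vy = via_point
    return [EgoSphereEntry(lid, math.degrees(math.atan2(ly - vy, lx - vx)) % 360.0)
            for lid, (lx, ly) in landmarks.items()]

def match_score(ses, les, tol_deg=20.0):
    # Fraction of LES landmarks that the SES currently reports near the predicted bearing.
    ses_by_id = {e.landmark_id: e for e in ses}
    hits = 0
    for pred in les:
        obs = ses_by_id.get(pred.landmark_id)
        if obs is None:
            continue
        diff = abs((obs.azimuth_deg - pred.azimuth_deg + 180.0) % 360.0 - 180.0)
        if diff <= tol_deg:
            hits += 1
    return hits / len(les) if les else 0.0

Under these assumptions, a robot would head toward landmarks it recognizes from the next LES and treat the via-region as reached once match_score exceeds a chosen threshold (say 0.8), at which point it switches to the following LES in the route plan.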

Suggestions

Toward egocentric navigation
Kawamura, Kazuhiko; Koku, Ahmet Buğra; Wilkes, Mitchell; Peters, Richard Alan; Sekmen, Ali (ACTA Press, 2002-11-01)
A method for egocentric navigation of mobile robots is described. Each robot has a local short-term memory structure called the Sensory Egosphere (SES), which is indexed by azimuth, elevation, and time. Directional sensory processing modules write information on the SES at the location corresponding to the source direction. Each robot has a partial map of its operational area that it has received a priori. The map is populated with landmarks and is not necessarily metrically accurate. Each robot is given a ...
A behavior based robot control system using neuro-fuzzy approach
Öğüt, Demet; Alpaslan, Ferda Nur; Department of Computer Engineering (2003)
In autonomous navigation of mobile robots, the dynamic environment is a source of problems. Because it is not possible to model all possible conditions, the key point in robot control is to design a system that is adaptable to different conditions and robust in dynamic environments. This study presents a reactive control system for a Khepera robot with the ability to navigate in a dynamic environment for reaching goal objects. The main motivation of this research is to design a robot control, which i...
Towards learning affordances: detection of relevant features and characteristics for reachability
Eren, Selda; Şahin, Erol; Department of Information Systems (2006)
In this thesis, we reviewed the affordance concept for autonomous robot control and proposed that invariant features of objects that support a specific affordance can be learned. We used a physics-based robot simulator to study the reachability affordance on the simulated KURT3D robot model. We proposed that, through training, the values of each feature can be split into strips, which can then be used to detect the relevant features and their characteristics. Our analysis showed that it is possible to achie...
Knowledge-sharing techniques for Egocentric Navigation
Keskinpala, Türker; Koku, Ahmet Buğra; Wilkes, D. Mitch; Kawamura, Kazuhiko (2003-10-08)
Teams of humans and robots working together can provide effective solutions to problems. In such applications, effective human-robot teaming relies on being able to communicate information about the current perception or understanding of the environment. In this paper, human-robot teaming on navigational tasks is discussed. The role of the human user will be to specify the goal point(s) for the robot, and also to interact with the robot in the event of perceptual errors. A novel navigation method, Egocentric...
Human action recognition for mobile robots
Bayram, Ulya; Ulusoy, İlkay; Department of Electrical and Electronics Engineering (2013)
In this thesis, a study of human action recognition with an application on mobile robots is conducted. Movements of the robot depend on human actions estimated after human detection, tracking and action recognition. Therefore, the use of appropriate algorithms for these tasks is proposed. Moving regions are detected by optical flow vectors determined from consecutive frames. People are detected by human recognition and tracking. Each person is assigned a different id and the coordinates of his/her location ...
Citation Formats
K. Kawamura, R. A. Peters II, D. M. Wilkes, A. B. Koku, and A. Sekmen, “Toward perception-based navigation using EgoSphere,” 2001, vol. 4573, Accessed: 00, 2020. [Online]. Available: https://hdl.handle.net/11511/35515.