Learning affordances for categorizing objects and their properties
Date
2010-11-18
Author
Dağ, Nilgün
Atıl, İlkay
Kalkan, Sinan
Şahin, Erol
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
In this paper, we demonstrate that simple interactions with objects in the environment lead to a manifestation of the perceptual properties of objects. This is achieved by deriving a condensed representation of the effects of actions (called effect prototypes in the paper) and investigating the relevance between the perceptual features extracted from the objects and the actions that can be applied to them. With this at hand, we show that the agent can categorize (i.e., partition) its raw sensory perceptual feature vector, extracted from the environment, which is an important step for the development of concepts and language. Moreover, after learning how to predict the effect prototypes of objects, the agent can categorize objects based on the predicted effects of the actions that can be applied to them.
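The abstract outlines a pipeline of condensing action effects into prototypes and then predicting those prototypes from object features. The following Python sketch illustrates one plausible reading of that pipeline, assuming effect prototypes are obtained by clustering effect vectors and prediction is done with a support vector machine (SVMs appear in the subject keywords); the data, names, and clustering choice here are illustrative assumptions, not the paper's actual implementation.

# Hypothetical sketch of the affordance-learning pipeline described in the abstract.
# The synthetic data and the choice of k-means clustering are assumptions for illustration.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic stand-in data: initial perceptual feature vectors of objects,
# and the change in those features after one action is applied (the "effect").
n_objects, n_features = 200, 8
initial_features = rng.normal(size=(n_objects, n_features))
effects = initial_features @ rng.normal(size=(n_features, n_features)) \
          + 0.1 * rng.normal(size=(n_objects, n_features))

# 1. Condense the observed effects into a few "effect prototypes" by clustering.
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(effects)
prototype_labels = kmeans.labels_            # which prototype each observed effect matches
effect_prototypes = kmeans.cluster_centers_  # condensed representation of action effects

# 2. Learn to predict, from an object's initial features alone,
#    which effect prototype the action would produce on it.
predictor = SVC(kernel="rbf").fit(initial_features, prototype_labels)

# 3. Categorize a novel object by the predicted effect of the action on it.
novel_object = rng.normal(size=(1, n_features))
predicted_prototype = predictor.predict(novel_object)[0]
print("Predicted effect prototype:", predicted_prototype)
print("Prototype centre:", effect_prototypes[predicted_prototype].round(2))

Objects that are predicted to yield the same effect prototype fall into the same category, which is the effect-based categorization the abstract refers to.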
Subject Keywords
Feature extraction, Prototypes, Shape, Robots, Support vector machines, Clocks, Visualization
URI
https://hdl.handle.net/11511/42397
DOI
https://doi.org/10.1109/icpr.2010.1146
Collections
Department of Computer Engineering, Conference / Seminar