VISUAL PERCEPTION OF HUMAN AND ROBOT MOTION WITHIN THE FRAMEWORK OF PREDICTIVE PROCESSING

2025-8-18
Özen, Gizem
The main aim of the present thesis is to explore how long-term and short-term prior knowledge interact in shaping the visual perception of form and motion. Across three experiments, behavioral and electrophysiological methods were combined to examine how congruency between visual form and motion modulates perception. The first experiment was a motion norming study in which four mechanical motion types, produced by disrupting biological motion-capture data, were presented to participants. Alongside motion type, appearance and action type were included as independent variables in the measurement of perceived mechanicalness. The results showed that all three variables affected perceived mechanicalness, and the motion type rated most mechanical was used as the main motion stimulus in the second and third experiments. These experiments featured two animated characters, a human and a robot, each moving either expectedly or unexpectedly with respect to our life-long experience (the long-term prior). We embedded these stimuli in a probabilistic-cueing paradigm (75% cue validity) in which two types of short-term priors (cues), one visual and one kinematic, preceded animations of the human or robot character (the target). Participants performed a motion identification task, deciding between biological and mechanical motion. The target videos were presented at different noise levels to degrade the clarity of the visual stimuli; by weakening the reliability of the long-term prior, this encouraged participants to rely on short-term priors in the task (de Lange et al., 2018). In the second experiment, we measured reaction time and accuracy to probe the relationship between short-term priors and long-term knowledge. The behavioral results showed that when target videos complied with long-term knowledge, short-term priors were not utilized.
When participants watched a human agent moving biologically or a robot agent moving mechanically, the preceding cue (human or robot) made no significant difference to the measurements. This suggests that short-term priors play little role when the target itself is expected; instead, long-term prior knowledge is used. When target videos were unexpected, however, short-term priors came into play. When participants were presented with a human moving mechanically, they responded faster and more accurately if the preceding cue was a robot than if it was a human. This indicates the use of a short-term kinematic prior when the target was incongruent, which in turn implies that the kinematic short-term prior overrode the visual short-term prior. In the third experiment, we sought the neural correlates of the overriding effects found in the behavioral study by measuring alpha- and beta-band oscillations in the Action Observation Network (AON) with EEG. The electrophysiological results showed that both alpha- and beta-band activity were sensitive to the congruency between form and motion; however, neither was sensitive to prior type.
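To illustrate the probabilistic-cueing design described above, the following sketch generates a cue-target trial list in which the cue predicts the target agent with 75% validity. This is a minimal, hypothetical reconstruction for illustration only (the function and variable names are assumptions, not the thesis code), and it omits noise levels and the visual/kinematic cue distinction.

```python
import random

def make_trials(n_trials=80, validity=0.75, seed=0):
    """Build a cue-target trial list with probabilistic cue validity.

    On a 'valid' trial the target agent matches the cue; on an
    'invalid' trial the target is the other agent. With validity=0.75,
    about three quarters of trials are valid.
    """
    rng = random.Random(seed)
    agents = ["human", "robot"]
    trials = []
    for _ in range(n_trials):
        cue = rng.choice(agents)                 # short-term prior
        valid = rng.random() < validity          # 75% of trials congruent
        target = cue if valid else [a for a in agents if a != cue][0]
        trials.append({"cue": cue, "target": target, "valid": valid})
    return trials

trials = make_trials()
prop_valid = sum(t["valid"] for t in trials) / len(trials)
```

In such a design, the target on each trial would then be rendered as the chosen agent moving either biologically or mechanically, with the noise level manipulated independently.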
Citation Formats
G. Özen, “VISUAL PERCEPTION OF HUMAN AND ROBOT MOTION WITHIN THE FRAMEWORK OF PREDICTIVE PROCESSING,” Ph.D. - Doctoral Program, Middle East Technical University, 2025.