Perceived animacy from global and local image distortions

2025-9-11
Türkmen, Yunus Emre
Animacy perception is the ability to classify visual stimuli as animate or inanimate. The effects of global shape structure and mid-level shape cues, such as symmetry and curvilinearity, on perceived animacy are well studied; however, the relationship between local image distortions and animacy perception remains underexplored in the literature. In Experiment 1, I morphed animate-inanimate pairs and asked participants to classify the morphed stimuli as animate or inanimate within 150 milliseconds, probing the impact of overall global shape cues on perceived animacy. Although previous work has shown that local distortions can create the impression of transparent layers (Dövencioğlu et al., 2018), the effects of these distortions on animacy perception are unknown. In Experiment 2, I investigated how local image distortions affect animacy perception using eidolons: participants adjusted reach and grain parameters, which control the local disarray of 20 images, to create equivalent appearance classes (eidolons). Parameter settings for the three modes ("underwater," "behind glass," and "animate") differed significantly from one another, and as curvature increased, participants chose lower values for both parameters. In Experiment 3, a 2AFC experiment, participants judged the perceived animacy of symmetrical insect-like stimuli in their fiducial forms and in nine equivalence classes. Again, perceived animacy increased with global curvature, whereas local curvature manipulations yielded idiosyncratic results. These findings confirm that local image distortions can create transparent percepts, and they further link both global and local shape cues to perceived animacy, suggesting that both contribute to discriminating animate from inanimate stimuli.
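The eidolon manipulation described above can be sketched as warping an image with a smoothed random displacement (disarray) field, where reach scales the displacement amplitude and grain sets its spatial scale. The sketch below is a minimal illustration of that idea; the function name and the RMS normalization are my own assumptions, not the thesis's stimuli pipeline or the Eidolon Factory's API.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, map_coordinates

def eidolon_distort(image, reach, grain, seed=0):
    """Warp a grayscale image with a smoothed random displacement field.

    reach: displacement amplitude (RMS, in pixels).
    grain: spatial scale of the disarray (std dev of Gaussian smoothing).
    """
    rng = np.random.default_rng(seed)
    h, w = image.shape
    # Independent random displacement fields for rows and columns,
    # low-pass filtered so nearby pixels move coherently.
    dy = gaussian_filter(rng.standard_normal((h, w)), grain)
    dx = gaussian_filter(rng.standard_normal((h, w)), grain)
    # Normalize so that 'reach' equals the RMS displacement in pixels.
    dy *= reach / (np.sqrt((dy ** 2).mean()) + 1e-12)
    dx *= reach / (np.sqrt((dx ** 2).mean()) + 1e-12)
    rows, cols = np.mgrid[0:h, 0:w].astype(float)
    # Resample the image at the displaced coordinates (bilinear).
    return map_coordinates(image, [rows + dy, cols + dx],
                           order=1, mode="nearest")
```

With reach = 0 the image is returned unchanged; increasing grain at a fixed reach shifts the disarray from fine, jittery displacements toward smooth, wave-like ones, loosely matching the "behind glass" versus "underwater" appearance modes mentioned above.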
Citation Formats
Y. E. Türkmen, “Perceived animacy from global and local image distortions,” M.S. - Master of Science, Middle East Technical University, 2025.