
New research sheds light on how human vision perceives scale


Researchers from Aston University and the University of York have uncovered new insights into how the human brain makes perceptual judgments about the outside world.

The study, published May 8 in the journal PLOS ONE, explored the computational mechanisms used by the human brain to perceive the size of objects in the world around us.

The research, led by Professor Tim Meese, from Aston University’s School of Optometry, and Dr Daniel Baker, from the University of York’s Department of Psychology, tells us more about how our visual system can harness defocus “blur” to infer perceptual scale, and shows that it does so only crudely.

It is well known that in order to derive the size of an object from the size of its retinal image, our visual system needs to estimate the distance to that object. The retinal image contains many pictorial cues, such as linear perspective, that help the system derive the relative sizes of objects. However, to derive absolute size, the system needs to know the spatial scale of the scene.
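For concreteness, the geometry behind this is the textbook size-distance relation (stated here as background; the notation is not taken from the study itself):

$$ S \approx \theta \, d $$

where $S$ is the object's physical size, $\theta$ is its retinal (angular) size in radians, and $d$ is the viewing distance, under the small-angle approximation. The same retinal angle $\theta$ is consistent with a small, near object or a large, distant one, so without an estimate of $d$ the absolute size $S$ is undetermined.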

By accounting for defocus blur, such as the blurry parts of an image that fall outside a camera’s depth of field, the visual system can accomplish this. The mathematics behind this has been well worked out by others, but the study asked a different question: does human vision actually exploit that mathematics?
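One standard piece of that mathematics is the thin-lens blur-circle equation (quoted here as general optics background, not as the paper's own derivation):

$$ c = A \cdot \frac{f}{d_f - f} \cdot \frac{\lvert d - d_f \rvert}{d} $$

where $c$ is the diameter of the blur circle on the image, $A$ the aperture diameter, $f$ the focal length, $d_f$ the focused distance, and $d$ the object's distance. Because $c$ depends on absolute distances rather than just their ratios, the pattern of defocus blur across a scene carries information about its true physical scale.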

The research team presented participants with pairs of photographs: large-scale railway scenes subjected to various artificial blur treatments, and small-scale models of railway scenes photographed with a long exposure and a small aperture to minimise defocus blur. The task was to identify which photograph in each pair showed the real large-scale scene.

When the artificial blur was oriented correctly with respect to the ground plane (the horizontal plane representing the ground the viewer is standing on) in the large-scale scenes, participants were misled into believing that the small-scale models were the real large-scale scenes. Surprisingly, this did not require realistic blur gradients: simple uniform bands of blur at the top and bottom of the photographs achieved almost equivalent miniaturization effects.
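As an illustration of how simple that manipulation is, here is a minimal Python sketch (using the Pillow imaging library; the band height and blur radius are illustrative assumptions, not the study's stimulus parameters) that blurs uniform horizontal bands at the top and bottom of a photograph:

```python
# A minimal sketch of the "uniform blur bands" manipulation described
# above. Not the authors' stimulus code; band_frac and radius are
# illustrative assumptions.
from PIL import Image, ImageFilter

def add_blur_bands(path, band_frac=0.3, radius=6):
    """Blur uniform bands at the top and bottom of an image, leaving a
    sharp central strip: a crude tilt-shift-style miniaturization."""
    img = Image.open(path)
    w, h = img.size
    band = int(h * band_frac)
    blurred = img.filter(ImageFilter.GaussianBlur(radius))
    out = img.copy()
    # Paste the blurred top and bottom bands over the sharp original.
    out.paste(blurred.crop((0, 0, w, band)), (0, 0))
    out.paste(blurred.crop((0, h - band, w, h)), (0, h - band))
    return out

# Example: add_blur_bands("railway.jpg").save("railway_miniature.jpg")
```

Applied to a photograph of a real, large-scale scene, this kind of crude banding tends to produce the familiar "tilt-shift" miniaturization effect that the study probed.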

Tim Meese, professor of vision sciences at Aston University, said: “Our results indicate that human vision can exploit defocus blur to infer perceptual scale, but it does so in a crude way, more of a heuristic than a metric analysis. Overall, our findings provide new insights into the computational mechanisms used by the human brain in perceptual judgments about the relationship between ourselves and the outside world.”

Daniel Baker, Senior Lecturer in Psychology at the University of York, said: “These findings demonstrate that our perception of size is not perfect and can be influenced by other properties of a scene. They also highlight the remarkable adaptability of the visual system. This could have relevance to understanding the computational principles that underlie our perception of the world, for example, in judging the size and distance of driving hazards.”

