Should you believe your eyes? Not necessarily in virtual reality, says new study

A recent study by Western neuroscientists suggests that, unlike in the real world, perception in virtual reality is more strongly influenced by our expectations than by the visual information before our eyes.

The researchers point to the challenge of online shopping, where customers sometimes misjudge the size of a product based on their expectations, discovering, for example, that a sweater purchased online is indeed lovely but sized for a doll, not an adult.

This happens in part because the physical cues to size that are present when seeing an item in a store are typically eliminated when viewing photos online. Without seeing the physical object, customers base their expectations of familiar size on prior experience. Since most sweaters are sized for people, not dolls, the visual system assumes that an unfamiliar sweater is, too.

The advent of virtual reality offers new opportunities for applications like online shopping, and also for research on visual perception. But the researchers wanted to understand whether users of virtual reality perceive size as accurately as they do in the real world.


A research team, led by Canada Research Chair in Immersive Neuroscience Jody Culham, presented study participants with a variety of familiar objects like dice and sports balls in virtual reality and asked them to estimate the objects' sizes. The trick? Objects were presented not only at their typical 'familiar' sizes, but also at unusual sizes (e.g., die-sized Rubik's cubes).

The researchers found that participants consistently perceived the virtual objects at the size they expected, rather than the actual presented size. This effect was much stronger in virtual reality than for real objects.


"While virtual reality is a useful research tool with many real-world applications, we cannot assume it is always an accurate proxy for the real world," said Culham, a psychology professor and senior author on the study. "It is promising to see advances in virtual reality and its applications, but there is still a lot we don't understand about how we process information in virtual environments. If we need to rely heavily on past experiences to judge the size of objects in virtual reality, this suggests other visual cues to size may be less reliable than in the real world."

Yet, the results of this study also have some promising implications.


"If we know that familiar objects can serve as strong size cues in virtual reality, we can use this information to our advantage," said Anna Rzepka, a former student in the Culham Lab and co-first author on the study. "Think about viewing an item in a scene where accurate size perception is crucial, such as when removing a tumour using image-guided surgery. Adding other familiar objects to the virtual scene could improve perception of the tumour's size and location, leading to better outcomes."

The findings were published in Philosophical Transactions of the Royal Society B, the world's oldest scientific journal in continuous publication. The paper is part of a special issue based on a Royal Society workshop on 'New Approaches to 3D Vision' that included discussions about the promises and limitations of virtual reality.

This research was funded by a Discovery Grant from the Natural Sciences and Engineering Research Council (NSERC) of Canada.
