Synthesized Avatars Enable Virtual Walks in 360-Degree Video

Toyohashi University of Technology (TUT)

Overview:

Researchers at Toyohashi University of Technology and the University of Tokyo have developed a system that provides a virtual walking experience to a seated person by synthesizing, in real time, a walking avatar and its shadow onto a 360-degree video while delivering vibrations to the feet. The avatar's shadow induces an illusory presence of the user's own body. In the future, the system is expected to provide an immersive, embodied experience for any recorded medium.

Details:

Walking is a fundamental human activity and an important form of exercise central to daily life. A research team from Toyohashi University of Technology and the University of Tokyo is developing a system that provides a virtual walking experience to a seated person. The system aims to deliver the walking experience in virtual environments built from 3DCG spaces and 360-degree live-action video.

In virtual environments, bodily information, or information that supports embodiment, is an important factor in enhancing the experience. In this study, the walking experience was enhanced by adding bodily information about the user, which is not normally included in a 360-degree video experience, and integrating it with the 360-degree video. This bodily information comprises a walking avatar (virtual human), the shadow cast by light falling on the avatar, and the vibrations produced by the feet during walking. The walking experience can be acquired through the avatar and its shadow synthesized into the 360-degree video, and the use of long shadows enhances the sense of leg action and telepresence during walking. The findings of this study were published in i-Perception on February 22, 2024.

Development Background:

Various walking devices have been developed for virtual reality and the metaverse. However, most of them involve actual limb movements, and such devices are large, complex, expensive, and not intended for home use. They also depend heavily on the user's physical abilities. A key feature of this study is that a seated person can experience walking without moving their legs. The system is also compact and inexpensive, comprising a commercially available head-mounted display (HMD) and four vibrators attached to the feet.

The system is based on an 8K-resolution 360-degree video recorded while walking, which is experienced through an HMD. The avatar composited into the 360-degree video is synchronized with the user's head movements, so that when the user turns to the right, the avatar also turns to the right. The avatar's movements are converted and rendered in real time into the 360-degree video format. The avatar is observed from a first-person perspective in which the user's viewpoint coincides with that of the avatar, which is always located at the center of the 360-degree video. The avatar's shadow is rendered on the ground directly ahead in the direction of travel, so it remains visible even within the limited field of view of the HMD.
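As a rough illustration of this head-synchronized compositing, the following Python sketch mirrors the HMD head yaw onto a first-person avatar each frame and places its cast shadow on the ground ahead along the direction of travel. It is a minimal sketch under stated assumptions, not the authors' implementation; all names (MockHMD, Avatar, SHADOW_DISTANCE_M) and numeric values are hypothetical.

    import math
    import time

    SHADOW_DISTANCE_M = 1.5   # assumed distance of the cast shadow ahead of the avatar
    FRAME_DT = 1.0 / 90.0     # assumed 90 Hz HMD refresh rate

    class MockHMD:
        """Stand-in for an HMD SDK; returns the head yaw in radians."""
        def __init__(self):
            self._t = 0.0
        def get_yaw(self) -> float:
            self._t += FRAME_DT
            return 0.3 * math.sin(self._t)  # simulated gentle head turning

    class Avatar:
        """First-person avatar fixed at the center of the 360-degree video."""
        def __init__(self):
            self.yaw = 0.0                  # heading, radians
            self.shadow_xy = (0.0, 0.0)     # shadow position on the ground plane
        def update(self, head_yaw: float, travel_yaw: float) -> None:
            # Mirror the user's head rotation onto the avatar.
            self.yaw = head_yaw
            # Place the shadow directly ahead along the travel direction so it
            # stays visible inside the HMD's limited field of view.
            self.shadow_xy = (SHADOW_DISTANCE_M * math.cos(travel_yaw),
                              SHADOW_DISTANCE_M * math.sin(travel_yaw))

    if __name__ == "__main__":
        hmd, avatar = MockHMD(), Avatar()
        for _ in range(5):                  # a few frames of the render loop
            avatar.update(hmd.get_yaw(), travel_yaw=0.0)
            print(f"avatar yaw={avatar.yaw:+.2f} rad, shadow at {avatar.shadow_xy}")
            time.sleep(FRAME_DT)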

When the avatar walks and its feet land on the ground, vibrations are delivered to the user's feet (heel and forefoot). Presenting the avatar and its motion together with synchronized foot vibrations produces a powerful walking sensation.
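One way such synchronization could be driven is sketched below: each foot's gait phase is derived from a fixed step period, and the heel and forefoot vibrators of that foot are pulsed when the corresponding ground contact would occur. This is an assumption for illustration, not the authors' code; the step period, contact timings, and names are hypothetical.

    STEP_PERIOD_S = 0.6        # assumed duration of one step cycle per foot
    FOREFOOT_PHASE = 0.15      # assumed phase at which the forefoot lands

    def gait_phase(t: float, foot: str) -> float:
        """Phase in [0, 1) of one foot's step cycle; the two feet are half a cycle apart."""
        offset = 0.0 if foot == "left" else 0.5
        return (t / STEP_PERIOD_S + offset) % 1.0

    def contacts(p_prev: float, p_now: float) -> list:
        """Which parts of a foot hit the ground between two consecutive phase samples."""
        hits = []
        wrapped = p_now < p_prev
        if wrapped:                                   # phase passed 0.0: heel strike
            hits.append("heel")
        if (p_prev < FOREFOOT_PHASE <= p_now) or (wrapped and p_now >= FOREFOOT_PHASE):
            hits.append("forefoot")                   # forefoot lands shortly after
        return hits

    if __name__ == "__main__":
        dt, t = 1.0 / 90.0, 0.0                       # assumed 90 Hz update loop
        prev = {foot: gait_phase(t, foot) for foot in ("left", "right")}
        for _ in range(120):                          # roughly 1.3 s of simulated walking
            t += dt
            for foot in ("left", "right"):
                now = gait_phase(t, foot)
                for part in contacts(prev[foot], now):
                    print(f"t={t:.2f} s: pulse {foot} {part} vibrator")
                prev[foot] = now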

Future Outlook:

Further development of a system that enables people to experience walking without moving their limbs is expected to let them enjoy, regardless of physical limitations, the many experiences that begin with walking, and to improve quality of life. In addition, by supporting formats that record space, such as 360-degree video, the system is expected to be usable in many situations and to reach a wider range of people. If the device becomes more compact, it will become possible to take virtual walks and trips from home.

Reference:

Nakamura, J., Ikei, Y., and Kitazaki, M. (2024). Effects of self-avatar cast shadow and foot vibration on telepresence, virtual walking experience, and cybersickness from omnidirectional movie. i-Perception, 15(1). https://doi.org/10.1177/20416695241227857

Acknowledgements:

This research was supported in part by JST ERATO (JPMJER1701) for MK, JSPS KAKENHI JP22J21664 for JN, JP18H04118 for YI, and JP23H03882 for MK.
