Knowledge exchange project explores virtual reality in museums

University College London

Virtual Reality (VR) in museums is the focus of a newly funded knowledge exchange project that brings together UCL Culture, UCL's Bartlett Centre for Advanced Spatial Analysis (CASA) and AI technology start-up Kagenova, which was founded by UCL scientists.

[Image: a man wearing a VR headset in a museum, with sculptures behind him]

The project will enable these highly specialised teams to share ideas, experience and skills, supporting VR innovation in the museum sector.

Kagenova's technology grew out of UCL astrophysicists' exploration of the unknown horizons of the night sky. It aims to deliver photo-realism and interactivity in VR simultaneously and at scale. These new immersive technologies are powered by Kagenova's geometric AI techniques, which are tailored specifically to VR.

Professor of Astrostatistics and Astroinformatics Jason McEwen (UCL Space and Climate Physics), who is also founder and CEO of Kagenova, said: "We're excited by this knowledge exchange partnership to help explore how technology could be used to support cultural exhibitions and make them more accessible."

The collaboration will explore the ways in which virtual exhibitions captured in this way can enhance the student learning experience (blended and online). This will benefit both UCL and the wider higher education community, as well as enabling the general public to engage with collections in new ways, wherever they are in the world.

The findings from this project will be particularly beneficial for smaller museums, which often lack the budget or infrastructure to compete with national museums in developing new digital products and services.

The collaboration with Kagenova enables UCL Culture to experiment with next-generation 360° VR technology. UCL's museums have been pioneering collections and exhibitions-led interdisciplinary research and inquiry for over a decade.

Dr Nina Pearlman (UCL Art Collections), who has been the strategic lead for this project, said: "Our exhibitions are integrated into the learning experience in different ways and we want to bring this pedagogical know-how into conversations with technology in a way that can bridge students' onsite and remote learning experiences, as well as connect with our remote audiences.

"Collaboration supported by knowledge exchange funding is key to being able to identify novel ideas that can help us design better solutions, services and experiences to meet our current challenges."

Researchers from UCL CASA have been exploring virtual environments for many years with a focus on technological innovation and accessibility, nurturing the next generation of developers and technology innovators.

Professor of Digital Urban Systems Andy Hudson-Smith (UCL CASA) said: "The collaboration brings the ability to explore the edges of the technology, from multisite portals through to 3D objects embedded in the panoramic scenes and integration into emerging metaverses. All with a direct loop back into teaching and research at UCL".

Building on UCL's strength in innovation, the learnings from this project will be disseminated across cultural and technology networks, the education sector and the media. Without this partnership, these insights would be logistically impossible or prohibitively expensive for any one organisation to achieve alone.

This knowledge exchange project is supported by UCL's Higher Education Innovation Fund (HEIF), managed by UCL Innovation & Enterprise. The partnership between UCL Culture, UCL CASA and start-up Kagenova has been facilitated by the Business and Innovation Partnerships team within UCL Innovation & Enterprise.
