Many engineering concepts are complex and abstract, which makes them difficult to grasp. Virtual Reality (VR) technology can drive a paradigm shift in education by providing high-quality, inclusive, and sustainable tools built around real-life examples. The project vScientist will develop a comprehensive VR platform that enhances the learning and accessibility of Computational Fluid Dynamics (CFD) for students and individuals from diverse backgrounds, promoting social inclusion and accessibility in STEM fields. In any academic engineering setting, educators and students eventually face the same problem: physics-based simulations take days or weeks to run, and their visualisation and analysis is carried out on a 2D display, often at a single point in time, reducing dynamic and inherently 3D data to a static image.

This project, a collaboration between the National Technical University of Athens (NTUA) and MultiFluidX, will develop a straightforward, semi-automated workflow for enhanced viewing of CFD results and associated data in an immersive virtual environment (IVE). Through this revolutionary platform, users will visualise, interact with, and analyse 3D virtual experiments in an immersive environment, or run their own in seconds using machine learning (ML), without spending a single CPU hour or needing expensive hardware.
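To make the "virtual experiment in seconds" idea concrete, the sketch below shows one common way such ML acceleration can work: a surrogate model is trained offline on a database of precomputed CFD snapshots, and a new case is then predicted almost instantly instead of being simulated. This is a hypothetical, minimal illustration in Python (the inputs, grid, and training data are placeholders, not the project's actual pipeline or models).

```python
# Hypothetical sketch of a CFD surrogate: trained once on precomputed
# simulations, then queried in milliseconds for a new "virtual experiment".
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Placeholder training data standing in for a real CFD database:
# inputs  -> (inlet_velocity, angle_of_attack)
# outputs -> a coarse velocity-magnitude field flattened to a vector.
n_samples, n_grid = 200, 32 * 32
X = rng.uniform([1.0, -10.0], [10.0, 10.0], size=(n_samples, 2))
Y = (np.outer(X[:, 0], np.linspace(0.0, 1.0, n_grid))
     + 0.05 * X[:, 1][:, None]
     + 0.01 * rng.standard_normal((n_samples, n_grid)))

# Train the surrogate once, offline (this is the expensive step).
surrogate = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
surrogate.fit(X, Y)

# At "virtual experiment" time, the field for a new case is predicted
# instantly and can be handed to the VR renderer instead of a full solver run.
new_case = np.array([[5.0, 2.5]])
field = surrogate.predict(new_case).reshape(32, 32)
print(field.shape)  # (32, 32) velocity-magnitude field ready for visualisation
```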

The novel platform and its integrated software tools will enable real-time data transfer and dynamic visualisation and, as we have observed in our own practice, markedly enhance the immersive learning experience. The platform will be accessible to individuals with disabilities and will also allow non-expert users to apply the developed tools to create engaging educational modules and activities aligned with STEM curriculum standards and learning objectives. This advancement will foster more efficient collaboration between educators, students, and researchers, promoting social inclusion and accessibility in STEM fields.
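As a rough illustration of what "real-time data transfer and dynamic visualisation" can mean in practice, the sketch below shows one possible architecture: a solver (or surrogate) thread streams time-step frames into a queue while a render loop consumes them as they arrive, so the immersive view updates continuously rather than showing a single static image. This is an assumed, simplified design in Python with placeholder data, not the platform's actual implementation.

```python
# Hypothetical producer/consumer sketch: stream simulation frames to a
# renderer so the immersive view updates dynamically in (near) real time.
import queue
import threading
import time

import numpy as np

frames: "queue.Queue" = queue.Queue(maxsize=4)

def produce_frames(n_steps: int = 10) -> None:
    """Stand-in for a CFD solver or ML surrogate emitting one field per step."""
    for step in range(n_steps):
        field = np.sin(np.linspace(0, np.pi, 64) + 0.2 * step)  # fake 1D field
        frames.put(field)          # block if the renderer falls behind
        time.sleep(0.05)           # simulate per-step compute time
    frames.put(None)               # sentinel: no more frames

def render_loop() -> None:
    """Stand-in for the VR renderer: draw each frame as soon as it arrives."""
    while True:
        frame = frames.get()
        if frame is None:
            break
        print(f"rendering frame, mean value = {frame.mean():.3f}")

producer = threading.Thread(target=produce_frames)
producer.start()
render_loop()
producer.join()
```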

The project vScientist is part of CORTEX2, a unique EU project that lays the groundwork for future extended collaborative telepresence, enabling remote cooperation across numerous industries for both productive work and education and training.

The idea of CORTEX2 is to merge classical video conferencing with extended reality: real assets such as objects, machines, or environments can be digitalised and shared with distant users for teamwork in a continuous real-virtual space.

In Virtual Reality mode, participants can create virtual meeting rooms in which each user is represented by a virtual avatar. Participants can also appear as video-based holograms in the virtual rooms, with an option to anonymise their appearance using an AI-based video appearance generator while keeping their original facial expressions.

The platform lets participants exchange documents, 3D objects, and other assets, and provides an AI-powered meeting assistant with extended capabilities such as natural speech interaction, meeting summarisation, and translation.

CORTEX2 is an eight-million-euro initiative funded by the European Commission through the Horizon Europe research and innovation programme. The consortium comprises 10 organisations across seven countries, working together for 36 months. Read more at cortex2.eu.