EOS1.5 | Immersive Geoscience: Virtual, Augmented and Mixed Reality in Education, Outreach and Research
EDI
Co-organized by ESSI2
Convener: Thomas Heinze | Co-conveners: Alireza Arab, Ilja Kogan, Emma Cieslak-JonesECSECS, Alissa KotowskiECSECS
Posters on site | Attendance Wed, 06 May, 10:45–12:30 (CEST) | Display Wed, 06 May, 08:30–12:30
 
Hall X5
Wed, 10:45
Virtual, augmented, and mixed reality (VR/AR/MR), along with immersive visualization environments and related interactive techniques, are rapidly transforming the way we communicate, teach, explore, and conduct geoscientific research. These technologies allow learners, stakeholders, and researchers to experience complex processes and datasets in new and engaging ways, from exploring outcrops and landscapes virtually to interacting with model simulations in three dimensions. Their potential spans education and outreach, where they enhance accessibility and engagement for diverse audiences, as well as scientific research itself, where immersive environments can provide novel perspectives on data analysis, hypothesis testing, and collaborative work.

This session invites contributions showcasing innovative uses of virtual reality, augmented reality, mixed reality, and other immersive techniques across the geosciences. We particularly welcome examples from education, outreach, citizen science, and communication, as well as case studies and technical developments that demonstrate the added value of immersive approaches for scientific discovery. We also encourage critical perspectives addressing challenges such as accessibility, reproducibility, sustainability, and pedagogy. The session aims to foster exchange between developers and users, between research and education, and across disciplinary boundaries to highlight best practices and future opportunities for immersive technologies in geoscience.

Posters on site: Wed, 6 May, 10:45–12:30 | Hall X5

The posters scheduled for on-site presentation are only visible in the poster hall in Vienna. If authors uploaded their presentation files, these files are linked from the abstracts below.
Display time: Wed, 6 May, 08:30–12:30
Chairpersons: Thomas Heinze, Alireza Arab
X5.258
|
EGU26-2189
Zahrah A. Almusaylim, Rawan Alajmi, Nouf Alsinan, Wafa Alajmi, and Ahad Alnasser

Arabic Earth Now (AEN) is an interactive data visualization platform, initially developed as a localized version of NASA’s Eyes to visualize satellite data and support learning about Earth and space science. AEN has since been extended into a geo-dome globe simulator to support immersive, spatially rich learning experiences. Despite advances in geo-visualization, there remains a critical gap in research on how cultural localization and immersive presentation formats influence geoscience education, particularly among Arab learners. Moreover, the potential of localized platforms to enhance awareness of national scientific contributions remains underexplored. Hence, in this study we examine how interaction with AEN and the simulator influences user engagement and learning outcomes, contributing new insights into the role of culturally relevant data visualization in geoscience education. We examined the impact of AEN at an educational festival event to assess students' engagement with the platform. Students at primary, intermediate, secondary, and university levels were assigned to interact with AEN and the simulator so that its impact could be assessed before and after engagement. The students first experienced the platform and subsequently provided feedback about their experience via an anonymous questionnaire. Students showed a high level of engagement after interacting with AEN and indicated strong motivation and intention to reuse it. Our findings demonstrate that AEN offers a highly engaging educational experience, as evidenced by the collected data. However, the analysis reveals a gap in effectively integrating geo-education with geo-visualization tools to enhance student participation.
These results underscore the critical role of data visualization in enriching educational content and suggest that its strategic implementation can significantly improve both student and educator engagement within geoscience learning environments. Additionally, the study highlights the value of localizing NASA’s Eyes as AEN to serve as an effective tool for geoscience learning.

How to cite: A. Almusaylim, Z., Alajmi, R., Alsinan, N., Alajmi, W., and Alnasser, A.: Evaluating the Impact of a Culturally Localized Geo-Visualization Platform on Geoscience Learning: The Arabic Earth Now Platform Study, EGU General Assembly 2026, Vienna, Austria, 3–8 May 2026, EGU26-2189, https://doi.org/10.5194/egusphere-egu26-2189, 2026.

X5.259
|
EGU26-3909
Thais Siqueira, Juana Vegas, Gonzalo Lozano, Carmen Romero, Ana Cabrera, Rayco Morrero, Nieves Sánchez, Ramón Casillas, Olaya Dorado, David Sanz-Mangas, Lucía Saez-Gabarrón, and Inés Galindo

The volcanic landscapes of the Canary Islands constitute one of the most distinctive geoheritage assemblages in Spain, underpinning the scientific, educational, and touristic values that have contributed to the Spanish Inventory of Geological Sites of Interest (IELIG), multiple natural protected areas, and UNESCO designations. These volcanic features not only embody an extensive record of geological processes but also offer an exceptional basis for sustainable tourism initiatives. In this context, the project ‘Canary Islands: Destination of Volcanoes’ seeks to establish a science-based geotourism product capable of enhancing public engagement while strengthening the conservation and responsible use of natural resources. The project employs a comprehensive methodology structured into nine main activities that integrate fieldwork, analytical procedures, and digital data processing. Building on the 300 geosites identified in the IELIG for the Canarian Archipelago, a specific assessment framework has been designed to select the 50 volcanic environments with the highest scientific, educational, and tourist potential. This process combines standards and requirements of sustainability, conservation status, degradation risk, accessibility, safety, and scenic-scientific values. The selected sites are being documented through the development of digital mapping products, adhering to international standards for spatial data quality and metadata. Complementary tasks include the acquisition of high-resolution drone imagery, photogrammetry, and 3D geological reconstructions that support the creation of virtual and augmented reality models. These digital products will serve to design interpretive scripts, animations, and immersive environments that aim to communicate complex geological processes in an accessible way to the general public. Additional activities address the creation of a unified geotourism brand, development of training programmes for local employment, and support for emerging business initiatives in the blue and green economy. Although the results are still in progress, the project is expected to generate a robust portfolio of scientifically validated and technologically innovative tools that enhance the touristic use and outreach of volcanic heritage. The integration of digital maps, VR/AR applications, and scientific communication through innovation has the potential to diversify the regional geotouristic model, reduce environmental impact, and strengthen long-term conservation strategies. Ultimately, this initiative aspires to position the Canary Islands as an international reference for volcano-based geotourism grounded in science, sustainability, and innovation.
Sub-Project 1 ‘Canary Islands, destiny of Volcanoes’ is funded by PROMOTUR Turismo Canarias, S.A. through Next Generation EU funds, PRTR. 2024krQ00nnn.

How to cite: Siqueira, T., Vegas, J., Lozano, G., Romero, C., Cabrera, A., Morrero, R., Sánchez, N., Casillas, R., Dorado, O., Sanz-Mangas, D., Saez-Gabarrón, L., and Galindo, I.: Transforming volcanic landscapes into knowledge: geoheritage, virtual reality technologies, and geotourism in the Canary Islands, EGU General Assembly 2026, Vienna, Austria, 3–8 May 2026, EGU26-3909, https://doi.org/10.5194/egusphere-egu26-3909, 2026.

X5.260
|
EGU26-3084
|
ECS
Júlia Sánchez-Martínez, Josep Maria-García, Jaume Cusachs, Carlos Marín, Arnau Lagresa, Marta López-Saavedra, Xavier Arnau-Sarabia, Marc Martínez-Sepúlveda, Iris Schneider-Pérez, Mireia Jiménez-Llobet, and Joan Martí

Traditional 2D hazard maps often struggle to convey the complex spatial dynamics of natural hazards, particularly for users who are not accustomed to interpreting cartographic products. This limitation hinders effective risk communication and reduces the ability of local stakeholders to identify exposed areas. To address this challenge, we develop a 3D visualisation framework that transforms model outputs into intuitive, interactive representations aimed at supporting preventive planning and informed decision making.

As an initial implementation, we apply the approach to lava-flow modelling. Using one million Monte Carlo simulations, we estimate the probabilistic envelope of potential lava trajectories and extract a random subset of 10,000 paths to obtain a representative sample of the most recurrent routes. Each trajectory is interpreted as the path of a lava droplet, and its interactive 3D rendering highlights the most likely flow channels of a specific simulation. By integrating infrastructure, municipalities, roads, and buildings directly within the 3D environment, the tool enables non-expert users to visualise potential scenarios with greater clarity and anticipate protective actions.
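The subsampling and "most recurrent route" steps described above can be sketched in a few lines. This is illustrative Python only; the function names and the representation of a trajectory as a sequence of grid cells are assumptions for the sketch, not the authors' implementation:

```python
import random
from collections import Counter

def sample_paths(paths, k, seed=0):
    """Draw a reproducible random subset of simulated lava trajectories."""
    rng = random.Random(seed)
    return rng.sample(paths, min(k, len(paths)))

def channel_frequency(paths):
    """Count how often each grid cell is traversed across the sampled paths;
    high-count cells approximate the most recurrent flow channels."""
    counts = Counter()
    for path in paths:
        counts.update(set(path))  # count each cell at most once per path
    return counts
```

In a workflow of this kind, `sample_paths` would draw the 10,000 trajectories from the million-run Monte Carlo ensemble, and the frequency map would drive which channels are emphasised in the 3D rendering.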

The system is hazard-agnostic and constitutes a core component of a developing multi-risk evaluation platform that will incorporate spatial and temporal analyses, simulation modelling and automated 3D representations for multiple natural hazards. The 3D representation can be extended to other hazard types, offering a general framework to bridge the gap between simulation-based hazard analysis and accessible 3D communication tools.

This study was developed within the project Volcanic disaster risk management for the Canary Islands (Spain), funded by EC ECHO - Union Civil Protection Mechanism (UCPM), ref. 101193100 VOLCAN (2025-2026).

How to cite: Sánchez-Martínez, J., Maria-García, J., Cusachs, J., Marín, C., Lagresa, A., López-Saavedra, M., Arnau-Sarabia, X., Martínez-Sepúlveda, M., Schneider-Pérez, I., Jiménez-Llobet, M., and Martí, J.: Making hazard maps more intuitive: A 3D interactive visualisation framework for representing hazard flows, EGU General Assembly 2026, Vienna, Austria, 3–8 May 2026, EGU26-3084, https://doi.org/10.5194/egusphere-egu26-3084, 2026.

X5.261
|
EGU26-4335
|
ECS
Hiroyuki Yamauchi, Kotaro Iizuka, and Takuro Ogura

To understand disaster damage, ground-based and aerial photographs taken during or after hazard events are commonly used. These images are also valuable for geography education. In particular, university students in a teacher training course are required to understand the characteristics of disasters, as they will be responsible for teaching these topics to their pupils. However, some students have difficulty achieving a sufficient understanding of actual disasters because of differences in scale among photographs taken from airplanes, drones, and ground-level viewpoints. To facilitate students’ understanding of disasters, it is necessary to develop a teaching program and educational materials that can connect geospatial products across multiple spatial scales. In this study, we designed a one-day workshop program integrating GIS and VR technologies. The workshop enabled students to learn about the impacts and damage of the 2024 Noto Peninsula Earthquake while operating two GIS applications and a VR device, allowing them to observe the area from different viewing perspectives. The workshop consisted of four parts: (1) a lecture on basic concepts of GIS and remote sensing, (2) a short lecture summarizing the 2024 Noto Peninsula Earthquake and a WebGIS-based comparison of aerial photographs taken before and after the disaster, (3) visualization of damaged buildings and terrain using QGIS, and (4) five minutes of VR-based fieldwork using a head-mounted display. Each section lasted 90 minutes. The second section was conducted in groups of four students. This workshop was conducted as part of a graduate school course in a teacher training program, with a total of eight students participating. Students’ learning outcomes in each section were assessed through a questionnaire survey. 
The results indicate that although individual materials have limitations in representing regional characteristics, integrating educational materials across multiple spatial scales deepened students’ understanding of the disaster. In particular, VR-based fieldwork enhanced students’ understanding of actual disaster damage, such as collapsed buildings.

How to cite: Yamauchi, H., Iizuka, K., and Ogura, T.: Implementation of a Workshop for Disaster Education on the 2024 Noto Peninsula Earthquake Using Multi-Scale Geospatial Products Integrating GIS and Immersive VR, EGU General Assembly 2026, Vienna, Austria, 3–8 May 2026, EGU26-4335, https://doi.org/10.5194/egusphere-egu26-4335, 2026.

X5.262
|
EGU26-4938
Ariane Dubost, Misha Faber, Sabine Philippin, Galane Peyre, Zhuoqun Wu, and Dimitrii Krasnov

The rapid advances in computing, multimedia, and virtual reality technologies provide new opportunities for communicating and visualising scientific information. Virtual Tours (VTs), based on 360-degree imagery enriched with multimedia content such as graphical explanations, audio, and videos, offer a user-friendly way to explore scientific facilities. Interactive navigation enables users to understand research infrastructures from multiple perspectives and at their own pace.

Within ACTRIS-FR, the French component of the European Aerosol, Clouds and Trace Gases research infrastructure (ACTRIS), VTs serve as a tool to make atmospheric research stations more accessible and transparent. Historically, ACTRIS facilities - observation stations, mobile platforms, and atmospheric simulation chambers - have often been associated with limited accessibility due to security, safety, or logistical constraints. VTs break down these barriers by providing a realistic and informative representation of the facilities, enabling students, visiting researchers, and the general public to better understand the scale, layout, and purpose of the instruments and measurements before visiting, or even when travel is restricted.

The approach was developed in collaboration with the developer of the SMEAR Estonia platform. The methodology allows the creation of tailored content for different target groups: for example, technicians may access specific data sets, curves, and documentation, while educators and outreach professionals can integrate simplified explanations, posters, and links to videos to support teaching. This flexibility allows a single VT to be easily tailored to various uses, ranging from outreach and training to scientific communication and access preparation.

Several ACTRIS-FR sites already use VTs to strengthen their visibility and foster greater user interaction. The tours can be embedded in websites and communication materials via QR codes, and showcased at conferences, exhibitions, or as part of transnational access projects such as the Horizon Europe project IRISCC. They also support ACTRIS’s broader mission of modernising outreach activities and improving interaction with the education sector and the general public.

Developing high-quality VTs poses several challenges, as producing accurate and meaningful content requires significant involvement from scientists and technical staff, along with time-consuming data collection and careful attention to visual resolution and metadata consistency. 

The poster will outline the development of ACTRIS France VTs, discussing both the benefits and limitations, while also exploring opportunities to integrate multimedia. It will also emphasize the value of VTs as training tools for technicians, scientists, and students, and their potential to enhance accessibility, transparency, and cross-country collaborations. The tours not only facilitate a deeper understanding of the work being conducted at the facility, but also contribute to raising general awareness and knowledge about distributed research infrastructures, promoting a broader appreciation of the complex research ecosystem.

 

How to cite: Dubost, A., Faber, M., Philippin, S., Peyre, G., Wu, Z., and Krasnov, D.: Improving ACTRIS Scientific Outreach through Immersive Virtual Tours, EGU General Assembly 2026, Vienna, Austria, 3–8 May 2026, EGU26-4938, https://doi.org/10.5194/egusphere-egu26-4938, 2026.

X5.263
|
EGU26-6335
|
ECS
Laura E. Coulson, Eva Feldbacher, Barbara Köck, Gabriele Weigelhofer, Andreas Zitek, and Libor Zavorka

Climate change is reshaping aquatic food webs, altering the dietary quality available to fish and, with it, their cognitive performance, behavior, and fitness. Because wild fish are a critical source of omega-3 polyunsaturated fatty acids (PUFAs) for humans, the ecological and societal relevance of these changes transcends aquatic systems. BrainFood is a science communication initiative that translates the research of the 4FatQs project—on the role of omega-3 PUFAs for cognition in wild fish—into accessible, engaging, and evidence-informed digital learning experiences for broad audiences.

BrainFood deploys a suite of five interactive short stories (each ≤5 minutes), built with 360° images and videos hosted in the CenarioVR environment and accessible via web link or QR code on smartphones, tablets, laptops, and optional VR headsets. The stories interlink methods, findings, and implications of 4FatQs through multimodal elements—narrated video, animated GIFs, audio overlays, quizzes, and mini-games—allowing non-linear exploration without cognitive overload. Example modules include “A Day in the Life of Trout,” which introduces tracking technologies to study movement and behavior, and “Hide and Seek!”, a game-based exploration of camouflage and rapid color change in salmonids. Additionally, the stories have a strong focus on how this information was generated – a key element of science literacy. All materials are designed for inclusion and accessibility (high-contrast layouts, dyslexia-friendly fonts, voice-over options, and alternatives for those with hearing impairments).

BrainFood’s originality lies not in technological novelty, but in the strategic integration of: (i) multi-device, low-barrier 360° learning experiences; (ii) targeted deployment through multiplier venues and events; (iii) rigorous, real-time co-creation and optimization; and (iv) explicit alignment with science literacy goals. By foregrounding methods as well as findings, the platform demystifies how aquatic ecologists generate evidence—field observation, mesocosm experiments, laboratory analyses—and reveals cascading links between climate, food quality, cognition, and ecosystem health.

A distinctive feature of BrainFood is its co-creation and evaluation pipeline. The initial pilot set of five stories will be deployed at the Haus der Wildnis visitor center (Lunz am See, Austria) and an additional 10 stories will be created based on the feedback from our pilot users.

How to cite: Coulson, L. E., Feldbacher, E., Köck, B., Weigelhofer, G., Zitek, A., and Zavorka, L.: BrainFood: Semi-immersive, 360° learning experiences to communicate research on fish cognition, food quality, and climate change, EGU General Assembly 2026, Vienna, Austria, 3–8 May 2026, EGU26-6335, https://doi.org/10.5194/egusphere-egu26-6335, 2026.

X5.264
|
EGU26-6754
|
ECS
Joeri Brackenhoff, Paula Rulff, Jinqiang Chen, Pierre-Olivier Bruna, Alexandros Daniilidis, Arend-Jan Krooneman, Yosua Pranata Andoko, and Arno Freeke

Geophysics education is often challenging, as it entails explaining complicated physical processes that take place inside the Earth. Because these processes happen below the surface, it can be difficult for students to connect with the material and understand what is happening. As a result, it is hard for students to make the link between the abstract explanation of the processes and the physical measurements that are performed during fieldwork.

A novel way to close the gap between theory and fieldwork is the use of Virtual Reality (VR). VR allows a student to fully immerse themselves in a digital twin of reality and to experience and visualize processes that are invisible in real life. This is the purpose of Geoscience Processes Virtual Education (GeoProVE). In this application, we have developed a fully immersive and interactive scenario in which a student can learn about Ground Penetrating Radar (GPR). The user performs a GPR measurement along a line and is guided with questions to understand how the data are acquired and why specific patterns arise. One of the major features is the ability to pull the subsurface out of the ground, to see how the waves propagate through the subsurface and interact with objects in the subsurface, such as pipes and the water table. Several setups of increasing complexity are shown to the students, with a strong emphasis on challenge-based learning through a scoring system.
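The core physics such a GPR scenario conveys reduces to one standard relation: the radar velocity in the ground depends on its relative permittivity, and reflector depth follows from the two-way travel time. A minimal sketch of that relation (not part of GeoProVE itself; the permittivity value below is an illustrative assumption):

```python
import math

C_M_PER_NS = 0.2998  # speed of light in vacuum, metres per nanosecond

def reflector_depth(twt_ns, eps_r):
    """Estimate the depth (m) of a reflector, e.g. a buried pipe or the
    water table, from its two-way travel time (ns) and the relative
    permittivity eps_r of the overburden."""
    v = C_M_PER_NS / math.sqrt(eps_r)  # radar wave velocity in the ground
    return v * twt_ns / 2.0            # halve: signal travels down and back

# e.g. a reflection at 40 ns in ground with eps_r = 9 sits at about 2 m depth
```

This is the same computation a student performs implicitly when matching a hyperbola in the radargram to a pipe pulled out of the virtual subsurface.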

 

Aside from the GPR scenario, a scenario focused on offshore 3D seismics is also in development for GeoProVE, with the aim of creating additional scenarios focused on ERT and geothermal applications. GeoProVE is intended to become fully open source so that other developers can contribute to the knowledge base. The application has elicited positive engagement from students in geophysics education. We will demonstrate the development of GeoProVE along with its main features.

How to cite: Brackenhoff, J., Rulff, P., Chen, J., Bruna, P.-O., Daniilidis, A., Krooneman, A.-J., Pranata Andoko, Y., and Freeke, A.: GeoProVE – How to use Virtual Reality for Geophysics Education, EGU General Assembly 2026, Vienna, Austria, 3–8 May 2026, EGU26-6754, https://doi.org/10.5194/egusphere-egu26-6754, 2026.

X5.265
|
EGU26-9331
|
Highlight
Yachen Zhou, Boyun Yu, and Takashi Oguchi

This study develops an immersive Virtual Reality (VR) pedagogical framework to mitigate the spatial cognitive constraints inherent in conventional two-dimensional disaster-prevention education. It focuses on the 2024 Noto Peninsula landslide events in Japan. By integrating high-precision digital elevation models and satellite imagery, the system reconstructs post-disaster terrain to facilitate high-fidelity risk communication. The interaction logic, governed by natural user interface principles, incorporates a multi-perspective switching mechanism that enables users to conduct comprehensive analyses of disaster sites across varying spatial scales.

 

The system architecture comprises two core modules: the "VR Geological Museum" for knowledge acquisition and the "Evacuation Simulation" for practical application, enabling deep transfer from conceptual understanding to survival skills. The former employs a task-driven strategy and a "macro-micro" dual-perspective observation model. It transforms abstract geological knowledge into intuitive interactive experiences through high-precision 3D reconstructions of landslide topography, effectively lowering the cognitive threshold for non-expert learners. Complementing this, the evacuation simulation module integrates official landslide-disaster warning area maps from the Geospatial Information Authority of Japan. Grounded in embodied cognition theory, this module implements a "trial-and-error" feedback mechanism. By navigating highly restored disaster evolution scenarios, users translate static warning information into dynamic survival capabilities, thereby completing the cognitive loop from theoretical understanding to behavioral practice.

 

The pedagogical efficacy of the system was empirically validated through a randomized controlled trial, utilizing multidimensional standardized metrics, including the Presence Questionnaire, the System Usability Scale, and the NASA Task Load Index for workload assessment. Experimental results demonstrate that the system significantly outperforms traditional text-based media in knowledge internalization, risk perception accuracy, and survival decision-making efficiency. The core contribution of this research lies in the deep integration of high-fidelity geospatial data with immersive interaction, establishing a verifiable technical paradigm for disaster education. This approach effectively dismantles barriers to professional knowledge. It enhances disaster preparedness and evacuation efficacy across diverse demographic backgrounds, providing a robust theoretical and technical foundation for the universalization of geohazard education.

How to cite: Zhou, Y., Yu, B., and Oguchi, T.: The Eye of Disaster: Development and Evaluation of a VR-Based Landslide Education System, EGU General Assembly 2026, Vienna, Austria, 3–8 May 2026, EGU26-9331, https://doi.org/10.5194/egusphere-egu26-9331, 2026.

X5.266
|
EGU26-10349
Anita Di Chiara, Greig Paterson, Daniele Thallner, Florencia Milanese, Annique van der Boon, Raquel Bonilla-Alba, Claudio Robustelli-Test, Brendan Cych, Richard Bono, and Lesleis Nagy

The MagNetZ (Magnetic Network on Zoom) webinar series has become a cornerstone for geomagnetism research, hosting online seminars since early 2020. Launched amid COVID-19 constraints, MagNetZ is convened by a team of scientists to give visibility to the work of both leading scientists and early-career researchers and to foster virtual collaboration, overcoming geographical limits for students and professionals alike. It promotes open science sharing, with broad appeal evidenced by international viewership and institutional ties. Each presentation typically begins with a short talk, followed by an interactive Q&A; both are recorded, post-edited, and published on YouTube (https://www.youtube.com/@MagNetZ) in a continuously growing archive of recorded content. The webinars are also uploaded to the EarthRef.org Digital Archive (ERDA), developed and maintained by the EarthRef.org Database Team, and can be cited. Thus far, more than 80 webinars are available for viewing. MagNetZ also supports national meetings, such as the UK-based annual Magnetic Interactions meeting, offering them a platform to host the recordings of three meetings so far. The webinars provide in-depth discussions on paleo- and rock-magnetism and geomagnetic modelling, with topics spanning from geo- and planetary magnetic field dynamics to mineral-property studies for paleoclimatic reconstructions, and from paleomagnetic data for geodynamic applications to archaeomagnetism. The core aims of MagNetZ are to ensure accessibility across all genders, career stages, and geographical locations, to enhance community networks, and to serve as an educational hub for magnetic data in tectonics and climate studies. Its YouTube platform ensures enduring access, sparking collaborations and awareness.

How to cite: Di Chiara, A., Paterson, G., Thallner, D., Milanese, F., van der Boon, A., Bonilla-Alba, R., Robustelli-Test, C., Cych, B., Bono, R., and Nagy, L.: Virtual Frontiers in Earth Magnetism: The legacy of MagNetZ webinar series , EGU General Assembly 2026, Vienna, Austria, 3–8 May 2026, EGU26-10349, https://doi.org/10.5194/egusphere-egu26-10349, 2026.

X5.267
|
EGU26-13637
Mihai Micu, Cristina Dumitrica, Bianca Mitrica, Gabriela Morosanu, and Irena Roznovietchi

During the last decades, the compound effect of natural hazards such as landslides, floods/flash floods, and earthquakes has highlighted multi-hazard risk as a priority field of scientific research, owing to its increasingly pronounced impacts on society and the environment. The amplification of consequences resulting from complex interaction mechanisms leads to increased exposure and prolonged recovery times for affected communities, thereby reducing overall resilience. Recently developed theoretical-methodological concepts such as Virtual Reality (VR) offer enhanced opportunities to explore the evolutionary processes of landforms and the resulting landscapes, enabling the discovery, calibration, and validation of advanced solutions for risk perception, understanding, awareness, communication, and management. In this context, and in response to the challenges of a contemporary period marked by rapid environmental changes, a new VR platform is being developed within the SPEER-A (Interreg) project, focusing on the Vrancea seismic region, the most important intermediate-depth seismic source in Europe and an area intensely affected by earthquakes, landslides, and flash floods. The objectives of the VR-GeoLab are: i) to create a VR-based transdisciplinary solution (following a co-creation, co-design, and co-dissemination approach) for real-time interaction between scientific research products and the other stakeholders involved in the management of multi-hazard scenarios; ii) to integrate the results of scientific research into a modern, enhanced-reality, collaborative knowledge and relational framework; and iii) to increase societal resilience by improving the spatio-temporal perception of multi-hazard environments through immersive, virtual representations of hazard interactions, conditioning factors, exposure, and vulnerability.
In this way, VR-GeoLab provides an innovative platform for promoting scientific results to a wide range of stakeholders in a multi-dimensional, integrated, interactive, immersive, and collaborative way, thus contributing consistent added value not only to educational promotion and capacity building but also to opening new research horizons through the integration of advanced digital interaction tools in future international research and educational projects. Acknowledgements: this work is supported by the Interreg NEXT Black Sea Basin Programme under grant agreement no. BSB01197 - Strengthening and Promoting Earthquake Emergency Response and Rescue Capacity in the BSB Area (SPEER-A).

How to cite: Micu, M., Dumitrica, C., Mitrica, B., Morosanu, G., and Roznovietchi, I.: VR-GeoLab: a platform for multi-hazard understanding and risk communication, EGU General Assembly 2026, Vienna, Austria, 3–8 May 2026, EGU26-13637, https://doi.org/10.5194/egusphere-egu26-13637, 2026.

X5.268
|
EGU26-17240
Javier Pacheco-Labrador, Eduardo de la Cal Martín, M. Pilar Martín, Tarek S. El-Madany, María Dolores Raya-Sereno, Vicente Burchard-Levine, Lucía Casillas, Juan Ramón Bustos-Caparrós, and Jorge Lagranja

Scientists from different disciplines (e.g., eddy covariance fluxes, remote sensing, or ecology) work together at long-term ecosystem stations to monitor ecosystem responses to climate change. These stations are heavily equipped with automated sensors that continuously measure, and they support regular campaigns in which scientists take numerous samples and measurements. Networks of these stations have provided a critical understanding of ecosystems' responses to extreme events and other consequences of climate change, and therefore, society must be aware of the relevance of this kind of infrastructure. However, presenting these stations to the general public or students is complex, as they may be located in isolated areas, and hosting large numbers of visitors can perturb the ecosystem, affecting observations and their interpretation. Furthermore, the diversity of topics and knowledge gathered in these stations can overwhelm communication.

In this context, virtual reality offers unmatched advantages for bringing the general public to these research stations from anywhere. We present the “Sentinel Dehesa” virtual tour, a virtual reality environment of the ecosystem station at Majadas de Tiétar, in Cáceres, Spain, which is included in the Integrated Carbon Observation System (ICOS). The station monitors a Mediterranean savanna, an agroecosystem characterized by its sustainability but jeopardized by climate change. The station continuously measures surface-atmosphere energy, carbon, and water fluxes using micrometeorological and eddy covariance techniques. Furthermore, remote sensing scientists conduct regular campaigns to measure vegetation spectral and biophysical properties and relate them to satellite imagery. In this virtual environment, visitors can learn about the sensors and measurements performed on the site as they move through different information points that provide multilingual content.

This virtual tour is available for both VR headsets and web browsers (https://speclab.csic.es/en/) and has been used for educational and outreach activities, attracting the interest of secondary students and being highly valued by their teachers.

How to cite: Pacheco-Labrador, J., de la Cal Martín, E., Martín, M. P., El-Madany, T. S., Raya-Sereno, M. D., Burchard-Levine, V., Casillas, L., Bustos-Caparrós, J. R., and Lagranja, J.: The Sentinel Dehesa: A virtual tour of an ICOS Research Ecosystem Station, EGU General Assembly 2026, Vienna, Austria, 3–8 May 2026, EGU26-17240, https://doi.org/10.5194/egusphere-egu26-17240, 2026.

X5.269 | EGU26-17709 | ECS
Maximilian Söchting and Miguel D. Mahecha

Many widely used tools for communicating and teaching Earth observation and modeled climate data still struggle to convey spatiotemporal phenomena: visualization is often limited to 2D map views, interfaces can prove difficult for non-experts, and workflows might not be easily transferred from curated examples to large or in-progress research datasets. This creates a gap between public-facing visualization, classroom use and research workflows.

We present Lexcube, a multi-platform ecosystem for interactive exploration and visualization of Earth system "data cubes", i.e., large remote sensing and modelled data sets. Lexcube provides an immersive, interactive 3D "data cube" view in which all dimensions (space and time) are treated equally, enabling users to easily reveal spatiotemporal dynamics that are not visible in 2D map-based interfaces. Over the last several years of development, Lexcube has been used in education, outreach, and research. Our goal has been to emphasize intuitive navigation and a low barrier to entry while remaining capable of visualizing large data sets with minimal friction. Lexcube has been deployed in multiple forms:

  • (1) Lexcube.org, an interactive data cube exploration and visualization web app, with no coding or infrastructure required. It runs on desktop and mobile devices with minimal hardware requirements and has been regularly used in teaching.
  • (2) Lexcube for Jupyter, an open-source Python package aimed at scientists that allows users to visualize any 3D data set as an interactive data cube in Jupyter notebooks.
  • (3) Two museum exhibits, featuring simplified versions of the Lexcube.org interface with curated data sets and explainer texts tailored to each exhibition.
  • (4) A physical interactive data cube, a large museum-style installation that displays data cubes in physical space through five square touch screens assembled in the shape of a cube. It offers the same capabilities and data sets as Lexcube.org but proves even more accessible, as no virtual 3D environment or software has to be navigated at all.
  • (5) The option to create physical paper data cubes from templates generated by Lexcube, assembled by cutting and gluing, offering a low-cost and engaging piece of science communication.
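
The central idea behind the cube view, treating time like any spatial axis, can be sketched with plain NumPy. The array shape and slices below are purely illustrative (they are not Lexcube code): each visible face of a Lexcube-style cube is simply a 2D slice of a (time, lat, lon) array along one axis.

```python
import numpy as np

# Hypothetical data cube: 12 monthly time steps on a coarse global grid.
cube = np.random.default_rng(0).normal(size=(12, 90, 180))  # (time, lat, lon)

# The three orthogonal faces of a data cube view are just 2D slices,
# with the time axis treated exactly like the spatial axes:
top_face = cube[0, :, :]     # lat-lon map at the first time step
front_face = cube[:, 0, :]   # time-lon (Hovmoeller-style) section at one latitude
side_face = cube[:, :, 0]    # time-lat section at one longitude

print(top_face.shape, front_face.shape, side_face.shape)  # → (90, 180) (12, 180) (12, 90)
```

Dragging along the cube in the 3D view then amounts to varying the fixed index of each slice, which is why temporal dynamics appear on the side faces without any special handling.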

In the future, we aim to strengthen the education and science communication use cases of the Lexcube platform and are very interested in feedback and ideas for possible future developments. These could include a virtual reality deployment, particularly for exploring extreme events as 3D voxel clouds in space and time, as well as simple data processing operators beyond pure data visualization.

How to cite: Söchting, M. and Mahecha, M. D.: Lexcube: A multi-platform "data cube" ecosystem for immersive exploration of Earth system datasets in education, outreach and research, EGU General Assembly 2026, Vienna, Austria, 3–8 May 2026, EGU26-17709, https://doi.org/10.5194/egusphere-egu26-17709, 2026.

X5.270 | EGU26-20973 | ECS
Amarpal Sahota, Elena Fillola, Adrian K. T. Ng, Jeff Clark, Nawid Keshtmand, Matt Rigby, and Raul Santos-Rodriguez

Climate data are complex to understand and observe, with variables covering the surface of the globe in three dimensions and changing over time. Consequently, it is difficult to convey rich climate information through static 2D images. To address this, we designed and built interactive 3D Earth models using the Unity game engine to visualise data from the Tropospheric Monitoring Instrument (TROPOMI) satellite. These virtual world models aim to help researchers share climate insights more effectively and make them accessible to the public.

The immersive environment presents a number of 3D Earth objects. The first displays the ‘XCH4’ variable (column-averaged dry-air mole fraction of methane, in ppb) for an entire month, allowing the user to cycle through months via a controller. The Earth spins on its axis, automatically displaying methane concentrations, while the user can manually adjust the view to inspect regions of interest. A second Earth object features an automatic animation displaying the density of data points collected by the TROPOMI satellite as days progress. We also render auxiliary reference globes without thematic overlays and include a 2D static plot of atmospheric methane (ppb) for 2023 for comparison.
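
A monthly XCH4 field like the one shown on the first globe can be thought of as a per-pixel aggregate of daily satellite retrievals. As a hedged illustration (the function and the tiny grid below are hypothetical examples, not part of the authors' Unity pipeline), a monthly composite that ignores missing retrievals such as cloud-masked pixels might look like:

```python
import numpy as np

def monthly_xch4_mean(daily_fields):
    """Average a stack of daily XCH4 grids (ppb) into one monthly field,
    ignoring missing retrievals (NaNs), e.g. cloud-masked pixels."""
    stacked = np.stack(daily_fields)        # shape: (days, lat, lon)
    return np.nanmean(stacked, axis=0)      # per-pixel mean over valid days

# Hypothetical 3-day example on a 2x2 grid (values in ppb, NaN = no retrieval):
days = [
    np.array([[1900.0, np.nan], [1880.0, 1910.0]]),
    np.array([[1904.0, 1890.0], [np.nan, 1914.0]]),
    np.array([[1898.0, 1894.0], [1884.0, np.nan]]),
]
monthly = monthly_xch4_mean(days)
print(monthly)  # e.g. monthly[0, 1] averages only the two valid days → 1892.0
```

Each pixel is averaged over only the days on which it was observed, so sparse orbital coverage (the point-density animation on the second globe) does not bias the monthly map toward zero.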

The entire layout is optimised for immersive systems, specifically those in which the user is positioned centrally within a 360-degree display ring, such as the Reality Emulator at the University of Bristol, a VR-enabled ‘CAVE’ system. Audience feedback thus far has been highly positive: the immersive 3D visualisation gave participants a clearer view of methane concentrations across the Earth and deepened their interest in the planet’s atmosphere. It also sparked curiosity about the factors affecting atmospheric composition, prompting many questions about methane sources and satellite monitoring. This setup demonstrates the potential of virtual reality in communicating high-dimensional Earth science data.

How to cite: Sahota, A., Fillola, E., Ng, A. K. T., Clark, J., Keshtmand, N., Rigby, M., and Santos-Rodriguez, R.: Interactive 3D Earth Models in Unity to Visualise TROPOMI Satellite Climate Data, EGU General Assembly 2026, Vienna, Austria, 3–8 May 2026, EGU26-20973, https://doi.org/10.5194/egusphere-egu26-20973, 2026.

X5.271 | EGU26-23234
Bruce Malamud, Elizabeth Follows, and Finlay Trasler

Teaching hazards and risk often requires engagement with complex, dynamic and inaccessible environments. Virtual reality (VR) provides a practical means of supporting immersive, place-based learning. This contribution presents the use of VR as a facilitated teaching tool within hazards and risk education.

VR sessions were delivered to master's and undergraduate students (one session of 12 students), 2nd-year undergraduate students (two sessions of 13 students) and pre-university Sixth Form students (two sessions of 12 students), using Meta Quest 3 and Quest Pro headsets and the Wander platform of global Google Street View (and user-uploaded) imagery. The sessions included virtual visits to hazard-relevant locations: informal settlements in Kenya, earthquake-affected urban environments in Japan (using before-and-after imagery to examine building tilt), rockfall-prone landscapes in Nepal, time-lapse environmental change in Durham, a broader VR-based field trip to Israel, and a session following a kayaker along the coastline of Oman. Each activity combined guided VR exploration with structured discussion of hazard processes, exposure, vulnerability and resilience.

The use of VR supported spatial understanding, comparison between contrasting hazard contexts, and student engagement. Key considerations included group size, facilitation, accessibility, and the importance of integrating VR with non-digital teaching methods rather than using VR in isolation. These examples demonstrate how immersive technologies can be effectively incorporated into hazards and risk education across educational levels, while highlighting the need for critical reflection on learning outcomes and evaluation.

How to cite: Malamud, B., Follows, E., and Trasler, F.: Using Virtual Reality to Support Hazards and Risk Education, EGU General Assembly 2026, Vienna, Austria, 3–8 May 2026, EGU26-23234, https://doi.org/10.5194/egusphere-egu26-23234, 2026.
