
ESSI4.2

Innovations in Scientific Data Visualization

Data visualization is fundamental to science: to discovery, to interpretation, and to communication. This is especially true in this era of big and complex datasets, increasing scientific specialization and the need to effectively communicate ideas and results to diverse audiences.

This session will explore innovations in science data visualization that support and advance elements of the scientific process – from exploration and analysis to discovery and communication. We invite a broad spectrum of science visualization breakthroughs that allow researchers and others to broaden their understanding of natural phenomena and communicate that understanding.

Topics include (but are not limited to):

- Visualizing massive datasets
- Combined visualization of multiple datasets
- Extracting new and/or additional meaning through visualization
- Visualization and human perception
- Focused research and development activities regarding visualization elements (e.g., colors, patterns, shading, 3D, etc.)
- New and inventive science visualization tools and applications (e.g., virtual, augmented and mixed reality: VR/AR/MR)


Convener: Rick Saltus | Co-convener: Shayna Skolnik
Attention: this vPICO session occupies the second 45 minutes of a shared 90-minute time block; the previous session uses the first 45 minutes.

Wed, 28 Apr, 16:15–17:00

Chairpersons: Shayna Skolnik, Rick Saltus

16:15–16:20
5-minute convener introduction

16:20–16:22 | EGU21-7859 | ECS
Marcel Meyer et al.

We present the application of interactive 3-D visual analysis techniques using the open-source meteorological visualization framework Met.3D [1] for investigating ERA5 reanalysis data. Our focus lies on inspecting atmospheric conditions favoring the development of extreme weather events in the Arctic. Marine Cold Air Outbreaks (MCAOs) and Polar Lows (PLs) are analyzed with the aim of improving diagnostic indices for capturing extreme weather events in seasonal and climatological assessments. We adopt an integrated workflow starting with the interactive visual exploration of single MCAO and PL events, using an extended version of Met.3D, followed by the design and testing of new diagnostic indices in a climatological assessment. Our interactive visual exploration provides insights into the complex 3-D shape and dynamics of MCAOs and PLs. For instance, we reveal a slow wind eye of a PL that extends from the surface up into the stratosphere. Motivated by the interactive visual analysis of single cases of MCAOs, we design new diagnostic indices, which address shortcomings of previously used indices, by capturing the vertical extent of the lower-level static instability induced by MCAOs. The new indices are tested by comparison with observed PLs in the Barents and the Nordic Seas (as reported in the STARS data set). Results show that the new MCAO index introduced here has an important advantage compared with previously used MCAO indices: it is more successful in indicating the times and locations of PLs. We thus propose the new index for further analyses in seasonal climate predictions and climatological studies. The methods for interactive 3-D visual data analysis presented here are made freely available for public use as part of the open-source tool Met.3D. We thereby provide a generic tool that can be used for investigating atmospheric processes in ERA5 data by means of interactive 3-D visual data analysis. 
Met.3D can be used, for example, during an initial explorative phase of scientific workflows, as a complement to standard 2-D plots, and for detailed meteorological case-analyses in 3-D.


[1] http://met3d.wavestoweather.de, https://collaboration.cen.uni-hamburg.de/display/Met3D/
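The abstract does not give the formula of the new index, but one widely used classical MCAO diagnostic is the potential-temperature difference between the sea surface (skin) and the 850 hPa level. A minimal sketch of that previously used diagnostic, with hypothetical input values, might look like this (an illustration, not the authors' new index):

```python
import numpy as np

def potential_temperature(T, p, p0=1000.0, kappa=0.2854):
    """Potential temperature (K) for temperature T (K) at pressure p (hPa)."""
    return T * (p0 / p) ** kappa

def classical_mcao_index(T_skin, T_850, p_surface=1000.0):
    """Classical MCAO index: theta_skin - theta_850 (K).
    Positive values indicate low-level static instability
    favouring marine cold air outbreaks."""
    theta_skin = potential_temperature(T_skin, p_surface)
    theta_850 = potential_temperature(T_850, 850.0)
    return theta_skin - theta_850

# Hypothetical example: relatively warm sea surface under very cold air aloft
idx = classical_mcao_index(T_skin=275.0, T_850=255.0)
```

The shortcoming the abstract points at is that this two-level difference says nothing about the vertical extent of the instability, which is what the authors' new indices add.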

How to cite: Meyer, M., Polkova, I., and Rautenhaus, M.: Interactive 3-D visual analysis of ERA 5 data: improving diagnostic indices for Marine Cold Air Outbreaks, EGU General Assembly 2021, online, 19–30 Apr 2021, EGU21-7859, https://doi.org/10.5194/egusphere-egu21-7859, 2021.

16:22–16:24 | EGU21-8801
Emmanuel Delage et al.

Resilience to natural hazards depends on a person's ability to envision an event and its consequences. While real-life experience is precious, experiencing a real event is rare, and sometimes fatal. Virtual reality provides a way to get that experience more frequently and without the inconvenience of demise. Virtual reality can also enhance an event to make it more visible, as things often happen in bad weather, at night, or at other inconvenient moments.

The 3DTeLC software (an output from an ERASMUS+ project, http://3dtelc.lmv.uca.fr/) can handle high-resolution 3D topographic models, and the user can study natural hazard phenomena with geological tools in virtual reality. Topography acquired from drone or plane surveys can be made more accessible to researchers, the public, and stakeholders. In the virtual environment a person can interact with the scene from a first-person, drone, or plane point of view and can carry out geological interpretation at different visualization scales. Immersive and interactive visualization is an efficient communication tool (e.g. Tibaldi et al. 2019, Bulletin of Volcanology, https://dx.doi.org/10.1007/s00445-020-01376-6).

We have taken the 3DTeLC workflow and integrated a 2.5D flow simulation programme (VOLCFLOW-C). The dynamic outputs from VOLCFLOW-C are superimposed into a single visualization using a new tool developed from scratch, which we call VRVOLC. This coupled visualization adds dynamic and realistic understanding of events like lahars, lava flows, landslides, and pyroclastic flows. We present two examples: one developed on the Digital Terrain Model of Chachani Volcano, Arequipa, Peru, to assist with flood and lahar visualisation (in conjunction with INGEMMET, UNESCO IGCP project 692 Geoheritage for Resilience, and Cap 20-25 Clermont Risk), and another with an Icelandic debris slide that occurred in late 2014, possibly related to permafrost degradation (in conjunction with the ANR PERMOLARDS project).

We thank our 3DTeLC colleagues, without whom this would not be possible, and acknowledge financial support for the PERMOLARDS project from the French National Research Agency (ANR-19-CE01-0010); this work is part of UNESCO IGCP 692 Geoheritage for Resilience.

How to cite: Delage, E., Van Wyk de Vries, B., Philippe, M., Conway, S., Morino, C., Manrique Llerena, N., Aguilar Contreras, R., Soncco, Y., Sæmundsson, Þ., and Kristinn Helgason, J.: Visualising and experiencing geological flows in Virtual Reality, EGU General Assembly 2021, online, 19–30 Apr 2021, EGU21-8801, https://doi.org/10.5194/egusphere-egu21-8801, 2021.

16:24–16:26 | EGU21-10505 | ECS
Ying Han et al.

Power-line harmonics generated by human activities can be found in the vast amount of data observed by the EFD on board the ZH-1 satellite. To study human activities, and to remove a non-negligible source of interference in the study of ionospheric precursors of earthquakes, these power-line harmonics must be located within that vast amount of data. Hence, a novel automatic power-line recognition method is proposed. Firstly, we apply the Fourier transform to the EFD data to obtain the power spectral density (PSD). Secondly, since harmonic radiation from power lines presents as one or more horizontal linear features in the PSD image, and the colour of each line is close to the colour of the image background, we transform the PSD image from the RGB to the HSV colour space and use the saturation component as the object image, highlighting the contrast between line and background. We then extract edge regions from the object image with the Canny technique. Finally, we use the Hough transform to detect the power lines within the edge regions. To evaluate the proposed method, an experiment was performed on a dataset of 100 PSD images, each containing several interference lines. The experimental results verify the effectiveness of the proposed method, with an accuracy of 86%.
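The full pipeline operates on PSD images (HSV conversion, Canny edges, Hough transform). As an illustration of just the first step, a minimal NumPy sketch with a synthetic, hypothetical signal shows how a 50 Hz power-line harmonic and its multiples stand out in the power spectral density:

```python
import numpy as np

# Synthetic record contaminated by a 50 Hz power-line harmonic
# (hypothetical stand-in for an EFD waveform, not real satellite data).
fs = 1024.0                      # sampling rate (Hz)
t = np.arange(0, 4.0, 1.0 / fs)
rng = np.random.default_rng(0)
signal = (0.2 * rng.standard_normal(t.size)    # background noise
          + 1.0 * np.sin(2 * np.pi * 50 * t)   # fundamental
          + 0.5 * np.sin(2 * np.pi * 150 * t)) # 3rd harmonic

# Step 1 of the pipeline: Fourier transform -> power spectral density.
freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
psd = np.abs(np.fft.rfft(signal)) ** 2 / signal.size

# Crude line detection: keep frequencies whose power greatly
# exceeds the median background level.
peaks = freqs[psd > 50 * np.median(psd)]
```

In the actual method these spectra form the rows of a PSD image, and the line-like structures are then detected with image-processing techniques rather than a simple threshold.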

How to cite: Han, Y., Yuan, J., Wang, Q., Yang, D., and Sun, X.: Automatic Recognition of Power Line Harmonic Radiation Observed by the EFD On board the ZH-1 Satellite, EGU General Assembly 2021, online, 19–30 Apr 2021, EGU21-10505, https://doi.org/10.5194/egusphere-egu21-10505, 2021.

16:26–16:28 | EGU21-11208 | Highlight
Fabio Crameri et al.
- Does visualisation hinder scientific progress?
- Is visualisation widely misused to tweak data?
- Is visualisation intentionally used for social exclusion?
- Is visualisation taken seriously by academic leaders?

Using scientifically derived colour palettes is a big step towards making it obsolete to even ask such brutal questions. Their perceptual uniformity leaves no room to highlight artificial boundaries, or hide real ones. Their perceptual order transfers data visually, effortlessly and without delay. Their colour-vision-deficiency-friendly nature leaves no reader wondering. Their black-and-white readability leaves no printer accused of not being good enough. It is, indeed, the true nature of the data that is displayed to all viewers, in every way.

The “Scientific colour map” initiative (Crameri et al., 2020) provides free, citable colour palettes of all kinds for download for an extensive suite of software programs, a discussion around data types and colouring options, and a handy how-to guide for a professional use of colour combinations. Version 7 of the Scientific colour maps (Crameri, 2020) makes crucial new additions towards fairer and more effective science communication available to the science community.

Crameri, F., G.E. Shephard, and P.J. Heron (2020), The misuse of colour in science communication, Nature Communications, 11, 5444.

Crameri, F. (2020). Scientific colour maps. Zenodo. http://doi.org/10.5281/zenodo.1243862
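A necessary (though not sufficient) condition behind the perceptual order and black-and-white readability described above is that lightness varies monotonically along the palette. A small sketch, using toy hypothetical colour maps and a crude Rec. 709 luminance approximation rather than the initiative's actual palettes, illustrates such a check:

```python
import numpy as np

def relative_luminance(rgb):
    """Approximate relative luminance (Rec. 709 weights) for an
    (N, 3) array of RGB values in [0, 1]."""
    return rgb @ np.array([0.2126, 0.7152, 0.0722])

def is_perceptually_ordered(cmap):
    """Necessary condition for perceptual order: luminance changes
    monotonically along the colour map (so it also survives
    black-and-white printing)."""
    diffs = np.diff(relative_luminance(cmap))
    return bool(np.all(diffs >= 0) or np.all(diffs <= 0))

# Toy maps: a linear ramp passes; a 'jet'-like map whose lightness
# rises and then falls fails the check.
ramp = np.linspace([0.1, 0.0, 0.3], [0.9, 0.9, 0.6], 256)
jetlike = np.vstack([np.linspace([0, 0, 0.5], [1, 1, 0], 128),
                     np.linspace([1, 1, 0], [0.5, 0, 0], 128)])
```

The palettes from the initiative are distributed as plain RGB tables, so a check like this can be run directly on the downloaded files.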

How to cite: Crameri, F., Shephard, G., and Heron, P.: The “Scientific colour map” Initiative: Version 7 and its new additions, EGU General Assembly 2021, online, 19–30 Apr 2021, EGU21-11208, https://doi.org/10.5194/egusphere-egu21-11208, 2021.

16:28–16:33 | EGU21-11680 | ECS | solicited
Riccardo Fellegara et al.

Over the last few years, the amount of large and complex data in the public domain has increased enormously, and new challenges have arisen in the representation, analysis, and visualization of such data. Considering the number of space missions that have provided and will provide remote sensing data, there is still a need for a system that can be deployed across several remote repositories yet remain accessible from a single commodity-hardware client.

To tackle this challenge, at the DLR Institute for Software Technology we have designed a dual backend-frontend system enabling the interactive analysis and visualization of large-scale remote sensing data. The basis for all visualization and interaction approaches is CosmoScout VR, a visualization tool developed internally at DLR and publicly available on GitHub, which allows the visualization of complex planetary data and large simulation data in real time. Its dual counterpart is an MPI-based framework called Viracocha, which enables the remote analysis of large data and makes efficient use of the network by sending compact, partial results to CosmoScout for interactive visualization as soon as they are computed.

A node-based interface is defined within the visualization tool, which lets a domain expert easily define customized pipelines for processing and visualizing the remote data. Each "node" of this interface is linked either to a feature-extraction module defined in Viracocha or to a rendering module defined directly in CosmoScout. Since the interface is fully customizable, multiple pipelines can be defined over the same dataset to further enhance the visualization feedback for analysis purposes.

As this is an ongoing project, on top of these tools we plan to define and implement novel strategies for EO data processing and visualization based on Topological Data Analysis (TDA). TDA is an emerging set of techniques for processing data according to its topological features. These include both the geometric information associated with a point and all the non-geometric scalar values, such as temperature and pressure, that can be captured during a monitoring mission. One of the major theories behind TDA is discrete Morse theory, which, given a scalar function, is used to define a gradient on that function, extract the critical points, identify the region of influence of each critical point, and so on. This strategy is parameter-free and enables a domain scientist to process large datasets without prior knowledge of them.
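As a toy illustration of the critical-point extraction mentioned above (a stand-in only; discrete Morse theory proper operates on a cell complex, not a pixel grid), local minima and maxima of a small synthetic 2D scalar field can be located by neighbourhood comparison:

```python
import numpy as np

def critical_points(field):
    """Classify interior grid points of a 2D scalar field as local
    minima or maxima by comparing each point with its 4-neighbourhood.
    A simplistic stand-in for the critical points of discrete Morse theory."""
    minima, maxima = [], []
    rows, cols = field.shape
    for i in range(1, rows - 1):
        for j in range(1, cols - 1):
            nbrs = [field[i-1, j], field[i+1, j], field[i, j-1], field[i, j+1]]
            if field[i, j] < min(nbrs):
                minima.append((i, j))
            elif field[i, j] > max(nbrs):
                maxima.append((i, j))
    return minima, maxima

# Toy scalar field (e.g. a 'temperature' anomaly) with one bump.
x = np.linspace(-2, 2, 21)
xx, yy = np.meshgrid(x, x)
bump = np.exp(-(xx**2 + yy**2))
mins, maxs = critical_points(bump)
```

Tracking how such critical points appear, move, and vanish between time steps is exactly the kind of question the project's time-varying analysis targets.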

An interesting research question to be investigated during this project is the correlation of changes in critical points across time steps, and the identification of deformations (or changes) over time in the original dataset.

How to cite: Fellegara, R., Flatken, M., De Zan, F., and Gerndt, A.: Interactive visualization and topology-based analysis of large-scale time-varying remote-sensing data: challenges and opportunities, EGU General Assembly 2021, online, 19–30 Apr 2021, EGU21-11680, https://doi.org/10.5194/egusphere-egu21-11680, 2021.

16:33–16:35 | EGU21-12491
Felicia Brisc and Nuno Serra

Virtual Reality is expanding rapidly in many academic and industry areas as an important tool to represent 3D objects, while graphics hardware is becoming increasingly accessible. Conforming to these trends, we present an immersive VR environment created to help earth scientists and other users visualize and study ocean simulation data and processes. Besides scientific exploration, we hope our environment will become a helpful tool in education and outreach. We combined a 1-year, 3 km MITgcm simulation with daily temporal resolution and a bathymetry digital elevation model in order to visualize the evolution of Northeast Atlantic eddies enclosed by warm and salty Mediterranean Water. Our approach leverages the advanced rendering algorithms of a game engine to enable users to move around freely, interactively play the simulation, and observe the changes and evolution of eddies in real time.

How to cite: Brisc, F. and Serra, N.: Immersive Visualization of Ocean Data in a Game Engine, EGU General Assembly 2021, online, 19–30 Apr 2021, EGU21-12491, https://doi.org/10.5194/egusphere-egu21-12491, 2021.

16:35–16:37 | EGU21-15612
Angelika Heil and Augustin Colette
16:37–16:39 | EGU21-14761 | ECS
Niklas Hohmann and Emilia Jarochowska

Fossil accumulations can be generated (1) by high input of organism remains or (2) by low sedimentation rates, which reduce the volume of sediment between individual fossils. This creates a paradox in which shell beds may form in environments with low biomass production. This effect of sedimentary condensation on fossil abundance is easy to understand; its implications, however, are hard to grasp and visualize.

We present the shellbed condensator (https://stratigraphicpaleobiology.shinyapps.io/shellbed_condensator/), a web application that allows users to interactively visualize and animate the effects of sedimentary condensation and erosion on fossil abundance and on proxies recorded by the sedimentary record. It is an adaptation of the seminal computer simulation by Kidwell (1985). The application is written in R and uses the shiny package for the web interface and the DAIME package for the sedimentological model (Hohmann, 2021). It allows users to create stratigraphic expressions and age models for user-defined combinations of fossil input and sedimentation rates.
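The core condensation effect can be sketched in a few lines (an illustrative toy calculation with made-up numbers, not the DAIME model): with constant shell input, slower sedimentation packs the same number of shells into proportionally less rock.

```python
import numpy as np

def fossils_per_cm(fossil_input, sedimentation_rate):
    """Fossil concentration in the rock (shells per cm of strata) for a
    constant shell input flux (shells per kyr per unit area) and a
    sedimentation rate (cm per kyr). Lower sedimentation packs the same
    input into less rock, condensing the fossil record."""
    sed = np.asarray(sedimentation_rate, dtype=float)
    if np.any(sed <= 0):
        raise ValueError("sedimentation rate must be positive (no erosion here)")
    return fossil_input / sed

# Same biological input, two regimes: a shell bed can form purely
# because sedimentation slows down.
normal = fossils_per_cm(fossil_input=100.0, sedimentation_rate=10.0)    # 10 shells/cm
condensed = fossils_per_cm(fossil_input=100.0, sedimentation_rate=1.0)  # 100 shells/cm
```

The web application extends this idea to time-varying input and sedimentation rates, including erosion, and derives the resulting stratigraphic expressions and age models.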

To assess the utility of shiny apps for teaching purposes, we examine student understanding of sedimentary condensation after unsupervised studying and after unsupervised usage of the app. Due to their strong visual and interactive components, shiny apps are a powerful and versatile tool for science communication, teaching, self-study, the visualization of large datasets, and the promotion of scientific findings.

 

How to cite: Hohmann, N. and Jarochowska, E.: Visualizing Sedimentary Condensation, Dilution, and Erosion using Shiny Apps, EGU General Assembly 2021, online, 19–30 Apr 2021, EGU21-14761, https://doi.org/10.5194/egusphere-egu21-14761, 2021.

16:39–16:41 | EGU21-15802
Björn Wieczoreck

Fully understanding a complex 3D geological model (such as triangulated irregular networks or boundary representations) requires a largely hands-on approach. The user needs direct access to the model and a way to manipulate it in 3D space, e.g. through rotation, to find the appropriate and most useful perspectives. Indirect means of presentation, e.g. via animation, can only give the user a vague idea of the model and all its details, especially with the growing amount of data incorporated. Additionally, discussing such a model with colleagues is often restricted by the space in front of the monitor of the system running the modeling software. And while the accessibility of such models has been improved, e.g. through access via ordinary web browsers, new technologies such as VR and AR could open up novel and improved ways for users to experience and share them.

Although VR has found its way into the mainstream, especially for entertainment, it remains a relatively inaccessible technology. The high upfront cost, the need to isolate oneself from the surrounding environment, and the technical requirements involved detract from the end goal of improving the accessibility of 3D geological models. On the other hand, more and more common handheld devices such as smartphones and tablets support AR and thus lower the barrier to entry for a large number of people. To analyze the potential of AR for the presentation and discussion of 3D geological models, a mobile app has been developed.

Started as a prototype during a geoscience hackathon, the app has since been rewritten from scratch and uploaded to the iOS App Store. During the conceptualization phase, the immense potential already became apparent. The app allows users to download a number of 3D geological models to their device and explore them in AR. They can then share a model with up to seven other peers in the same room, so that every user sees the model in the same space and in the same state. As soon as one user changes, for example, the size or rotation of the model, the new state is synchronized with every connected peer. Discussion is aided by "pointing" and "highlighting" features to ensure that everyone is talking about the same model part. The models are either stored on the device or downloaded via the internet. For now, the models are supplied by GiGa infosystem's GST Web, but additional sources are being explored.

The delivery of the app with this basic feature set invites initial user feedback and allows for a better exploration of possible applications. For example, viable use cases can be found in academia, as an easier way to communicate 3D models to students; at conferences, as a presentation platform to give peers a guided tour of a model; or in modeling, where advanced features such as digital boreholes or cross-sections can help verify intermediate results.

How to cite: Wieczoreck, B.: Collaborative Visualization of 3D Geological Models in Augmented Reality, EGU General Assembly 2021, online, 19–30 Apr 2021, EGU21-15802, https://doi.org/10.5194/egusphere-egu21-15802, 2021.

16:41–17:00
Meet the authors in their breakout text chats
