Patterns and variability of carbon flux and the carbon budget vary across spatial and temporal scales. Because distribution patterns depend on the spatiotemporal scale of observation, conventional representation methods struggle to express these spatiotemporal dynamics.
Du et al. (2015) proposed a new visualization method to show the spatiotemporal dynamics of carbon sinks and carbon sources, together with the associated carbon fluxes, in the marine carbon cycle. Based on this method, the air-sea carbon budget and its accumulation processes can be demonstrated in the spatial dimension, whereas the distribution patterns and variations of air-sea carbon flux are characterized by color changes. Spatial and temporal characteristics of satellite data are thus harmonized through visualization. A GPU-based direct volume rendering technique using half-angle slicing is applied to visualize the released or absorbed CO2 gas dynamically. In addition, Du et al. (2015) designed a data model to generate four-dimensional (4D) data from satellite-derived air-sea CO2 flux products and proposed an out-of-core scheduling strategy for on-the-fly rendering of time-series satellite data.
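The two ideas in this paragraph can be illustrated with a minimal NumPy sketch: a 2D flux grid is extruded into a 3D volume (a simplified stand-in for the paper's 4D data model, with time supplying the fourth dimension), and a sliding-window cache keeps only a few time steps resident, in the spirit of out-of-core scheduling. All names and the fixed-size eviction policy are illustrative assumptions, not the authors' actual implementation.

```python
import numpy as np

def flux_to_volume(flux_2d, n_layers=16, scale=1.0):
    """Extrude a 2D air-sea CO2 flux grid into a 3D volume whose vertical
    extent encodes the magnitude of the local carbon budget (simplified
    stand-in for one time step of the 4D data model)."""
    h, w = flux_2d.shape
    volume = np.zeros((n_layers, h, w), dtype=np.float32)
    # Column height is proportional to |flux|; the sign (source vs. sink)
    # would be mapped to color in the actual renderer.
    heights = np.clip(np.abs(flux_2d) * scale, 0, n_layers).astype(int)
    for z in range(n_layers):
        volume[z] = np.where(heights > z, flux_2d, 0.0)
    return volume

class SlidingWindowLoader:
    """Toy out-of-core scheduler: at most `window` time steps are kept in
    memory; the oldest frame is evicted as rendering advances in time."""
    def __init__(self, frames, window=3):
        self.frames = list(frames)   # 2D flux grids (or paths to them)
        self.window = window
        self.cache = {}              # time index -> 3D volume
    def get(self, t):
        if t not in self.cache:
            if len(self.cache) >= self.window:
                self.cache.pop(min(self.cache))   # evict oldest time step
            self.cache[t] = flux_to_volume(self.frames[t])
        return self.cache[t]
```

A renderer stepping through a long time series would call `loader.get(t)` each frame, so memory use stays bounded by the window size rather than by the length of the series.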
To achieve realistic simulation of the spatiotemporal dynamics of ocean carbon, Wu et al. (2017) developed a cloud-driven digital Earth platform to support the interactive analysis and display of multi-source geospatial data, and proposed a primary visualization method based on this platform to demonstrate the spatiotemporal variations of carbon sinks and sources using time-series satellite data. Specifically, a volume rendering technique using half-angle slicing and a particle system is implemented to display the released or absorbed CO2 gas dynamically. To enable location-aware visualization within the virtual globe, Wu et al. (2017) proposed a 3D particle-mapping algorithm to render particle-slicing textures onto geographic space. In addition, a GPU-based interpolation framework using CUDA was designed for real-time rendering, achieving smoothing effects in both the spatial and temporal dimensions.
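Two of these ingredients can be sketched concisely: temporal interpolation between successive satellite frames (which the CUDA framework would execute per texel on the GPU; here a NumPy stand-in), and mapping a particle's geographic position onto the virtual globe's Cartesian frame, the core step any particle-mapping scheme needs. Both function names and the spherical-globe assumption are illustrative, not taken from the paper.

```python
import numpy as np

def lerp_frames(frame_a, frame_b, alpha):
    """Linear temporal interpolation between two satellite data frames
    (alpha in [0, 1]); a CPU stand-in for a per-texel CUDA kernel."""
    return (1.0 - alpha) * frame_a + alpha * frame_b

def lonlat_to_globe(lon_deg, lat_deg, radius=1.0):
    """Map a geographic position (degrees) onto a spherical virtual globe's
    Cartesian coordinates -- the basic step of mapping particles into 3D."""
    lon, lat = np.radians(lon_deg), np.radians(lat_deg)
    x = radius * np.cos(lat) * np.cos(lon)
    y = radius * np.cos(lat) * np.sin(lon)
    z = radius * np.sin(lat)
    return np.stack([x, y, z], axis=-1)
```

Rendering intermediate frames at several `alpha` values between each pair of observed time steps is what produces the smooth temporal transitions, since satellite products are typically sampled far more coarsely than the display frame rate.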
Zhang et al. (2019) constructed a 3D spatiotemporal visualization model based on particle systems to express spatiotemporal variations in marine environmental data. In the temporal dimension, dynamic real-time change is visualized by updating particle attributes, such as color, displacement, and size, computed from the differences between corresponding data points in adjacent frames. A CUDA parallel computing model is then introduced to support large-scale particle variation and high-quality dynamic display; this reduces CPU load and significantly improves the frame rate. Moreover, the model supports virtual simulation and interactive rendering, allowing users to dynamically change the environmental data type, rendering mode, drawing parameters, and other settings.
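One plausible reading of the attribute-differencing scheme is the following sketch: given particle attributes sampled at two adjacent data frames, the frame-to-frame difference is distributed evenly over the render substeps in between, so attributes change smoothly rather than jumping at each data frame. The function name, the even distribution, and the use of a single attribute array are all illustrative assumptions.

```python
import numpy as np

def interpolate_attributes(attrs_prev, attrs_next, n_substeps):
    """Yield per-substep particle attributes (e.g. color, displacement,
    size packed into one array) by spreading the difference between two
    adjacent data frames evenly across n_substeps render frames."""
    delta = (attrs_next - attrs_prev) / n_substeps
    for step in range(1, n_substeps + 1):
        yield attrs_prev + step * delta
```

In a CUDA implementation the same per-particle update would run as one thread per particle, which is why offloading it from the CPU raises the frame rate for large particle counts.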
Du, Z., Fang, L., Bai, Y., Zhang, F., & Liu, R. (2015). Spatio-temporal visualization of air–sea CO2 flux and carbon budget using volume rendering. Computers & Geosciences, 77, 77-86.
Wu, S., Yan, Y., Du, Z., Zhang, F., & Liu, R. (2017). Spatiotemporal visualization of time-series satellite-derived CO2 flux data using volume rendering and GPU-based interpolation on a cloud-driven digital Earth. ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences, 4, 77.
Zhang, F., Mao, R., Du, Z., & Liu, R. (2019). Spatial and temporal processes visualization for marine environmental data using particle system. Computers & Geosciences, 127, 53-64.