Speaker
Description
We present a comprehensive framework based on deep generative adversarial networks (GANs) for multiscale image fusion and multi-component auto-segmentation, aiming to create a precise digital rock, a key component of Digital Rock Physics for predicting the petrophysical properties of porous media. Compared with low-resolution (LR) images, which offer a large field of view (FoV), high-resolution (HR) rock images are limited in both number and FoV. We therefore first use a style-based generative adversarial network (StyleGAN) to augment the HR images. A cycle-consistent GAN (CycleGAN) is then used to fuse multiresolution images, overcoming the inherent trade-off between resolution and FoV by training on the unpaired real-world LR/HR images together with the augmented HR data. This provides an efficient way to enhance LR images over a large FoV and to reconstruct a 3D multiscale model, removing the dependence of conventional super-resolution (SR) methods on paired images. Both the HR image augmentation and the multiscale fusion operate on greyscale images. Finally, we train a pix2pix network for multi-component auto-segmentation, using manually labelled images segmented into multiple mineral phases together with their corresponding HR images. The workflow is quantitatively validated on shale images at each of the stages above through petrophysical properties, such as the porosity (the area or volume fraction of each phase), the two-point correlation function, the pore size distribution, and the apparent permeability, showing that the synthetic HR images and SR images are comparable to the ground truth. The trained pix2pix network accurately captures the spatial distribution and morphology of the different minerals. The proposed workflow provides a reliable pathway to reconstructing multiscale, multi-component digital rocks with a large FoV for subsequent estimation of pore-scale microstructural properties and fluid-flow mechanisms in underground energy development, hydrogen storage, and carbon sequestration.
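The abstract does not include implementation details. For orientation only, the following minimal PyTorch sketch shows the cycle-consistency term that lets a CycleGAN-style model train on unpaired LR/HR images; the generator names `G_lr2hr` and `G_hr2lr`, the L1 form of the loss, and the weight `lam=10.0` are assumptions borrowed from the standard CycleGAN formulation, not the authors' code.

```python
import torch
import torch.nn.functional as F

def cycle_consistency_loss(G_lr2hr, G_hr2lr, lr_batch, hr_batch, lam=10.0):
    """Cycle-consistency term of a CycleGAN-style objective.

    G_lr2hr / G_hr2lr: the two generators mapping LR->HR and HR->LR
    (hypothetical names; any nn.Module or callable works).
    lam: weight of the cycle term (10 is the value in the original
    CycleGAN paper; the abstract does not state the authors' choice).
    """
    # LR -> HR -> LR should reproduce the original LR image ...
    lr_cycle = G_hr2lr(G_lr2hr(lr_batch))
    # ... and HR -> LR -> HR should reproduce the original HR image,
    # which is what allows training on *unpaired* LR/HR data.
    hr_cycle = G_lr2hr(G_hr2lr(hr_batch))
    return lam * (F.l1_loss(lr_cycle, lr_batch) + F.l1_loss(hr_cycle, hr_batch))

# Smoke test with identity "generators" on random greyscale patches.
if __name__ == "__main__":
    lr = torch.rand(2, 1, 64, 64)
    hr = torch.rand(2, 1, 64, 64)
    identity = lambda x: x
    print(cycle_consistency_loss(identity, identity, lr, hr))  # tensor(0.)
```

In a full objective this term is added to the adversarial losses of the two discriminators; only the cycle term is sketched here because it is what distinguishes unpaired fusion from paired SR.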
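Similarly, the two-point correlation function used for validation has a standard FFT-based estimator. The NumPy sketch below is an illustrative assumption rather than the paper's implementation: it computes S2(r) for a binary pore/solid image via the Wiener-Khinchin theorem, assumes periodic boundaries (a common simplification), and reports the profile along the first axis only.

```python
import numpy as np

def two_point_correlation(binary_img):
    """S2(r): probability that two points a lag r apart are both pore.

    binary_img: array of 0/1 values, with 1 marking the pore phase.
    Computes the circular autocorrelation via FFT (Wiener-Khinchin),
    so periodic boundaries are implicitly assumed.
    Returns the profile along axis 0; S2(0) equals the porosity.
    """
    f = np.fft.fftn(binary_img)
    corr = np.fft.ifftn(f * np.conj(f)).real / binary_img.size
    # Lags along the first axis only, for simplicity; works in 2D or 3D.
    return corr[(slice(None),) + (0,) * (binary_img.ndim - 1)]

if __name__ == "__main__":
    img = (np.random.rand(128, 128) < 0.3).astype(float)  # ~30% porosity
    s2 = two_point_correlation(img)
    print(s2[0])  # close to the porosity, ~0.3
```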
| Field | Value |
| --- | --- |
| Country | China |
| Conference Proceedings | I am interested in having my paper published in the proceedings. |