filename : Zhan24a.pdf
entry : inproceedings
conference : Eurographics
pages : 1-18
year : 2024
month : April
title : Neural Denoising for Deep-Z Monte Carlo Renderings
subtitle :
author : Xianyao Zhang, Gerhard Roethlin, Shilin Zhu, Tunc Ozan Aydin, Farnood Salehi, Markus Gross, Marios Papas
booktitle : Computer Graphics Forum
ISSN/ISBN :
editor : Amit Bermano and Evangelos Kalogerakis
publisher : The Eurographics Association and John Wiley & Sons Ltd.
publ.place : Computer Graphics Forum
volume : 43
issue : 2
language : English
keywords : Denoising, Ray tracing, Deep-Z
abstract : We present a kernel-predicting neural denoising method for path-traced deep-Z images that facilitates their usage in animation and visual effects production. Deep-Z images provide enhanced flexibility during compositing as they contain color, opacity, and other rendered data at multiple depth-resolved bins within each pixel. However, they are subject to noise, and rendering until convergence is prohibitively expensive. The current state of the art in deep-Z denoising yields objectionable artifacts, and current neural denoising methods are incapable of handling the variable number of depth bins in deep-Z images. Our method extends kernel-predicting convolutional neural networks to address the challenges stemming from denoising deep-Z images. We propose a hybrid reconstruction architecture that combines the depth-resolved reconstruction at each bin with the flattened reconstruction at the pixel level. Moreover, we propose depth-aware neighbor indexing of the depth-resolved inputs to the convolution and denoising kernel application operators, which reduces artifacts caused by depth misalignment present in deep-Z images. We evaluate our method on a production-quality deep-Z dataset, demonstrating significant improvements in denoising quality and performance compared to the current state-of-the-art deep-Z denoiser. By addressing the significant challenge of the cost associated with rendering path-traced deep-Z images, we believe that our approach will pave the way for broader adoption of deep-Z workflows in future productions.
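
note : The abstract describes applying predicted denoising kernels with depth-aware neighbor indexing over a deep-Z image whose pixels hold a variable number of depth bins. The Python sketch below is an illustrative reading of that general idea, not the authors' implementation; the function name depth_aware_kernel_apply, the bins layout, and the kernel array are hypothetical stand-ins (the kernel would normally be the output of the kernel-predicting network). Its key point is that a neighboring pixel contributes through its depth-closest bin rather than through the bin with the same index, which mirrors the depth-aware neighbor indexing mentioned above.

    # Illustrative sketch only: depth-aware application of a per-pixel denoising
    # kernel to a deep-Z image with a variable number of bins per pixel.
    import numpy as np

    def depth_aware_kernel_apply(bins, kernel, k=1):
        """Denoise every bin using a (2k+1)x(2k+1) pixel neighborhood.

        bins   : H x W nested lists; bins[y][x] is a list of (depth, rgb) tuples,
                 with a variable bin count per pixel (deep-Z structure).
        kernel : H x W x (2k+1) x (2k+1) array of per-pixel kernel weights
                 (stand-in for weights a kernel-predicting network would output).
        """
        H, W = len(bins), len(bins[0])
        out = [[[np.zeros(3) for _ in bins[y][x]] for x in range(W)] for y in range(H)]
        for y in range(H):
            for x in range(W):
                for b, (depth, _) in enumerate(bins[y][x]):
                    acc, wsum = np.zeros(3), 0.0
                    for dy in range(-k, k + 1):
                        for dx in range(-k, k + 1):
                            ny, nx = y + dy, x + dx
                            if not (0 <= ny < H and 0 <= nx < W) or not bins[ny][nx]:
                                continue
                            # Depth-aware indexing: take the neighbor bin whose
                            # depth is closest to the center bin's depth.
                            j = min(range(len(bins[ny][nx])),
                                    key=lambda i: abs(bins[ny][nx][i][0] - depth))
                            w = kernel[y, x, dy + k, dx + k]
                            acc += w * bins[ny][nx][j][1]
                            wsum += w
                    out[y][x][b] = acc / max(wsum, 1e-8)
        return out

    # Minimal usage example with random data (hypothetical shapes):
    # H, W = 4, 4
    # rng = np.random.default_rng(0)
    # bins = [[[(rng.random(), rng.random(3)) for _ in range(rng.integers(1, 4))]
    #          for _ in range(W)] for _ in range(H)]
    # kernel = np.full((H, W, 3, 3), 1.0 / 9.0)
    # denoised = depth_aware_kernel_apply(bins, kernel, k=1)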