Conditional-Flow NeRF:
Accurate 3D Modelling with Reliable Uncertainty Quantification
ECCV 2022

Abstract


We propose Conditional-Flow NeRF (CF-NeRF), a novel probabilistic framework that incorporates uncertainty quantification into NeRF-based approaches. We learn an adaptive distribution over all possible radiance fields, which makes the model able to quantify both the 2D (RGB) and 3D (depth) uncertainty associated with the modelled scene.

Visualization of the Learned Scene Representation

Methodology


Illustration of our pipeline for inference and for computing the log-likelihood of the pixel color. (a) We sample a set of latent variables z from the global latent distribution. (b) For each spatial location x along a camera ray with viewing direction d, we generate a set of density and radiance values by passing each z through our proposed CNF. (c) These values form a set of distinct density-radiance trajectories along the ray, one per z; volume rendering then composites each trajectory into an RGB value. Finally, these RGB values are used to compute the log-likelihood of the pixel color and, at inference, their mean and variance give the model prediction and its associated uncertainty.
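The steps above can be sketched in code. This is a hedged, minimal illustration, not the authors' implementation: `flow_decoder` is a hypothetical stand-in for the conditional normalizing flow (CNF), and the shapes and network are toy assumptions. It shows the structure of the pipeline: sample latents z, decode each z into a density-radiance trajectory along the ray, volume-render each trajectory into an RGB value, then take the mean and variance over z as the prediction and its uncertainty.

```python
import numpy as np

rng = np.random.default_rng(0)

def flow_decoder(x, d, z):
    """Hypothetical stand-in for the CNF: maps ray samples x, view
    direction d, and latent z to (density, rgb) along the ray."""
    feat = np.tanh(x.sum(axis=-1) + z[0])
    density = np.exp(feat + 0.1 * z[1])                # sigma >= 0
    rgb = 0.5 * (1.0 + np.tanh(x + d + z[:3]))         # values in (0, 1)
    return density, rgb

def render_ray(deltas, density, rgb):
    """Standard volume rendering (alpha compositing) along one ray."""
    alpha = 1.0 - np.exp(-density * deltas)
    trans = np.cumprod(np.concatenate([[1.0], 1.0 - alpha[:-1]]))
    weights = trans * alpha
    return (weights[:, None] * rgb).sum(axis=0)

N, K = 64, 32                                          # ray samples, latent draws
x = np.linspace(0.0, 1.0, N)[:, None] * np.ones(3)     # 3D points along one ray
d = np.array([0.0, 0.0, 1.0])                          # viewing direction
deltas = np.full(N, 1.0 / N)                           # distances between samples

# (a) sample z; (b) decode each z into a trajectory; (c) composite each one.
colors = []
for _ in range(K):
    z = rng.standard_normal(8)
    density, rgb = flow_decoder(x, d, z)
    colors.append(render_ray(deltas, density, rgb))
colors = np.stack(colors)                              # shape (K, 3)

pred = colors.mean(axis=0)                             # model prediction
uncert = colors.var(axis=0)                            # per-channel uncertainty
```

With enough latent draws K, `pred` and `uncert` are Monte Carlo estimates of the predictive mean and variance of the pixel color under the learned distribution over radiance fields.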

BibTeX

@inproceedings{CF-NeRF,
  title     = {Conditional-Flow NeRF: Accurate 3D Modelling with Reliable Uncertainty Quantification},
  author    = {Jianxiong Shen and Antonio Agudo and Francesc Moreno-Noguer and Adria Ruiz},
  booktitle = {ECCV},
  year      = {2022}
}

Acknowledgements

The website template was borrowed from Michaël Gharbi.