
Extreme Views: 3DGS Filter for Novel View Synthesis from Out-of-Distribution Camera Poses

Bowness, Damian (2025) Extreme Views: 3DGS Filter for Novel View Synthesis from Out-of-Distribution Camera Poses. Masters thesis, Concordia University.

Bowness_MA_F2025.pdf - Accepted Version (application/pdf, 19MB)
Available under License Spectrum Terms of Access.

Abstract

3D reconstruction is a foundational component in robotics and autonomous systems, enabling machines to perceive and interpret their environment for tasks such as navigation, obstacle avoidance, and motion planning. As these systems increasingly operate in real-time and dynamic environments, the need for efficient and robust scene understanding becomes paramount. Among recent innovations, 3D Gaussian Splatting (3DGS) has gained attention for its ability to reconstruct photorealistic 3D scenes with high efficiency and support for real-time novel view synthesis. Unlike traditional mesh- or voxel-based representations, 3DGS models scenes using a compact set of anisotropic Gaussians, each carrying spatial and appearance information, which are then splatted onto the image plane during rendering.
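For readers less familiar with the representation, the following is a minimal, illustrative Python sketch of a 3DGS-style primitive and front-to-back alpha compositing at a single pixel. The class fields, the omission of view-dependent spherical-harmonic color, and the scalar per-Gaussian weights are simplifying assumptions for exposition, not the tile-based CUDA rasterization pipeline used in practice.

import numpy as np

class Gaussian:
    def __init__(self, mean, cov, opacity, color):
        self.mean = np.asarray(mean, dtype=float)    # 3D centre of the primitive
        self.cov = np.asarray(cov, dtype=float)      # 3x3 anisotropic covariance
        self.opacity = float(opacity)                # base opacity in [0, 1]
        self.color = np.asarray(color, dtype=float)  # RGB (view-dependent terms omitted)

def composite_pixel(gaussians, weights):
    # Front-to-back alpha compositing of per-Gaussian contributions at one pixel.
    # `weights` are the projected 2D Gaussian densities evaluated at the pixel,
    # assumed to be sorted front-to-back together with `gaussians`.
    color = np.zeros(3)
    transmittance = 1.0
    for g, w in zip(gaussians, weights):
        alpha = g.opacity * w
        color += transmittance * alpha * g.color
        transmittance *= 1.0 - alpha
    return color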

However, a persistent challenge in such learned representations is uncertainty due to insufficient, ambiguous, or occluded data in the input views, i.e., epistemic uncertainty. This can result in visual artifacts, especially when rendering novel viewpoints that diverge significantly from the training set. Existing methods such as BayesRays, designed for NeRF-based models, address this issue via probabilistic ray-based sampling and post-hoc filtering, but require retraining.

In this work, we propose a gradient sensitivity-based filtering framework for 3DGS that mitigates epistemic artifacts in real-time without the need for model retraining. Specifically, we introduce a novel sensitivity score that quantifies the directional gradient of pixel color with respect to spatial perturbations at the point of ray-Gaussian intersection. This score captures the local instability in the rendering process caused by insufficient coverage or ambiguity in training views. By computing this score directly within the existing rendering pipeline, we enable on-the-fly filtering of Gaussians whose contributions are deemed unstable or unreliable.
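As a rough illustration only, the sketch below approximates such a sensitivity score with central finite differences and uses it to drop unstable Gaussians. The names (`pixel_color_fn`, `filter_gaussians`), the perturbation size, and the hard threshold are hypothetical choices for this sketch; the thesis computes the directional gradient analytically inside the existing rendering pipeline rather than by finite differences.

import numpy as np

def sensitivity_score(intersection, pixel_color_fn, eps=1e-3):
    # Central finite difference of the rendered pixel color with respect to a
    # spatial perturbation of the ray-Gaussian intersection point.
    grads = []
    for axis in range(3):
        delta = np.zeros(3)
        delta[axis] = eps
        c_plus = np.asarray(pixel_color_fn(intersection + delta))
        c_minus = np.asarray(pixel_color_fn(intersection - delta))
        grads.append((c_plus - c_minus) / (2.0 * eps))
    # Norm of the stacked color Jacobian as a scalar instability score.
    return np.linalg.norm(np.stack(grads))

def filter_gaussians(gaussians, intersections, pixel_color_fn, threshold):
    # Keep only Gaussians whose contribution is stable under small spatial
    # perturbations; unstable ones are dropped before compositing.
    return [g for g, x in zip(gaussians, intersections)
            if sensitivity_score(x, pixel_color_fn) < threshold]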

Our approach can be applied to any pre-trained 3DGS model, making it highly practical for deployment in real-time systems. We evaluate our method on challenging indoor and outdoor scenes, including those from the Deep Blending and NeRF-On-the-Go datasets, and show that it effectively suppresses rendering artifacts. Notably, our filtering substantially improves visual quality, realism, and consistency compared to BayesRays, while avoiding the overhead of additional training or scene-specific tuning. This makes our method particularly suited for robotics and AR/VR applications where rapid adaptation and robustness to viewpoint changes are critical.

Divisions: Concordia University > Gina Cody School of Engineering and Computer Science > Computer Science and Software Engineering
Item Type: Thesis (Masters)
Authors: Bowness, Damian
Institution: Concordia University
Degree Name: M. Comp. Sc.
Program: Computer Science
Date: 14 May 2025
Thesis Supervisor(s): Poullis, Charalambos
ID Code: 995649
Deposited By: Damian Bowness
Deposited On: 04 Nov 2025 15:34
Last Modified: 04 Nov 2025 15:34
All items in Spectrum are protected by copyright, with all rights reserved. The use of items is governed by Spectrum's terms of access.
