Why "Fake" Relighting?

A Simple Explanation of Real-Time Dynamic Lighting for Gaussian Splatting

Daniel Skaale

The Problem

Gaussian Splatting represents 3D scenes as millions of colored "splats" (like oriented paint blobs). Each splat has:

  • A 3D position
  • A size and rotation
  • A color
  • An opacity

The colors are "baked in" from the original camera capture—they can't respond to new lights you add to the scene.
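
As a rough sketch (the field names here are my own, not taken from any particular implementation), a single splat can be thought of as a small record like this:

    from dataclasses import dataclass
    import numpy as np

    @dataclass
    class Splat:
        position: np.ndarray  # 3D center of the splat
        rotation: np.ndarray  # orientation, e.g. a quaternion
        scale: np.ndarray     # size along each axis
        color: np.ndarray     # RGB captured at scan time ("baked in")
        opacity: float        # how solid or transparent the splat appears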

Why It's Hard

Traditional 3D lighting needs surface normals (arrows pointing perpendicular to surfaces) to calculate how light hits objects. For example, a wall whose normal points toward a lamp is lit brightly, while a wall whose normal points away from the lamp stays dark.

But Gaussian Splats are billboards—flat quads that always rotate to face the camera. They don't have real surface geometry or normals.

The "Fake" Relighting Trick

Instead of computing real surface normals (which don't exist), I use a clever approximation:

View Direction = Surface Normal

Effective Normal = normalize(CameraPosition - SplatPosition)

The key insight: Since splats always face the camera AND represent surfaces that were visible during capture, the direction from the splat to the camera is a reasonable guess for the surface orientation.
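
In code, the fake normal is just the normalized vector from the splat to the camera. A minimal sketch in Python/NumPy (rather than actual shader code; the function name is illustrative):

    import numpy as np

    def effective_normal(camera_pos, splat_pos):
        # Approximate the surface normal as the direction from the
        # splat toward the camera -- the "fake" normal used for relighting.
        v = np.asarray(camera_pos, dtype=float) - np.asarray(splat_pos, dtype=float)
        return v / np.linalg.norm(v)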

How It Works (3 Steps)

1. Compute fake normal: Use view direction as if it were a surface normal

2. Calculate lighting: Use standard lighting math (Lambert shading)

Brightness = max(0, normal · lightDirection)

3. Apply attenuation: Lights fade with distance using a quadratic falloff that reaches zero at the light's range (see the sketch after these steps)

Attenuation = 1 - (distance/range)²
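
Putting the three steps together, here is a minimal sketch in Python/NumPy (not the actual shader implementation; the names and light parameters are illustrative) of how one light's contribution could be added to a splat's baked-in color:

    import numpy as np

    def shade_splat(splat_pos, splat_color, camera_pos,
                    light_pos, light_color, light_range):
        # Step 1: fake normal = direction from the splat toward the camera
        normal = camera_pos - splat_pos
        normal = normal / np.linalg.norm(normal)

        # Step 2: Lambert term against the direction toward the light
        to_light = light_pos - splat_pos
        distance = np.linalg.norm(to_light)
        light_dir = to_light / distance
        brightness = max(0.0, float(np.dot(normal, light_dir)))

        # Step 3: quadratic attenuation that reaches zero at the light's range
        attenuation = max(0.0, 1.0 - (distance / light_range) ** 2)

        # Add this light's contribution on top of the baked-in color
        return splat_color + brightness * attenuation * light_color

    # Example with made-up values
    lit = shade_splat(
        splat_pos=np.array([0.0, 0.0, 0.0]),
        splat_color=np.array([0.5, 0.4, 0.3]),
        camera_pos=np.array([0.0, 1.0, 3.0]),
        light_pos=np.array([1.0, 2.0, 1.0]),
        light_color=np.array([1.0, 0.9, 0.8]),
        light_range=5.0,
    )
    print(lit)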

Why "Fake"?

The Trade-off

What you get:

  • ✓ Dynamic lights that move in real-time
  • ✓ Point lights, spot lights, directional lights
  • ✓ Shadows and ambient occlusion integration
  • ✓ Fast enough for VR (<1ms per 10 lights)

What you don't get:

  • ✗ No specular highlights (shiny reflections)
  • ✗ No accurate surface microdetail
  • ✗ Not suitable for materials that require precise normals

In One Sentence

"I fake lighting on Gaussian Splats by pretending the view direction is the surface normal, which gives plausible real-time dynamic lighting without needing actual geometry."