Gradients Are the New Intervals
At the New England Symposium on Graphics,
James Tompkin compared graphics researchers to
magpies: they’re easily distracted by shiny objects and pretty renderings.
While this is true, the analogy also holds from a different angle: when I’m
reading graphics papers, I’m constantly looking for ideas to steal and
bring back to my nest.
Researchers at IRIT and
Adobe Research recently published a
paper
that’s full of interesting ideas, and I’d like to talk about it.
This blog post assumes a vague understanding of implicit surface rasterization,
and how interval arithmetic is used to both skip regions of space and simplify
complex expressions.
See this section of the Fidget writeup for a short
introduction, or
my colloquium talk
for a longer explanation.
Here’s the key sentence from the paper’s abstract:
We introduce an efficient hierarchical tree pruning method based on the
Lipschitz property of SDFs, which is compatible with hard and smooth CSG
operators.
In this case, “the Lipschitz property” means that the gradient of the distance
value is bounded, e.g. $||\nabla f(x, y, z)|| < 1$. You’ll also see it
referred to as
Lipschitz continuity.
Lipschitz continuity provides a bound on the maximum change in the distance
value between two points: if you have points $\vec{p}$ and $\vec{q}$ in 3D
space, then
$$|f(\vec{p}) - f(\vec{q})| \le ||\vec{p} - \vec{q}||$$
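Rearranged, this is the property that does all the work below: a single evaluation at $\vec{p}$ pins down the field’s value at every other point $\vec{q}$ to within their separation:
$$f(\vec{p}) - ||\vec{p} - \vec{q}|| \le f(\vec{q}) \le f(\vec{p}) + ||\vec{p} - \vec{q}||$$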
Given this requirement on the distance fields, the authors present two core
tricks that can be applied to CSG-tree-flavored models:
- Pruning identifies primitives which are inactive within a particular region of space, and builds a simplified shape with only active primitives (a sketch of this test follows the list)
- Far-field culling finds primitives which are far from a particular region (and therefore not relevant), and replaces them with a simpler expression.
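Here’s a minimal sketch of what pruning might look like for a hard union of 1-Lipschitz primitives. The `Node` type, the sphere-only primitive set, and the single-level test are my own simplifications for illustration, not the paper’s data structures (the paper also handles smooth CSG operators):

```rust
/// A toy CSG tree: spheres combined by hard union (min).
#[derive(Clone)]
enum Node {
    /// Sphere with center and radius; its SDF is exactly 1-Lipschitz.
    Sphere { center: [f64; 3], radius: f64 },
    /// Hard union (min) of children.
    Union(Vec<Node>),
}

impl Node {
    /// Exact single-point evaluation of the distance field.
    fn eval(&self, p: [f64; 3]) -> f64 {
        match self {
            Node::Sphere { center, radius } => {
                let d = [p[0] - center[0], p[1] - center[1], p[2] - center[2]];
                (d[0] * d[0] + d[1] * d[1] + d[2] * d[2]).sqrt() - radius
            }
            Node::Union(children) => children
                .iter()
                .map(|c| c.eval(p))
                .fold(f64::INFINITY, f64::min),
        }
    }

    /// Prune for a ball of radius `r` around `c`: by Lipschitz continuity,
    /// each child is bounded by [f(c) - r, f(c) + r] over the whole ball.
    /// A child whose lower bound exceeds the smallest upper bound can never
    /// win the min, so it's inactive in this region and gets dropped.
    fn prune(&self, c: [f64; 3], r: f64) -> Node {
        match self {
            Node::Union(children) => {
                let at_center: Vec<f64> =
                    children.iter().map(|ch| ch.eval(c)).collect();
                let best_upper = at_center
                    .iter()
                    .fold(f64::INFINITY, |acc, &v| acc.min(v + r));
                Node::Union(
                    children
                        .iter()
                        .zip(&at_center)
                        .filter(|&(_, &v)| v - r <= best_upper)
                        .map(|(ch, _)| ch.clone())
                        .collect(),
                )
            }
            other => other.clone(),
        }
    }
}
```

Per the quoted abstract, the paper applies this kind of test hierarchically, so each subregion starts from its parent’s already-simplified shape.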
These optimizations make rendering dramatically cheaper, allowing for
interactive visualizations of complex models.
The first optimization also sounds familiar – I presented something
similar back in 2020 (which
itself builds on work from 1992):
To render a model, its underlying expression is evaluated in a shallow
hierarchy of spatial regions, using a high branching factor for efficient
parallelization. Interval arithmetic is used to both skip empty regions and
construct reduced versions of the expression. The latter is the optimization
that makes our algorithm practical: in one benchmark, expression complexity
decreases by two orders of magnitude between the original and reduced
expressions.
Here’s the big difference: my implementation used
interval arithmetic
on arbitrary math expressions, while this new paper uses single-point evaluation
on Lipschitz-continuous primitives.
Single-point evaluation is cheaper, and also doesn’t suffer from the
conservative nature of interval arithmetic (where intervals tend to grow over
time). However, their shapes are limited to a set of well-behaved primitives
and transforms; something as simple as scaling a model can break Lipschitz
continuity if not handled carefully.
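To make the scaling pitfall concrete: naively composing $g(\vec{p}) = f(\vec{p}/s)$ gives $||\nabla g|| \le 1/s$, which breaks the bound whenever $s < 1$. The usual fix is to scale the output to match:
$$g(\vec{p}) = s \, f(\vec{p}/s) \quad \Rightarrow \quad ||\nabla g(\vec{p})|| = ||\nabla f(\vec{p}/s)|| \le 1$$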
After thinking about it for a few days, I ended up taking home a slightly
different conclusion than the paper presented:
If your distance field is Lipschitz-continuous, then you can use single-point
evaluation to get interval-ish results, then apply all the usual tricks which
normally require interval arithmetic.
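Here’s a sketch of that takeaway (my own code, not the paper’s): for a 1-Lipschitz field, one evaluation at the center of a square region bounds the field over the whole region, because no point of the square is farther from the center than the half-diagonal.

```rust
/// An inclusive interval [lower, upper]: this sketch's stand-in for a
/// true interval-arithmetic result.
#[derive(Copy, Clone, Debug)]
struct Interval {
    lower: f64,
    upper: f64,
}

/// Pseudo-interval for a square region centered at (x, y) with half-width
/// `half`, valid for any 1-Lipschitz field `f`: the field can't change by
/// more than the half-diagonal anywhere in the square.
fn pseudo_interval(f: impl Fn(f64, f64) -> f64, x: f64, y: f64, half: f64) -> Interval {
    let r = half * std::f64::consts::SQRT_2; // half-diagonal of the square
    let v = f(x, y); // one single-point evaluation
    Interval { lower: v - r, upper: v + r }
}
```

If `upper < 0.0`, the region is proven inside; if `lower > 0.0`, proven outside; otherwise it’s ambiguous, exactly as with real intervals.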
I’ve now gone far too long without showing a pretty picture (the magpies in the
audience are growing rowdy), so let’s dig into some examples. I’m going to
take this idea — of using single-point evaluations to get pseudo-intervals — and
see how it applies to my usual box of tools.
Note that I’m not hewing to the implementation from the paper (which you should
go read, it’s great!); I’m taking this idea and running in a slightly different
direction based on my background knowledge.
Basic rendering
Here’s a circle of radius 1, $f(x, y) = \sqrt{x^2 + y^2} - 1$:
It’s rendered with orange / blue indicating whether we’re inside or outside the
shape, and field lines showing the distance field values. I’m not sure who
developed the original coloring strategy, but it’s pervasive on Shadertoy; I
borrowed it from Inigo Quilez.
My typical strategy for rendering is to do several rounds of evaluation using
interval arithmetic, then pixel-by-pixel evaluation of the remaining ambiguous
regions. The goal is to produce a binary image: each pixel is either inside or
outside, depending on sign.
Interval evaluation takes intervals for $x$ and $y$, and returns an interval
output.
For example, we can evaluate a region in the top-left quadrant:
$$f([0.5, 1.5], [0.5, 1.5]) = [-0.3, 1.12]$$
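As a sanity check, here’s a minimal interval type with just enough operations to evaluate the circle. It’s a sketch: real implementations (e.g. Fidget’s) support many more operators and deal with directed rounding of floating-point results.

```rust
/// A minimal interval type, just enough for sqrt(x² + y²) - 1.
#[derive(Copy, Clone, Debug)]
struct Interval {
    lower: f64,
    upper: f64,
}

impl Interval {
    fn new(lower: f64, upper: f64) -> Self {
        Interval { lower, upper }
    }
    /// x², tightened to a lower bound of 0 when the input straddles zero
    fn square(self) -> Self {
        let (a, b) = (self.lower * self.lower, self.upper * self.upper);
        let lower = if self.lower <= 0.0 && self.upper >= 0.0 {
            0.0
        } else {
            a.min(b)
        };
        Interval::new(lower, a.max(b))
    }
    fn add(self, other: Self) -> Self {
        Interval::new(self.lower + other.lower, self.upper + other.upper)
    }
    fn sqrt(self) -> Self {
        Interval::new(self.lower.max(0.0).sqrt(), self.upper.sqrt())
    }
    fn sub(self, s: f64) -> Self {
        Interval::new(self.lower - s, self.upper - s)
    }
}

fn main() {
    // f(x, y) = sqrt(x² + y²) - 1 over the region [0.5, 1.5] × [0.5, 1.5]
    let x = Interval::new(0.5, 1.5);
    let y = Interval::new(0.5, 1.5);
    let d = x.square().add(y.square()).sqrt().sub(1.0);
    println!("{d:?}"); // ≈ [-0.29, 1.12], matching the result above
}
```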
Because the interval result is ambiguous (contains 0), we can’t prove this
region inside or outside the shape, so we subdivide and recurse. If it was
unambiguously less than or greater than zero, we could stop processing right
away!
The full algorithm produces a set of colored regions (proven inside / outside),
and pixel-level results for ambiguous regions below a certain size:
Intervals are shown as large uniformly-shaded squares; pixel-by-pixel evaluation
is concentrated at the edge of the circle.
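Putting the pieces together, the recursion might look like the following sketch, reusing the `Interval` type from above. The quadtree subdivision and `MIN_SIZE` cutoff are my simplifications: the quoted abstract describes a shallow hierarchy with a high branching factor instead.

```rust
/// Result of classifying one square region (an illustrative type,
/// not Fidget's actual API).
enum Region {
    Inside,                 // proven: interval upper < 0
    Outside,                // proven: interval lower > 0
    Pixel(bool),            // leaf-sized and still ambiguous: point sample
    Subdivide(Vec<Region>), // ambiguous: recurse into quadrants
}

const MIN_SIZE: f64 = 1.0 / 64.0; // assumed pixel-scale cutoff

fn render(
    interval_eval: &impl Fn(Interval, Interval) -> Interval,
    point_eval: &impl Fn(f64, f64) -> f64,
    x: Interval,
    y: Interval,
) -> Region {
    let i = interval_eval(x, y);
    if i.upper < 0.0 {
        Region::Inside // unambiguously negative: stop right away
    } else if i.lower > 0.0 {
        Region::Outside // unambiguously positive: stop right away
    } else if x.upper - x.lower <= MIN_SIZE {
        // ambiguous but pixel-sized: settle it with one point sample
        let (cx, cy) = ((x.lower + x.upper) / 2.0, (y.lower + y.upper) / 2.0);
        Region::Pixel(point_eval(cx, cy) < 0.0)
    } else {
        // still ambiguous: subdivide into quadrants and recurse
        let (xm, ym) = ((x.lower + x.upper) / 2.0, (y.lower + y.upper) / 2.0);
        let quads = [
            (Interval::new(x.lower, xm), Interval::new(y.lower, ym)),
            (Interval::new(xm, x.upper), Interval::new(y.lower, ym)),
            (Interval::new(x.lower, xm), Interval::new(ym, y.upper)),
            (Interval::new(xm, x.upper), Interval::new(ym, y.upper)),
        ];
        Region::Subdivide(
            quads
                .iter()
                .map(|&(qx, qy)| render(interval_eval, point_eval, qx, qy))
                .collect(),
        )
    }
}
```

Note that `interval_eval` could just as well be the Lipschitz-based `pseudo_interval` from earlier, which is the whole point of this post.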
Using gradients instead
Our circle model has a gradient magnitude of 1 everywhere (proof is left as an
exercise to the reader).
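For reference, the exercise works out as follows (away from the origin):
$$\nabla f = \left(\frac{x}{\sqrt{x^2 + y^2}}, \frac{y}{\sqrt{x^2 + y^2}}\right), \qquad ||\nabla f|| = \sqrt{\frac{x^2 + y^2}{x^2 + y^2}} = 1$$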
If we want to evaluate an interval region
$[x_\text{min}, x_\text{max}] \times [y_\text{min}, y_\text{max}]$, we can
instead evaluate the field once at the region’s center and use the Lipschitz
bound to build a pseudo-interval.
6 Comments
kragen
This is very exciting! It seems like a lot of the interval stuff is bringing to fruition my idle speculations from 6 years ago in https://dercuano.github.io/notes/interval-raymarching.html. I'll have to read this carefully to see if it's actually the same approach or a different one that uses the same algebras.
fph
One of the benefits of intervals is that you can ensure your results are correct irrespective of floating-point errors, if you carefully round all computations in the correct direction.
I don't think you can ensure that with gradients though: if f and f' are computed in machine arithmetic, cancellation errors might pile up.
3abiton
It started interestingly, but then
> This blog post assumes a vague understanding of implicit surface rasterization, and how interval arithmetic is used to both skip regions of space and simplify complex expressions.
Can anyone give me a quick rundown of the article?
constantcrying
>In this case, "the Lipschitz property" means that the gradient of the distance value is bounded
This is total nonsense. The point of Lipschitz continuity is that it is more than continuity and less than differentiability. If you assert that it is differentiable, the concept loses all meaning. It is specifically interesting because you do not have to assert differentiability.
yorwba
The suggested normalization procedure, even with the ad-hoc fix for gradient discontinuities, doesn't actually ensure that the resulting function is 1-Lipschitz unless the gradient of the gradient magnitude vanishes. The signed-distance functions considered in the article seem to have piecewise constant gradient magnitudes (so are L-Lipschitz, just with L > 1) except for inside the "r", but for less well-behaved functions, higher order derivatives might start to matter.
diabllicseagull
I worked on a patent some years ago about SDF CSG tree pruning and constant-radius filleted blends. Sadly, patents don't get the same visibility journals enjoy.
https://patentimages.storage.googleapis.com/7a/73/2d/8d2eeca…