Wireframe – Issue 20, 2019


Live coding and ray marching

Toolbox


Here’s an example of a ray-marched blob sketch. I was messing around with refraction, giving life to squishy diamonds.

Here’s an example of a live-coded shader made on the spur of the moment.

Figure 2: Sphere tracing is how the ray travels through the scene using SDF functions. In the first two iterations, the floor is the closest object, but the ray doesn’t hit the floor – it hits the sphere on the third iteration.

  • Casting a ray from every pixel on your screen into that 3D scene
  • Colouring that pixel the colour of whatever the ray hits (or doesn’t hit)


Now, how we do this is where it gets interesting. We’re going to be using a specific version of ray marching called sphere tracing. We describe the 3D scene using SDF functions.



  • SDF stands for signed distance function,
    and for these purposes, they’re just distance
    functions. This is cool, because if you describe
    an object not by defining its position in space
    but by its distance to any given point, you can
    use mathematical equations to describe shapes.


For example, this is a sphere:

float s = length(position)-radius;

This is a wavy floor:

float f = position.y + sin(position.x) + cos(position.z);
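Because every SDF returns a distance, combining objects into one scene is just a matter of taking the minimum: whichever surface is closest. Here’s a sketch (the name map is a common ray-marching convention, not from the article, and the object placement is mine):

```glsl
// Distance from 'position' to the closest surface in the whole scene.
// 'map' is a common ray-marching name for this function, assumed here.
float map(vec3 position) {
    float s = length(position - vec3(0.0, 1.0, 0.0)) - 1.0;   // unit sphere at height 1
    float f = position.y + sin(position.x) + cos(position.z); // the wavy floor
    return min(s, f); // the closer of the two
}
```

Adding another object to the scene is just another min() against its SDF.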

So at any point along the ray, the SDF tells us the maximum distance the ray can travel without hitting any object; and since SDFs are directionless, that safe distance sweeps out a sphere, as you can see in Figure 2. This is why it’s called sphere tracing. Let’s roll our sleeves up and think about how this is going to be implemented. First, some things to know about the limitations and the features of the shader.
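Before those details, the march itself can be sketched as a short loop: step forward by the scene distance until it’s tiny (a hit) or the ray has travelled too far (a miss). This is a sketch under assumptions: map stands for any scene SDF, and the iteration and distance constants are illustrative, not from the article:

```glsl
// Sphere tracing, as in Figure 2: at each step, the scene SDF tells us
// the radius of a sphere we can safely travel through without hitting
// anything. 'map' is any scene SDF; constants are illustrative.
float march(vec3 ro, vec3 rd) {
    float t = 0.0;                  // distance travelled along the ray
    for (int i = 0; i < 64; i++) {
        float d = map(ro + rd * t); // distance to the closest surface from here
        if (d < 0.001) return t;    // close enough: call it a hit
        t += d;                     // safe to step exactly this far
        if (t > 100.0) break;       // gone too far: give up
    }
    return -1.0;                    // miss
}
```

Returning the travelled distance t makes it easy to recover the hit point afterwards as ro + rd * t.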
A fragment shader is executed simultaneously per pixel, which in this case means the whole screen. This is useful because it’s how this type of program exploits the GPU’s parallelism; what’s less helpful is that you can’t easily look at a pixel’s neighbours, or keep mutable variables shared across all pixels within a frame. The input we assume we have is the UV passed in from the vertex shader. A UV is simply two floating-point numbers that describe a pixel’s position on the screen, so the bottom left-hand corner is (0,0), the top right is (1,1), and the middle is (0.5,0.5). With UV and screen resolution as your inputs, your job as the programmer is to output the colour of the pixel at that UV position.

FUN FACT
UV doesn’t stand for anything: it’s named such simply because XY was already taken for the position of an object in a 3D scene.
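The smallest possible shader honouring that contract just maps UV straight to a colour. A sketch (the varying name uv is an assumption; how it arrives from the vertex shader depends on your setup):

```glsl
// Minimal fragment shader: read the pixel's UV, output a colour.
varying vec2 uv;  // (0,0) bottom-left to (1,1) top-right, from the vertex shader

void main() {
    // A simple gradient: red rises with x, green rises with y,
    // so the bottom-left corner is black and the top-right is yellow.
    gl_FragColor = vec4(uv.x, uv.y, 0.0, 1.0);
}
```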
Let’s take a look at this idea implemented in GLSL. All of the following code was written in an OpenGL legacy context, and was tested in an OpenGL ES 2.0 context as well.

#ifdef GL_ES
precision highp float;
#endif

// These are defined by KodeLife, or whatever
// environment you are using.
uniform float time;