I am generally interested in real-time 3D techniques and how they can be implemented in interactive environments like TouchDesigner. One technique I was happy to wrangle was trails, and I thought I’d talk a bit about it. The explanation assumes some intermediate knowledge of TouchDesigner and OpenGL.

The construction of trails includes a feedback element for storing curve information, followed by curve extrusion. Curve extrusion is a technique in computer graphics which converts a curve into a 3D object. It’s used to accurately model things like tubes, ribbons, or wire. It can be used in conjunction with particle systems as a way to render the path that a particle follows over time; it can also be used independently of particles, constructed from parametric equations or other means of defining curves in space.
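As a minimal sketch of the extrusion idea (plain Python on the CPU, with a fixed ring orientation instead of a proper moving frame; all names here are my own, not from the actual implementation):

```python
import math

def extrude_curve(points, radius=0.1, sides=6):
    """Sweep a polygonal cross section along a 3D polyline.

    Returns one ring of `sides` vertices per curve point. A real
    implementation would build a proper moving frame (e.g. parallel
    transport); here the rings always lie in the xy plane for brevity.
    """
    rings = []
    for px, py, pz in points:
        ring = []
        for i in range(sides):
            a = 2.0 * math.pi * i / sides
            ring.append((px + radius * math.cos(a),
                         py + radius * math.sin(a),
                         pz))
        rings.append(ring)
    return rings

# A straight curve along z extruded into a hexagonal tube:
curve = [(0.0, 0.0, float(z)) for z in range(4)]
tube = extrude_curve(curve, radius=0.5, sides=6)
```

Connecting each ring to the next with triangles then gives the tube surface; in this post that per-vertex work happens in the material shader instead.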

Big thanks to Atagen for sharing his implementation, which I decided to base mine on. I’d recommend watching it if you want to see how the system works in its entirety. What’s great about this technique is that it leverages instancing, which allows for parallelized computation.

I decided to take a similar approach, using feedback to store a buffer of texture values, then sampling this buffer in the material shader where the curve framing is handled. I wanted this technique to play well with particle simulations in particular, where I could throw the resulting texture of a particle system into it, and have it pump out the extruded trails of the particles’ paths in a variety of shapes and forms.

There were several features I added to the base technique that helped me with this.

  • Flat Shading
  • 3D Texture Buffer
  • Alpha Clipping

Flat Shading

Currently the computation of the surface normals in the vertex shader is set up so that they always point directly away from the center of the tube, which gives the tube a smoothed appearance. This is fine when the tube has a roughly circular cross section, but when the cross section has, say, three edges, we begin to notice the shading interpolation between vertices, and the visual result feels off. To get better results with tubes that have more pronounced edges, we can use flat shading.

Flat shading takes the surface of each primitive and assigns the vector perpendicular to this surface as the normal for all the vertices in that primitive. This is a common scenario when using SOPs, in which context we would reach for the Facet SOP. But with this technique we are handling our vertex information in the material shader, including the calculation of our surface normals, so I figured flat shading should happen somewhere in there.

Using the geometry shader worked well, since vertex information is available per primitive at this stage. The three vertex positions of a triangular primitive let us calculate a vector perpendicular to the plane they lie in; this becomes our new normal, now updated for flat shading.

vec3 faceNormal = -normalize(cross(gl_in[1].gl_Position.xyz - gl_in[0].gl_Position.xyz,
                                   gl_in[2].gl_Position.xyz - gl_in[0].gl_Position.xyz));
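For reference, the same cross-product computation in plain Python (the negation mirrors the winding convention in the shader snippet above):

```python
def face_normal(p0, p1, p2):
    """Unit normal of triangle (p0, p1, p2), negated to match the
    winding convention used in the geometry shader snippet."""
    ax, ay, az = (p1[i] - p0[i] for i in range(3))
    bx, by, bz = (p2[i] - p0[i] for i in range(3))
    # Cross product of the two edge vectors:
    nx, ny, nz = (ay * bz - az * by,
                  az * bx - ax * bz,
                  ax * by - ay * bx)
    length = (nx * nx + ny * ny + nz * nz) ** 0.5
    return (-nx / length, -ny / length, -nz / length)

# A triangle in the xy plane; its negated normal points down -z.
n = face_normal((0, 0, 0), (1, 0, 0), (0, 1, 0))
```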

3D Texture Buffer

The texture buffer is used to store all of the position information for all the curves. In order to know what information belongs to which curve, we pick an axis along which each curve will be defined; every line of pixels along that axis in the buffer then holds the information for one curve. I’ll try to illustrate this based on how it works in the base example.

Let’s say that this drawing represents a 1D texture. This texture stores the current position for each of N particles P.

To include information about each particle’s trail, we construct a 2D buffer and use feedback to store the position information from previous frames.

Here, the y axis is dedicated to time, measured in frames. t(0) represents the current frame, and the axis extends M-1 frames into the past, for a total buffer length of M. Because we want a trail to contain the positions of a single particle across time, we define our curve C along the y axis.

In this N x M texture we can store information for N curves, each M frames in length. Since the curve is defined along the y axis of time, the remaining axes determine how many curves we can store. In this case we use a 2D texture, and the number of curves is set by the length N of the x axis. This essentially limits us to 1D textures, and the resolution caps that come with them, when representing particle information. But particle simulations are often optimized to fit in square or square-ish 2D textures. This is where using a 3D texture becomes handy.
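As a plain-Python sketch of this 2D layout (standing in for the TOP feedback loop; the names and nesting order are my own convention), the buffer is M rows of N positions, with row 0 holding the current frame:

```python
def push_frame(history, row):
    """One feedback step on the N x M buffer: drop the oldest frame
    and prepend the newest. history[t][x] holds the position of
    particle x at t frames in the past."""
    return [row] + history[:-1]

def sample(history, curve_index, frames_back):
    """Read curve `curve_index` at `frames_back` frames in the past."""
    return history[frames_back][curve_index]

N, M = 3, 4  # 3 curves, 4 frames of history each
history = [[(0.0, 0.0, 0.0)] * N for _ in range(M)]
history = push_frame(history, [(1.0, 2.0, 3.0)] * N)
```

Walking `frames_back` from 0 to M-1 at a fixed `curve_index` traces out one particle’s trail.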

Now, we define our particles on the x and y axes, and time along the z axis, with a depth of D. Defining our curves along the z axis allows us to use 2D textures to hold all the particle information at a given time step.

This way we can store information for more particles, and more curves, in a way that is flexible with input dimensions. I ended up choosing 3D Textures over a 2D Texture Array to allow for interpolation between depth slices. Using Compute Shaders to construct the 3D buffer ensures good performance when texture sizes start getting large.
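A sketch of the 3D layout in the same spirit (again plain Python; in practice this shift would happen in a compute shader writing into the 3D texture, and the names are my own): particle (x, y) at t frames in the past lives at hist[t][y][x].

```python
def push_frame_3d(history, frame_2d):
    """Feedback for the 3D buffer: `history` is D depth slices, each
    a 2D grid of positions; slice 0 is the current frame."""
    return [frame_2d] + history[:-1]

def sample_3d(history, x, y, frames_back):
    """Read particle (x, y) at `frames_back` frames in the past."""
    return history[frames_back][y][x]

W, H, D = 2, 2, 3  # a 2x2 particle texture with 3 frames of history
hist = [[[(0.0, 0.0, 0.0)] * W for _ in range(H)] for _ in range(D)]
hist = push_frame_3d(hist, [[(5.0, 0.0, 0.0)] * W for _ in range(H)])
```

Each (x, y) column through the depth axis is one curve, so the curve count scales with the full W x H resolution of the particle texture rather than a single axis.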

Alpha Clipping

Having implemented a buffer that handles 2D textures, I decided to test it out with a simple particle system. This is what I get:

Not bad, but we notice these flickering dark portions of the trail; they appear because a particle has respawned somewhere within the span of our frame buffer. Our shader creates a connected tube along the positions of a particle across all time frames. Even if there is a significant discontinuity in position across these frames, as there is when our particle respawns, the material shader will still treat the points as a single curve, and render the connection between the “death” point and the spawn point.

This could be considered a feature, but let’s say we want to remove these guys. We would essentially want to clip the tube between the point where the particle dies and where it respawns. That way the trails will end where particles die and begin again where particles are born.

To isolate this portion of the tube, we need access to the life attribute of the particles at every point along their trail. We can get it by running the current life attributes into another feedback buffer, then sampling this life buffer in the material shader along with the position buffer.

Let’s take a look at how this works in a 2D buffer scenario.

In this example we have a life buffer for a system of 2 particles, with a buffer length of 5 frames. At every frame a particle’s life is decremented by 0.1; when it reaches 0, the life is reset to 1 and the particle is respawned.

This means that while a particle is alive, its life is always decreasing as time increases. The only time a particle’s life will increase is when it is respawned. We can use this to check whether a respawn has occurred between two frames.

We compare life values between frames by sampling the neighbors of the current pixel in the y direction. In the case that no respawn has occurred, the up neighbor should have a value greater than the current pixel, and the down neighbor should have a value less. When either of these conditions fails, we know that the current pixel lies at either the “death” point or the spawn point.
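The same neighbor check can be sketched outside the shader (plain Python over one curve’s column of the life buffer; index 0 is the current frame and larger indices reach further into the past — names are my own):

```python
def is_discontinuity(life_column, t):
    """Flag pixels at a respawn seam in one curve's life history.

    While a particle is alive its life only decreases as real time
    advances, i.e. values grow as we step back in time; any violation
    of that ordering marks a death or spawn point.
    """
    life = life_column[t]
    if t + 1 < len(life_column) and life_column[t + 1] < life:
        return True   # the older neighbor should be larger
    if t - 1 >= 0 and life_column[t - 1] > life:
        return True   # the newer neighbor should be smaller
    return False

# One particle that respawned two frames ago: life resets to 1.0
# (index 2) right after the previous incarnation died at 0.1 (index 3).
column = [0.8, 0.9, 1.0, 0.1, 0.2]
flags = [is_discontinuity(column, t) for t in range(len(column))]
```

Only the spawn point (index 2) and the death point (index 3) are flagged, which is exactly the seam we want to clip.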


Great, so we’ve isolated the part of the tube where the discontinuity lies, but how do we remove it from the render? This is where the idea of alpha clipping comes into play.

The vertex shader is where we sample our buffers and check whether we have hit a discontinuity point, but we cannot discard vertices in the vertex shader; we can, however, discard fragments in the pixel shader. So we need a way for the vertex shader to tell the pixel shader when we have hit a point on the tube that we do not want to render. We do this by setting the alpha to zero whenever we hit a discontinuity point; when the pixel shader receives color information with an alpha value of zero, it knows to discard the fragment. By setting the alpha at both discontinuity points to zero, the interpolated alpha values between them also become zero, and the entire section is flagged for discard.

float life       = texelFetch(sLifeMap, id, 0).r;
float lifeNext   = texelFetch(sLifeMap, idNext, 0).r;
float lifeBefore = texelFetch(sLifeMap, idBefore, 0).r;

if (lifeBefore > life || lifeNext < life) {
    color.a = 0.0;
}

The material operators in TouchDesigner have a very handy feature on the Common page which lets us tell the pixel shader to discard fragments that fall above or below a given alpha threshold. Playing with this value and the vertex resolution of the base geometry provided a much better result.

The ability to discard based on alpha gives us a lot of control over which sections of trail we want to render.

Variations

There are lots of possible variations of this technique, but to describe a few:

Instancing on Curve Points

Given that we have access to the points along each curve, we can instance objects at these points. We can bypass the curve extrusion altogether, or combine these instances with the extruded curve.

Manipulating Position Data

We can filter our position data before running it through the feedback buffer to get some interesting effects. For example, we can use a Limit TOP to quantize our position values, which gives a more discrete trail.
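As a rough sketch of that quantize step (plain Python, per component; the step size and names are arbitrary, and the Limit TOP does this per pixel on the GPU):

```python
def quantize(value, step):
    """Snap a value down to the nearest multiple of `step`,
    mimicking a quantize operation on position data."""
    return (value // step) * step  # floor division works for floats

# Nearby positions collapse onto the same grid value, so the
# resulting trail moves in discrete jumps instead of smoothly.
positions = [0.13, 0.27, 0.31, 0.49]
stepped = [quantize(p, 0.25) for p in positions]
```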

Wrap Up

I hope you got something out of this extended rant on trails and curve extrusion, and that some of these techniques help improve your own 3D toolkit. When implemented correctly, this technique should play real nice with all sorts of setups, so experimenting is highly encouraged.