Interactive GLSL - Shader Introduction


Shaders are easily one of my favorite parts of game development! I could spend all day writing little shader programs that do all sorts of effects, and I could probably spend just as much time staring at the results! Shaders are delightful. But some people seem to think they're ominous black boxes!

It's true that shaders are pretty opaque up front! What do all these extra words mean, why is the syntax so strange, where does the data come from, what on earth is going on here?? Diving in headfirst can be confusing.

Fortunately, shaders are quite simple at their core! And once you get past the initial hurdle, the rest is generally quite smooth sailing. So I'll take a stab at demystifying shaders through example! I've chosen GLSL (OpenGL Shading Language) as my shader language of choice, because it runs quite easily on the web. But the same principles also apply to HLSL (High-Level Shading Language, for DirectX) or Cg (C for Graphics, Nvidia's deprecated language used prominently by Unity)!

The Simplest Shader

Let's start with a particularly simple shader! This one does little more than take a mesh, and draw it on the screen as a solid color.
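In WebGL-flavored GLSL, that shader pair looks something like this! (Treat it as a sketch: uViewMatrix and uProjectionMatrix, and the exact output color, are placeholder names and values.)

```glsl
// Vertex Shader
attribute vec3 aPosition;       // vertex position, in model space

uniform mat4 uWorldMatrix;      // model space -> world space
uniform mat4 uViewMatrix;       // world space -> view space (placeholder name)
uniform mat4 uProjectionMatrix; // view space -> projection space (placeholder name)

void main() {
    // Transform the vertex all the way from model space to the screen!
    gl_Position = uProjectionMatrix * uViewMatrix * uWorldMatrix * vec4(aPosition, 1.0);
}
```

```glsl
// Fragment Shader
precision mediump float;

void main() {
    // Fill every pixel with the same solid color: (R, G, B, A), each 0.0-1.0
    gl_FragColor = vec4(1.0, 0.5, 0.25, 1.0);
}
```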

Let's dissect it!

So first of all, you may have noticed two tabs here: Vertex Shader, and Fragment Shader! A "Shader Program" (aka Shader) is divided into multiple sub-shaders, typically a Vertex Shader and a Fragment Shader, though more advanced cases could include something like a Geometry Shader as well. Each sub-shader is a chunk of code that runs on the GPU, and has a specific role to fill!

The Vertex Shader's job is to get the vertex data from your 3D model, and put it at a 2D location on the screen! It's called on every vertex sent to the graphics card while this shader program is active.

The Fragment Shader (also known as a Pixel Shader!) has the job of determining the color of a pixel on the screen! It's called on every pixel inside of the mesh's triangles. The data passed into it is a little less straightforward: it's actually blended between the 3 vertices that make up the current triangle.

The reason the correct term is "Fragment Shader" rather than "Pixel Shader" is that sometimes a single pixel on the screen is actually composed of multiple fragments! This is especially the case with some forms of anti-aliasing, where a single pixel can be many fragments blended together!

The comments should clear up most of the code, but since this is the core of what's happening in the shader, I'd like to talk specifically about transforming a vertex from one space to another!

Space Transformations

Now, I'm no math person, so I'm not going to delve deep into what a matrix is, but I will say this: a matrix is a 4x4 grid of numbers that can represent a whole series of vertex transformations combined into one. In this particular demo, "uWorldMatrix" is a matrix that represents 3 transformations: scaling by 1.5, rotating around the Y axis over time, and translating by (0,0.25,0)!
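"uWorldMatrix" itself is normally built on the CPU and handed to the shader as a uniform, but here's a rough sketch of what those three transformations would look like constructed in GLSL! (Note that mat4 constructors are column-major, so the translation sits in the last column. The exact values here just mirror this demo's transform.)

```glsl
// A sketch of building a matrix like uWorldMatrix in GLSL.
// GLSL mat4 constructors are column-major: each group of 4 values is a column!
mat4 buildWorldMatrix(float time) {
    float c = cos(time);
    float s = sin(time);

    mat4 scale = mat4(1.5, 0.0, 0.0, 0.0,
                      0.0, 1.5, 0.0, 0.0,
                      0.0, 0.0, 1.5, 0.0,
                      0.0, 0.0, 0.0, 1.0);

    mat4 rotateY = mat4(  c, 0.0,  -s, 0.0,
                        0.0, 1.0, 0.0, 0.0,
                          s, 0.0,   c, 0.0,
                        0.0, 0.0, 0.0, 1.0);

    mat4 translate = mat4(1.0, 0.0,  0.0, 0.0,
                          0.0, 1.0,  0.0, 0.0,
                          0.0, 0.0,  1.0, 0.0,
                          0.0, 0.25, 0.0, 1.0);

    // Matrices apply right-to-left: scale first, then rotate, then translate
    return translate * rotateY * scale;
}
```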

The current vertex location "aPosition" begins in model space, where each vertex is relative to the origin of the model. You can think of this as where your model sits when you look at it in your 3D modeling software! Multiplying this model space location by "uWorldMatrix" will take it from model space, and move it into world space! World space is where an object is located in the world: lying on a table, sitting in a tree, or at an xyz of (0,0.25,0) with a scale of 1.5! If you remove the world space transform code "uWorldMatrix *" from the example, you can see the model where it would be in model space!

Similarly, view space (also known as camera space) transforms everything so that the camera ends up situated at (0,0,0) looking down the Z axis, and everything else maintains its location relative to the camera. Projection space is view space then flattened onto a 2D surface: your screen.
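Here's the same vertex transform again, just split into stages so each space gets its own line (uViewMatrix and uProjectionMatrix are again my placeholder names):

```glsl
attribute vec3 aPosition;

uniform mat4 uWorldMatrix;
uniform mat4 uViewMatrix;
uniform mat4 uProjectionMatrix;

void main() {
    vec4 worldPos = uWorldMatrix      * vec4(aPosition, 1.0); // model -> world
    vec4 viewPos  = uViewMatrix       * worldPos;             // world -> view
    gl_Position   = uProjectionMatrix * viewPos;              // view  -> projection
}
```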

Try taking out different matrix multiplications, and see what it looks like when you skip a space transform!

Texturing

Let's add in a few more things! A single color isn't much fun to look at, and textures are always supremely useful, so we'll start with that.
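Here's a sketch of the textured version, with aUV, vUV, and uTexture as placeholder names for the attribute, varying, and sampler:

```glsl
// Vertex Shader
attribute vec3 aPosition;
attribute vec2 aUV;             // texture coordinates from the mesh

uniform mat4 uWorldMatrix;
uniform mat4 uViewMatrix;
uniform mat4 uProjectionMatrix;

varying vec2 vUV;               // interpolated across the triangle by the GPU

void main() {
    vUV = aUV;                  // pass the UV straight through, no modifications
    gl_Position = uProjectionMatrix * uViewMatrix * uWorldMatrix * vec4(aPosition, 1.0);
}
```

```glsl
// Fragment Shader
precision mediump float;

uniform sampler2D uTexture;
varying vec2 vUV;

void main() {
    // Grab a texel from the texture at this fragment's UV, and output it as-is
    gl_FragColor = texture2D(uTexture, vUV);
}
```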

As you can see, we haven't added a whole lot of code here! We've added UV coordinates to our Vertex Shader, and you can see that we're just passing them through to the Fragment Shader with no modifications! We just take the UV coordinate from the mesh attribute, and store it in the varying so the GPU can interpolate it across the triangle for the Fragment Shader to use.

On the Fragment Shader, you can see that we're using the texture2D function to grab a texel (texture pixel!) from the sampler, and we're just outputting it as is. This is essentially how an unlit shader works, just grab a color from the texture, and call it a day!

Waving

All this may seem pretty rote so far! If this were all shaders could do, why would anyone bother with them at all? Never fear, here's a taste of some of the fun things shaders can do too!

This is a simple technique for waving meshes around! I use it often to achieve grass-like swaying motion; it's really easy to implement, and it adds some inexpensive animation to your scene. Doing an effect like this in the shader is so cheap for the GPU, it's essentially free!

The first thing I did was separate out the world position! Instead of combining all the space transformations onto the same line as before, we calculate just the world space position first! This lets us use the world position as a source to calculate our swaying, and apply the sway to our position in world space before passing it along!

This technique works by picking an angle, and pushing the vertex in the direction of that angle! We can convert an angle into a direction easily using sin/cos: the direction is [x:cos(angle), y:0, z:sin(angle)].
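Put together, the vertex shader might look something like this! (Exactly how wobbleOffset is pulled from the world position, and the 0.1 sway strength, are plausible stand-ins rather than the demo's exact values.)

```glsl
attribute vec3 aPosition;

uniform mat4  uWorldMatrix;
uniform mat4  uViewMatrix;
uniform mat4  uProjectionMatrix;
uniform float uTime;            // elapsed time, in seconds

void main() {
    // Calculate just the world position first, so we can sway in world space
    vec4 worldPos = uWorldMatrix * vec4(aPosition, 1.0);

    // A starting angle that varies with world location (one plausible choice!)
    float wobbleOffset = worldPos.x + worldPos.y + worldPos.z;

    // Animate the angle over time, then turn it into a push direction
    float angle = uTime + wobbleOffset;
    worldPos.xyz += vec3(cos(angle), 0.0, sin(angle)) * 0.1; // 0.1 = sway strength

    gl_Position = uProjectionMatrix * uViewMatrix * worldPos;
}
```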

The angle itself is pulled from uTime, so that the angle animates over time! But uTime is boring by itself! Try changing wobbleOffset to 0.0 to see what uTime's angle looks like alone. Boring, right? You can barely even tell it's doing anything! Every vertex is being offset by the same angle, and that doesn't create any variation.

So we need a starting angle that's different in different locations! That's what wobbleOffset is, an initial angle pulled from the world location. Reset the shader, and remove "uTime +" this time. You can see that the distortion is there, and it does actually change a bit! The rotation of the object changes the world position, so we'll get a small change in the initial angle from that. So the initial angle from wobbleOffset combined with the animation of uTime gives us a nice, interesting wobble effect!

Try messing with some of the numbers, and see if you can tweak it to something you like a little more!

More

I hope you've enjoyed this, and found it enlightening! If you have any questions, or need anything clarified further, please don't hesitate to let me know!

Check out part 2, where we continue on into some basic lighting!