
Understanding the WebGL triangle

Let's try to understand the WebGL/OpenGL triangle - aka the OpenGL "Hello World"! We will use WebGL for simplicity, but the principles are quite similar in OpenGL.


YouTube Version

The END

Let's start from the end. If you haven't heard by now, OpenGL is a "state machine". In practice this means that instead of calling a function (like drawTriangle(...)) we first have to set up parameters on the gl context (as if we were adjusting knobs on a machine), and only when the "state" is all set for our task do we call the draw method.
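As a rough sketch of that pattern (the buffer and program variables here are placeholders; the real setup calls come later in the article):

// Every call below adjusts some state on the gl context ("turns a knob");
// only the last one actually draws, using whatever state is currently set.
gl.bindBuffer(gl.ARRAY_BUFFER, someBuffer); // make this buffer the active ARRAY_BUFFER
gl.useProgram(someProgram);                 // make this shader program the active one
gl.drawArrays(gl.TRIANGLES, 0, 3);          // draw with the current state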

The draw method

The method we will use for drawing is drawArrays(mode, first, count), which according to the docs "renders primitives from array data".

Parameters:

  • mode - The primitive type to draw. From this we learn that OpenGL can draw specific primitives, like POINTS, LINES, TRIANGLES (and some more). We will use TRIANGLES.
  • first - The starting index in the array of vector points. From this we learn that we will have to set up an "array of vector points" in our "machine". Specifically, this will be an array with the corners of our triangle.
  • count - The number of indices to be rendered (from the above array). Since we are rendering a triangle we will need 3 indices (representing 3 vertices).

OK, so now that we know what the draw method requires, we can go ahead and see how to set it all up...

OpenGL rendering pipeline

The rendering pipeline is the process of rendering an image to the screen (or any output surface), below is a very simplistic schema of the rendering pipeline:

  1. It all starts with the vertices we provide, representing the primitive we want to draw. Note that the vertices can be in any coordinate system we want; here we choose the origin to be (0, 0) in a normalized space (all coords in the range 0 - 1).
  2. These vertices are then passed to the vertex shader, a program that we write and that runs on the GPU for every vertex we provide (meaning 3 times in our case). The purpose of this program is to transform each vertex to the OpenGL coordinate system, which is a normalized coordinate system ranging from -1 to 1 on each axis (so the bottom-left corner is at (-1, -1)). Of course you can also choose to do extra manipulation on the vertex, like flip or rotate etc.
  3. The next step is Rasterization - in this step OpenGL "collects" all the fragments (pixels) that need to be rendered.
  4. All these fragments (pixels) are then processed by the fragment shader, a program we write that runs on the GPU for every fragment, i.e. a LOT of times, so be careful here performance-wise. The purpose of this program is to assign a color to every fragment (pixel).
  5. The pipeline then continues with preparing the final image to draw.

The shaders programs

As described above, the vertex/fragment programs run on the GPU, and the language in which we write them is GLSL (OpenGL Shading Language). The specific version we are going to use is ES 3.0 (ES is the flavour for Embedded Systems), the full reference can be found here. Overall the syntax resembles C++.

Vertex Shader

As described above, the vertex program needs to transform each vertex (x, y) to the OpenGL coord system, and it is totally up to us to choose whatever coord system WE want to use as input. If we chose the same coord system as OpenGL, our vertex shader would have nothing to do but echo the input. But having the bottom-left corner at (-1, -1) is not intuitive, so we will work with the origin at (0, 0) and our triangle coords will be:

   (0.5, 1)
  +---/\---+
  |  /  \  |
  | /    \ |
  |/      \|
  +--------+
(0,0)     (1, 0)
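To make the mapping concrete, here is the same [0,1] to [-1,1] conversion the vertex shader will perform, written as a plain JavaScript sketch so we can check a corner by hand (the function name is ours, not part of any API):

// Mirror of the shader math: multiply by 2, then subtract 1
function toGlCoords([x, y]) {
  return [x * 2 - 1, y * 2 - 1];
}

console.log(toGlCoords([0.5, 1.0])); // [0, 1]   - the top corner of our triangle
console.log(toGlCoords([0.0, 0.0])); // [-1, -1] - the bottom-left corner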

Color interpolation

Another useful feature of the vertex shader is that any output variable we define gets interpolated and passed on to the fragment shader. So if for the first corner (vertex) of our triangle we output the color red (rgb: 1,0,0), and for the second corner we output blue (rgb: 0,0,1), all the fragments (pixels) in between will get values interpolated between these two colors.
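For a fragment lying on the edge between those two corners the interpolation is simply linear; here is a tiny JavaScript sketch of the idea (the helper is ours, just to illustrate the math):

// t is how far along the edge the fragment is (0 = first vertex, 1 = second)
function mixColors(c1, c2, t) {
  return c1.map((v, i) => v * (1 - t) + c2[i] * t);
}

console.log(mixColors([1, 0, 0], [0, 0, 1], 0.5)); // [0.5, 0, 0.5] - halfway between red and blue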

Note: The shader code is defined as a simple string; we will compile it later using the WebGL context.

Vertex shader code

This is our vertex shader, code is commented with explanations:

const VERTEX_SHADER = `#version 300 es
  // Below are 2 input attributes
  // The values for these attributes will be provided
  // by us in the javascript "world"

  // vec2 is a vector with 2 values representing the
  // vertex coordinates (x/y).
  in vec2 a_position;

  // vec3 is a vector with 3 values representing the
  // vertex color (r/g/b).
  in vec3 a_color;

  // output color for this vertex,
  // OpenGL will interpolate these values automatically across all fragments
  out vec3 color;

  void main() {
    // convert coord from [0,1] space to [0,2] space
    vec2 zeroToTwo = a_position * 2.0;

    // convert coord from [0,2] space to [-1,1]
    vec2 glCoordSpace = zeroToTwo - 1.0;

    // set the output in the global predefined gl_Position variable.
    // Note a vertex has 4 values: x,y,z,w - we use 0,1 for z,w.
    gl_Position = vec4(glCoordSpace, 0, 1);

    // set the output color variable, just provide the user input
    color = a_color;
  }
`;

Fragment Shader

The fragment shader just has to assign an rgba value to each pixel, quite simple:

const FRAG_SHADER = `#version 300 es
  // Need to tell the engine what float precision we want
  precision highp float;

  // input color for this fragment, provided by OpenGL
  // interpolating the output color variable that is in the
  // vertex shader.
  in vec3 color;

  // We should set this output param with the desired
  // fragment color
  out vec4 outColor;

  void main() {
    // just output the color we got, note we got vec3 (rgb)
    // and outColor is vec4, so we use 1.0 for the alpha.
    outColor = vec4(color, 1.0);
  }
`;

Creating the GL "executable" program

OK, in order to use WebGL we need a "gl" context, which we can get from a canvas element (onto which the output will also be rendered):

<canvas id="canvas" width="300" height="300"></canvas>

Then we can get the "gl" context:

var canvas = document.getElementById('canvas');
// get webgl2 context
var gl = canvas.getContext('webgl2');
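WebGL 2 is not available in every browser, so it is worth guarding against a null context (a small addition of ours, not part of the original snippet):

if (!gl) {
  // getContext returns null when WebGL 2 is not supported
  throw new Error('WebGL 2 is not supported in this browser');
}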

Compile the shaders

Next we compile the shaders code:

// Create shader object of type "VERTEX"
const vertexShader = gl.createShader(gl.VERTEX_SHADER);

// Load the shader source
gl.shaderSource(vertexShader, VERTEX_SHADER);

// Compile the shader
gl.compileShader(vertexShader);

// Check the compile status
var compiled = gl.getShaderParameter(vertexShader, gl.COMPILE_STATUS);
if (!compiled) {
  // Something went wrong during compilation; get the error
  const lastError = gl.getShaderInfoLog(vertexShader);
  throw new Error('Error compiling shader: ' + lastError);
}

// Do the same for the fragment shader
const fragShader = gl.createShader(gl.FRAGMENT_SHADER);
gl.shaderSource(fragShader, FRAG_SHADER);
gl.compileShader(fragShader);
// here we should check compilation status as well
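To avoid repeating the same boilerplate for both shaders, the steps above can be wrapped in a small helper; here is a sketch (the helper name is ours, not part of the WebGL API):

function compileShader(gl, type, source) {
  const shader = gl.createShader(type);
  gl.shaderSource(shader, source);
  gl.compileShader(shader);
  if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS)) {
    const lastError = gl.getShaderInfoLog(shader);
    gl.deleteShader(shader);
    throw new Error('Error compiling shader: ' + lastError);
  }
  return shader;
}

// Usage:
// const vertexShader = compileShader(gl, gl.VERTEX_SHADER, VERTEX_SHADER);
// const fragShader = compileShader(gl, gl.FRAGMENT_SHADER, FRAG_SHADER);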

Link the gl program

Like any executable, our program should get linked too:

// create program object
var program = gl.createProgram();

// attach it with the 2 shaders
gl.attachShader(program, vertexShader);
gl.attachShader(program, fragShader);

// and link
gl.linkProgram(program);

// Check the link status
const linked = gl.getProgramParameter(program, gl.LINK_STATUS);
if (!linked) {
  // something went wrong with the link
  const lastError = gl.getProgramInfoLog(program);
  throw new Error('Error linking gl program: ' + lastError);
}

Yay, we have a GL program ready to Run!

Loading data to the vertex shader

In the sections above we said that we will provide the vertex shader with the vertices to render, now is the time.

The vertex input buffer

If you look at our vertex shader you'll see it has 2 input attributes, a_position and a_color; let's define the data for each.

First we create a "buffer", which is just a chunk of memory on the GPU:

var vertexParamsBuffer = gl.createBuffer();

Then we should "bind" it. Binding in OpenGL is like making something "active". Remember we said that OpenGL is a "state machine"? Well, think of it as a machine with many slots, each for a specific usage. When we "bind" something it's as if we load an object into its dedicated slot. Here we bind our vertexParamsBuffer to the ARRAY_BUFFER slot.

From the docs, the ARRAY_BUFFER is "Buffer containing vertex attributes, such as vertex coordinates ... or vertex color data" - PERFECT!

gl.bindBuffer(gl.ARRAY_BUFFER, vertexParamsBuffer);

Once our buffer is in the correct ARRAY_BUFFER "slot", we can load it with data (this copies bytes to the GPU):

var verticesData = new Float32Array([
  // coord      color
    0.0, 0.0,  1.0, 0.0, 0.0, // 1st vertex
    0.5, 1.0,  0.0, 1.0, 0.0, // 2nd vertex
    1.0, 0.0,  0.0, 0.0, 1.0  // 3rd vertex
]);

// First parameter is the "slot" onto which load the data
// Second parameter is the chunk of bytes
// Third parameter is a hint to OpenGL about how this data will be used
gl.bufferData(gl.ARRAY_BUFFER, verticesData, gl.STATIC_DRAW);

IMPORTANT!: Above we just uploaded a chunk of bytes into a buffer on the GPU; OpenGL still knows nothing about the meaning of the data. What we commented above as "1st vertex" coord/color etc. is just for us - next we will tell OpenGL how to interpret the data.
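As a quick sanity check on what we just uploaded (plain arithmetic on the array above, nothing WebGL-specific):

// 3 vertices * (2 position floats + 3 color floats) = 15 floats
console.log(verticesData.length);     // 15
// each float is 4 bytes, so 15 * 4 = 60 bytes were copied to the GPU
console.log(verticesData.byteLength); // 60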

Describing the buffer data

So the data we uploaded above should go into our 2 attributes, a_position and a_color. Let's start with the first:

// Gets the index of the a_position input attribute
var a_positionIdx = gl.getAttribLocation(program, 'a_position');

// The fact that we declared an "in vec2 a_position" attribute
// in our vertex shader doesn't mean it is "active",
// we have to enable it explicitly:
gl.enableVertexAttribArray(a_positionIdx);

// And now we tell OpenGL how to read data from the buffer
// CURRENTLY bound to the ARRAY_BUFFER "slot" into the
// vertex attribute a_position
gl.vertexAttribPointer(a_positionIdx, 2, gl.FLOAT, false, 20, 0);

Let's study the parameters of vertexAttribPointer:

  • index - The index of the vertex (shader) attribute we describe.
  • size - The number of components for this attribute. Since our a_position is a vec2, we use 2 components (that is, 2 float values).
  • type - The data type of each component; we used a Float32Array so it is FLOAT.
  • normalized - Whether integer input should be normalized - false in our case.
  • stride - The number of bytes between the beginning of one vertex's data and the next. In our case EACH vertex has 2 floats for x/y and 3 floats for r/g/b => a total of 5 floats == 20 bytes - so from the beginning of a specific vertex's data, jump 20 bytes and you get to the beginning of the next vertex's data (see the sketch after this list).
  • offset - The number of bytes, from the beginning of the vertex data, where the first component is. In our case the x/y components are at the beginning of the vertex data, so the offset is 0.
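Instead of hard-coding 20 and 8, the stride and offsets can also be derived from the float size; this sketch is equivalent to the literal values used for both attributes (the constant names are ours):

const FLOAT_BYTES = Float32Array.BYTES_PER_ELEMENT; // 4 bytes per float
const STRIDE = 5 * FLOAT_BYTES;       // 2 floats (x/y) + 3 floats (r/g/b) = 20 bytes per vertex
const POSITION_OFFSET = 0;            // x/y sit at the start of each vertex's data
const COLOR_OFFSET = 2 * FLOAT_BYTES; // r/g/b start right after the 2 position floats = 8 bytes

// gl.vertexAttribPointer(a_positionIdx, 2, gl.FLOAT, false, STRIDE, POSITION_OFFSET);
// gl.vertexAttribPointer(a_colorIdx, 3, gl.FLOAT, false, STRIDE, COLOR_OFFSET);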

Now we do the same to describe the color attribute:

var a_colorIdx = gl.getAttribLocation(program, 'a_color');
gl.enableVertexAttribArray(a_colorIdx);
gl.vertexAttribPointer(a_colorIdx, 3, gl.FLOAT, false, 20, 8);

Let's describe the parameters here as well:

  • index - as above
  • size - a_color is a vec3, so it has 3 components
  • type - FLOAT
  • stride - same as above
  • offset - From the beginning of the vertex data we count the 2 floats of x/y and then comes the color data, so 8 bytes.

Rendering

Once we have set up all the vertex attributes we can ask OpenGL to draw!

// First we set our program as the active program on the gpu
gl.useProgram(program);

// Call the draw method
gl.drawArrays(gl.TRIANGLES, 0, 3);

This takes us back to the beginning of the article, the drawArrays method. Let's see if we now understand the parameters:

  • mode - Easy, we want to draw a triangle.
  • first - The starting index in the array of vector points. Remember our ARRAY_BUFFER slot into which we loaded our vertices data? The data we described using vertexAttribPointer, together with the actual chunk of data we provided (the verticesData Float32Array), forms exactly 3 vertices in the ARRAY_BUFFER slot, and the index of the first vertex we want to draw is 0!
  • count - Easy, we want to draw 3 vertices.

THE END
