Yury Samkevich
Learn OpenGL with Rust: textures

Welcome to the fourth part of the Learn OpenGL with Rust tutorial. In the last article we learned what vertex buffer and vertex array objects are and how to use them together with shaders to render simple primitives.

In this article we will explore what textures are, how to load a texture from an image file, and how to render it to the screen. You can find all the source code for the article on GitHub via the following link.

Textures

So far we have rendered a triangle and even painted it with different colors. But what if we want to render a more realistic image, let's say Ferris, Rust's unofficial mascot? It would take a significant amount of time and resources to recreate the image with vertices and to specify a color attribute for each vertex.

Textures are usually used in computer graphics for that purpose. A texture is generally an image used to add detail to an object. We can pack a lot of detail into an image and give the illusion that the object is extremely detailed without adding extra vertices. OpenGL supports different types of textures: 1D, 3D, cube textures, etc. In this article we will concern ourselves only with two-dimensional textures.

Like VBOs and VAOs, textures are objects in OpenGL and require a generated id before we can use them. We create a dedicated type for a texture:

pub struct Texture {
    pub id: GLuint,
}

To generate a new texture id we use the gl::GenTextures function:

impl Texture {
    pub unsafe fn new() -> Self {
        let mut id: GLuint = 0;
        gl::GenTextures(1, &mut id);
        Self { id }
    }
}

To delete the texture resources once we don't need them anymore, we implement the Drop trait for the Texture type:

impl Drop for Texture {
    fn drop(&mut self) {
        unsafe {
            gl::DeleteTextures(1, [self.id].as_ptr());
        }
    }
}

Just like other objects in OpenGL, textures have to be bound before we can apply operations to them. Since images are 2D arrays of pixels, we use the gl::BindTexture function with gl::TEXTURE_2D as the first parameter:

impl Texture {
    pub unsafe fn bind(&self) {
        gl::BindTexture(gl::TEXTURE_2D, self.id)
    }
}

Render a rectangle

In order to draw a texture we first need to learn how to draw a rectangle. We can draw a rectangle using two triangles, because OpenGL mainly works with triangles:

[Image: a rectangle composed of two triangles that share vertices]

As you can see, some vertices of the two triangles overlap. Imagine we wanted to draw a more complex model with thousands of triangles: there would be large chunks of overlapping data. A better solution is to store only the unique vertices and then specify the order in which we want to draw them.

Element buffer objects (EBOs) are intended to help us solve this problem. An EBO is a buffer that stores indices that OpenGL uses to decide which vertices to draw. We will reuse the Buffer implementation that we defined in previous lessons, this time with the gl::ELEMENT_ARRAY_BUFFER type.
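
In case you don't have that code at hand, a minimal sketch of such a Buffer type could look roughly like this (names and details may differ slightly from the earlier article):

use gl::types::{GLsizeiptr, GLuint};

pub struct Buffer {
    pub id: GLuint,
    target: GLuint,
}

impl Buffer {
    pub unsafe fn new(target: GLuint) -> Self {
        let mut id: GLuint = 0;
        gl::GenBuffers(1, &mut id);
        Self { id, target }
    }

    pub unsafe fn bind(&self) {
        gl::BindBuffer(self.target, self.id);
    }

    pub unsafe fn set_data<D>(&self, data: &[D], usage: GLuint) {
        self.bind();
        // Reinterpret the slice as raw bytes and upload them to the bound buffer.
        let (_, bytes, _) = data.align_to::<u8>();
        gl::BufferData(
            self.target,
            bytes.len() as GLsizeiptr,
            bytes.as_ptr() as *const _,
            usage,
        );
    }
}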

To get started we first have to specify the vertices and the indices according to the above picture:

type Pos = [f32; 2];

#[repr(C, packed)]
struct Vertex(Pos);

#[rustfmt::skip]
const VERTICES: [Vertex; 4] = [
    Vertex([-0.5, -0.5]),
    Vertex([ 0.5, -0.5]),
    Vertex([ 0.5,  0.5]),
    Vertex([-0.5,  0.5]),
];

#[rustfmt::skip]
const INDICES: [i32; 6] = [
    0, 1, 2,
    2, 3, 0
];


Next we need to create the element buffer object and copy the indices into it. Note that the element buffer binding is stored as part of the vertex array object's state, so our VAO should be bound at this point:

let index_buffer = Buffer::new(gl::ELEMENT_ARRAY_BUFFER);
index_buffer.set_data(&INDICES, gl::STATIC_DRAW);

The last thing left to do is to replace the gl::DrawArrays call with gl::DrawElements to indicate that we want to render the triangles from an index buffer. We specify that we want to draw 6 indices of type gl::UNSIGNED_INT; the final null pointer is a byte offset of zero into the bound element buffer:

gl::DrawElements(gl::TRIANGLES, 6, gl::UNSIGNED_INT, ptr::null());

Texture coordinates

Now that we know how to draw a rectangle, let's look at how we can use that knowledge to render a texture. First we need to map the texture onto the rectangle and tell each vertex of the rectangle which part of the texture it corresponds to, so each vertex should have a texture coordinate associated with it. These coordinates range from 0.0 to 1.0, where (0, 0) is conventionally the bottom-left corner and (1, 1) is the top-right corner of the texture image. One subtlety: image files (and the image crate we use below) store pixel rows from top to bottom, while OpenGL treats the first row of uploaded texture data as v = 0, so we flip the v coordinate in the vertex data to keep the image right side up. The new vertex array now includes texture coordinates for each vertex:

type Pos = [f32; 2];
type TextureCoords = [f32; 2];

#[repr(C, packed)]
struct Vertex(Pos, TextureCoords);

#[rustfmt::skip]
const VERTICES: [Vertex; 4] = [
    Vertex([-0.5, -0.5],  [0.0, 1.0]),
    Vertex([ 0.5, -0.5],  [1.0, 1.0]),
    Vertex([ 0.5,  0.5],  [1.0, 0.0]),
    Vertex([-0.5,  0.5],  [0.0, 0.0]),
];

Next we need to alter the vertex shader to accept the texture coordinates as a vertex attribute and then forward the coordinates to the fragment shader:

#version 330
in vec2 position;
in vec2 vertexTexCoord;

out vec2 texCoord;

void main() {
    gl_Position = vec4(position, 0.0, 1.0);
    texCoord = vertexTexCoord;
}

The fragment shader should then accept the texCoord output variable as an input variable:

#version 330
out vec4 FragColor;

in vec2 texCoord;

uniform sampler2D texture0;

void main() {
    FragColor = texture(texture0, texCoord);
}

The fragment shader declares a uniform sampler2D that we later assign our texture to. The texCoord input is interpolated across the rectangle for every fragment, and we sample the color of the texture with GLSL's built-in texture function, which returns the color of the texture at the given texture coordinate.

Load texture data

Now that the texture object has been configured, it's time to load the texture image. Texture images can be stored in dozens of file formats, each with its own structure and ordering of data. Luckily for us, there is the image crate, which can handle all of that for us; just add it as a dependency in the Cargo.toml file.
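
For example (the version below is only an example; any recent release of the crate should work):

[dependencies]
image = "0.24"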

To load the texture data we first open the image at the given path. Then we convert the image into the general RGBA format and upload the pixel data to the GPU using gl::TexImage2D:

impl Texture {
    pub unsafe fn load(&self, path: &Path) -> Result<(), ImageError> {
        self.bind();

        let img = image::open(path)?.into_rgba8();
        gl::TexImage2D(
            gl::TEXTURE_2D,
            0,
            gl::RGBA as i32,
            img.width() as i32,
            img.height() as i32,
            0,
            gl::RGBA,
            gl::UNSIGNED_BYTE,
            img.as_bytes().as_ptr() as *const _,
        );
        Ok(())
    }
}
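
One thing to be aware of: the snippet above doesn't set any sampling parameters. OpenGL's default minification filter expects mipmaps, and a texture without them is considered incomplete, which usually shows up as black output. The project in the repository may already take care of this; if you are following along from scratch and see a black rectangle, a minimal sketch of a helper that switches to plain linear filtering (the method name set_filtering is just an example) could look like this:

use gl::types::GLint;

impl Texture {
    pub unsafe fn set_filtering(&self) {
        self.bind();
        // Linear filtering works without mipmaps; alternatively, call
        // gl::GenerateMipmap(gl::TEXTURE_2D) after uploading the image data.
        gl::TexParameteri(gl::TEXTURE_2D, gl::TEXTURE_MIN_FILTER, gl::LINEAR as GLint);
        gl::TexParameteri(gl::TEXTURE_2D, gl::TEXTURE_MAG_FILTER, gl::LINEAR as GLint);
    }
}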

Now, when we put all the pieces together, the rendering initialisation code looks like the following:

let vertex_shader = Shader::new(VERTEX_SHADER_SOURCE, gl::VERTEX_SHADER)?;
let fragment_shader = Shader::new(FRAGMENT_SHADER_SOURCE, gl::FRAGMENT_SHADER)?;
let program = ShaderProgram::new(&[vertex_shader, fragment_shader])?;

let vertex_array = VertexArray::new();
vertex_array.bind();

let vertex_buffer = Buffer::new(gl::ARRAY_BUFFER);
vertex_buffer.set_data(&VERTICES, gl::STATIC_DRAW);

let index_buffer = Buffer::new(gl::ELEMENT_ARRAY_BUFFER);
index_buffer.set_data(&INDICES, gl::STATIC_DRAW);

let pos_attrib = program.get_attrib_location("position")?;
set_attribute!(vertex_array, pos_attrib, Vertex::0);
let texcoord_attrib = program.get_attrib_location("vertexTexCoord")?;
set_attribute!(vertex_array, texcoord_attrib, Vertex::1);

let texture = Texture::new();
texture.load(&Path::new("assets/ferris.png"))?;

gl::BlendFunc(gl::SRC_ALPHA, gl::ONE_MINUS_SRC_ALPHA);
gl::Enable(gl::BLEND);

In the last two lines we set up alpha blending. This is a topic for another discussion; for now it is enough to know that we use it to define how the image is combined with the background.
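
With this particular pair of factors the blended color is computed per channel roughly as output = source_color * source_alpha + background_color * (1 - source_alpha), i.e. the standard "over" compositing of a partially transparent image onto whatever is already in the framebuffer.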

The code in the rendering loop:

gl::ClearColor(0.5, 0.5, 0.5, 1.0);
gl::Clear(gl::COLOR_BUFFER_BIT);
texture.bind();
program.apply();
vertex_array.bind();
gl::DrawElements(gl::TRIANGLES, 6, gl::UNSIGNED_INT, ptr::null());

If you compile and run the code with cargo run you should see the following image:

[Screenshot: Ferris rendered on a gray background]

Congratulations! You've just rendered Ferris!

Texture units

What if we want to use several textures at the same time? You might have noticed that we declared a uniform sampler2D in the fragment shader. We can actually assign a location value to the texture sampler via that uniform. The location of a texture is more commonly known as a texture unit.

We have to tell OpenGL which texture unit each shader sampler belongs to by setting each sampler uniform. We add a set_int_uniform function to the ShaderProgram type:

impl ShaderProgram {
    pub unsafe fn set_int_uniform(&self, name: &str, value: i32) -> Result<(), ShaderError> {
        self.apply();
        let uniform = CString::new(name)?;
        gl::Uniform1i(gl::GetUniformLocation(self.id, uniform.as_ptr()), value);
        Ok(())
    }
}

Now we can load two textures and pass their texture units to the fragment shader:

let texture0 = Texture::new();
texture0.load(&Path::new("assets/logo.png"))?;
program.set_int_uniform("texture0", 0)?;

let texture1 = Texture::new();
texture1.load(&Path::new("assets/rust.jpg"))?;
program.set_int_uniform("texture1", 1)?;

We can activate a texture unit using gl::ActiveTexture, passing in the texture unit we'd like to use. For that we add an activate function to the Texture type:

impl Texture {
    pub unsafe fn activate(&self, unit: GLuint) {
        gl::ActiveTexture(unit);
        self.bind();
    }
}

Now we can activate our textures before drawing in the render loop:

texture0.activate(gl::TEXTURE0);
texture1.activate(gl::TEXTURE1);
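
Texture unit constants are consecutive (gl::TEXTURE1 is gl::TEXTURE0 + 1, and so on), so the unit can also be computed. A small sketch, assuming it runs inside the same unsafe block as the rest of the render loop:

for (i, texture) in [&texture0, &texture1].iter().enumerate() {
    // Pick the i-th texture unit for the i-th texture.
    texture.activate(gl::TEXTURE0 + i as u32);
}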

We also need to edit the fragment shader to accept another sampler:

#version 330
out vec4 FragColor;

in vec2 texCoord;

uniform sampler2D texture0;
uniform sampler2D texture1;

void main() {
    vec4 color0 = texture(texture0, texCoord);
    vec4 color1 = texture(texture1, texCoord);
    FragColor = mix(color0, color1, 0.6) * color0.a;
}

The final output color is now a combination of the two textures.
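
GLSL's built-in mix(x, y, a) returns x * (1 - a) + y * a, so with a factor of 0.6 we take 40% of the first texture's color and 60% of the second; multiplying by color0.a keeps the transparent parts of the first image transparent so the background still shows through.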


For example, you can see the mix of these two textures in the following screenshot:

[Screenshot: the two textures mixed together]

Summary

Today we've learned what textures are, how to load them from an image file, and how to render them to the screen. We've rendered Ferris 🦀 and explored how to use textures to create some simple graphic effects.

If you found the article interesting, consider hitting the like button and subscribing for updates.

Top comments (2)

ferceg

Thank you, I can't wait for the next chapter.

Arctique

Looking forward to the next chapter